https://github.com/anibali/weightnorm
Unofficial Torch implementation of weight normalization
- Host: GitHub
- URL: https://github.com/anibali/weightnorm
- Owner: anibali
- License: MIT
- Created: 2017-03-06T22:35:39.000Z (about 9 years ago)
- Default Branch: master
- Last Pushed: 2017-03-07T01:28:59.000Z (about 9 years ago)
- Last Synced: 2025-01-14T08:52:06.226Z (about 1 year ago)
- Topics: lua, neural-network, torch7
- Language: Lua
- Size: 10.7 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Weight normalization
This is an unofficial Torch implementation of ["Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks"](http://arxiv.org/abs/1602.07868)
by T. Salimans and D. P. Kingma.
It should work with any weighted layer, including the `cudnn` versions.
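For reference (this is the technique from the paper, not code from this repository), weight normalization reparameterizes each layer's weight vector in terms of a direction vector and a scalar gain, which are then learned independently:

```latex
\mathbf{w} = \frac{g}{\lVert \mathbf{v} \rVert} \, \mathbf{v}
```

Here $\mathbf{v}$ has the same shape as the original weight vector and $g$ is a learned scalar, so the norm of $\mathbf{w}$ is fixed to $g$ regardless of the direction of $\mathbf{v}$.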
## Usage
```lua
local wn = require('weightnorm')
-- Construct the neural network as usual, but use `wn()` to wrap weighted
-- layers that you want to be weight normalized.
local net = nn.Sequential()
net:add(wn(nn.Linear(32*32, 200)))
net:add(nn.ReLU())
net:add(wn(nn.Linear(200, 200)))
net:add(nn.ReLU())
net:add(wn(nn.Linear(200, 10)))
-- [Optional] Perform a data-driven initialization pass.
-- Only works in batch mode.
local batch_input = my_batch_input_loader_function()
net:set_init_pass(true)
net:forward(batch_input)
net:set_init_pass(false)
```
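To make the reparameterization concrete, here is a minimal plain-Lua sketch (no Torch dependency; `weight_norm` is a hypothetical helper, not part of this library) of the transform that a wrapped layer applies to its weights:

```lua
-- Sketch of the weight normalization transform: w = (g / ||v||) * v,
-- where v is a direction vector and g is a learned scalar gain.
local function weight_norm(v, g)
  -- Euclidean norm of v.
  local norm = 0
  for i = 1, #v do
    norm = norm + v[i] * v[i]
  end
  norm = math.sqrt(norm)
  -- Scale the direction so the resulting weight vector has norm g.
  local w = {}
  for i = 1, #v do
    w[i] = (g / norm) * v[i]
  end
  return w
end

-- ||{3, 4}|| = 5, so with g = 2 the result is approximately {1.2, 1.6}.
local w = weight_norm({3, 4}, 2)
print(w[1], w[2])
```

In the actual module, `v` and `g` are the parameters the optimizer updates, while `w` is recomputed from them on every forward pass.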
## Examples
The examples require [torchnet](https://github.com/torchnet/torchnet) to be
installed.
### MNIST MLP
1. Download MNIST: `bash examples/mnist/download_mnist.sh`
2. Train the network: `th examples/mnist/train_mlp_mnist.lua`
## Tests
Run the tests with the following command:
`th test/run_tests.lua`