https://github.com/parasdahal/deepnet
Educational deep learning library in plain Numpy.
- Host: GitHub
- URL: https://github.com/parasdahal/deepnet
- Owner: parasdahal
- License: MIT
- Created: 2017-06-03T10:25:02.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2022-06-21T21:13:01.000Z (over 3 years ago)
- Last Synced: 2024-08-08T23:23:56.476Z (about 1 year ago)
- Topics: adagrad, adam-optimizer, batch-normalization, cnn, dropout, nesterov-accelerated-sgd
- Language: Python
- Homepage: https://deepnotes.io/implementing-cnn
- Size: 40 KB
- Stars: 320
- Watchers: 17
- Forks: 83
- Open Issues: 6
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# deepnet
Implementations of CNNs, RNNs, and new techniques in deep learning.
Note: deepnet is a work in progress and features will be added gradually. It is not intended for production use; use it to learn and study implementations of the latest techniques in deep learning.
## What does it have?
**Network Architecture**
1. Convolutional net
2. Feed forward net
3. Recurrent net (LSTM/GRU coming soon)
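To give a feel for what a plain-NumPy network looks like, here is a minimal sketch of a feed-forward net's forward pass. The function names and layer layout are illustrative only, not deepnet's actual API:

```python
import numpy as np

def init_layer(n_in, n_out, rng):
    """One fully connected layer with small random weights."""
    W = rng.standard_normal((n_in, n_out)) * 0.01
    b = np.zeros(n_out)
    return W, b

def forward(x, layers):
    """Forward pass through a stack of (W, b) layers with ReLU in between."""
    for W, b in layers[:-1]:
        x = np.maximum(0, x @ W + b)   # affine transform + ReLU
    W, b = layers[-1]
    return x @ W + b                   # final layer left linear (logits)

rng = np.random.default_rng(0)
layers = [init_layer(4, 8, rng), init_layer(8, 3, rng)]
logits = forward(rng.standard_normal((2, 4)), layers)
print(logits.shape)  # (2, 3)
```

The convolutional and recurrent nets in the repo follow the same pattern, with convolution or time-step recurrence replacing the plain affine transform.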
**Optimization Algorithms**
1. SGD
2. SGD with momentum
3. Nesterov Accelerated Gradient
4. Adagrad
5. RMSprop
6. Adam
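As an illustration of the update rules in this family, here is a NumPy sketch of a single Adam step, applied to a toy quadratic objective. The function name and the toy problem are assumptions for the example, not deepnet's interface:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moving averages of gradient and squared gradient,
    with bias correction for the early steps."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)          # bias-corrected first moment
    v_hat = v / (1 - beta2**t)          # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# minimize f(w) = w^2 (gradient 2w), starting from w = 1.0
w, m, v = np.array(1.0), np.array(0.0), np.array(0.0)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.01)
print(float(w))
```

SGD, momentum, Nesterov, Adagrad, and RMSprop differ only in how they turn `grad` into a step; the same driving loop applies.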
**Regularization**
1. Dropout
2. L1 and L2 Regularization
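Dropout is small enough to sketch in full. The snippet below shows the common "inverted dropout" variant, where activations are rescaled at train time so no change is needed at test time; the function name is illustrative, not the repo's API:

```python
import numpy as np

def dropout_forward(x, p_keep, rng, train=True):
    """Inverted dropout: zero units with probability 1 - p_keep and
    rescale the survivors by 1 / p_keep so the expected value is unchanged."""
    if not train:
        return x, None
    mask = (rng.random(x.shape) < p_keep) / p_keep
    return x * mask, mask

rng = np.random.default_rng(0)
x = np.ones((1000, 100))
out, mask = dropout_forward(x, p_keep=0.8, rng=rng)
print(out.mean())  # close to 1.0 in expectation
```

The saved `mask` is reused in the backward pass to route gradients only through the units that were kept.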
**Cool Techniques**
1. BatchNorm
2. Xavier Weight Initialization
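Both techniques are a few lines in NumPy. Below is a sketch of a batch-norm forward pass and a Xavier (Glorot) weight initializer; function names are assumptions for the example:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then scale and shift
    with the learnable parameters gamma and beta."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def xavier_init(n_in, n_out, rng):
    """Xavier/Glorot uniform initialization: variance scaled by fan-in + fan-out."""
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 10)) * 5 + 3        # batch with large mean and variance
out = batchnorm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(), out.std())                     # mean ~ 0, std ~ 1
```

A full implementation also keeps running averages of `mu` and `var` for use at test time, where batch statistics are unavailable.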
**Nonlinearities**
1. ReLU
2. Sigmoid
3. tanh
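These three activations are one-liners in NumPy; a quick sketch for reference (plain definitions, not deepnet's source):

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)          # max(0, x), elementwise

def sigmoid(x):
    return 1 / (1 + np.exp(-x))      # squashes to (0, 1)

def tanh(x):
    return np.tanh(x)                # squashes to (-1, 1)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x), sigmoid(x), tanh(x))
```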
## Usage
1. `virtualenv .env` ; create a virtual environment
2. `source .env/bin/activate` ; activate the virtual environment
3. `pip install -r requirements.txt` ; install the dependencies
4. `python run_cnn.py {mnist|cifar10}` ; `mnist` trains the shallow CNN, `cifar10` the deep CNN