https://github.com/kinoute/elyane
An OOP deep neural network with a Keras-like syntax, offering many hyper-parameters, optimizers, and activation functions.
- Host: GitHub
- URL: https://github.com/kinoute/elyane
- Owner: kinoute
- Created: 2019-06-25T16:59:55.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2019-07-08T07:25:24.000Z (almost 6 years ago)
- Last Synced: 2025-05-13T01:12:29.805Z (21 days ago)
- Topics: adam-optimizer, deep-learning, dropout, l2-regularization, momentum, neural-network, oop, python, rmsprop, softmax
- Language: Python
- Homepage:
- Size: 41.8 MB
- Stars: 4
- Watchers: 1
- Forks: 3
- Open Issues: 0
Metadata Files:
- Readme: README.md
# A Keras-Like Deep Neural Network Abstraction
An OOP deep neural network with a Keras-like syntax, offering many hyper-parameters, optimizers, and activation functions.
## Installation
Just clone the repository to your computer to get started:
```sh
git clone git@github.com:kinoute/Elyane.git
cd Elyane
```

The only dependencies required to make this neural network work are `numpy` and `pdoc3`. If you don't have them already, they can be installed from the main directory:
```sh
pip3 install -r requirements.txt
```

Three examples are available: `mnist.py`, `fashion_mnist.py`, and `xor.py`.
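As a purely illustrative sketch of the Keras-like flow the project aims for (the class and method names below are assumptions, not the actual API; check the bundled examples for the real syntax):

```python
# Hypothetical usage sketch only -- these names are NOT the repository's
# real API. Refer to mnist.py, fashion_mnist.py or xor.py for the actual
# syntax; this only conveys the intended Keras-like style.
#
# net = Network(loss="multi_class_cross_entropy", optimizer="adam")
# net.add(FullyConnectedLayer(128, activation="relu"))
# net.add(SoftmaxLayer(10))
# net.fit(x_train, y_train, epochs=10, batch_size=64)
# predictions = net.predict(x_test)
```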
## Features
### Batches
* Batch Gradient Descent
* Mini-Batch Gradient Descent
* Stochastic Gradient Descent (SGD)
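The three strategies differ only in how many samples feed each gradient step: all of them, a small batch, or a single one. A minimal NumPy sketch of the shared batching logic (illustrative, not the repository's code):

```python
import numpy as np

def minibatches(X, y, batch_size, rng):
    """Yield shuffled mini-batches. batch_size=1 gives SGD;
    batch_size=len(X) gives full-batch gradient descent."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = rng.integers(0, 2, size=100)
for xb, yb in minibatches(X, y, batch_size=32, rng=rng):
    pass  # one forward/backward pass and parameter update per batch
```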
### Layers
* Fully-Connected Layer
* Softmax Layer
* Dropout Layer
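For reference, the forward pass of a fully-connected layer is a single affine map followed by an activation; a minimal NumPy sketch, independent of the repository's layer classes:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 784))            # a batch of 32 flattened inputs
W = rng.normal(size=(784, 128)) * 0.01    # small random weights
b = np.zeros(128)

z = x @ W + b                             # affine forward pass
a = np.maximum(0.0, z)                    # e.g. a ReLU activation on top
```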
### Optimizers
* Momentum
* RMSprop
* Adam
* AMSGrad
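Adam combines Momentum's running average of gradients with RMSprop's running average of squared gradients; below is the standard textbook update step as a sketch (hyper-parameter names and defaults follow the usual conventions, not necessarily the repository's):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update; t is the 1-based step counter."""
    m = b1 * m + (1 - b1) * grad          # first moment (Momentum part)
    v = b2 * v + (1 - b2) * grad ** 2     # second moment (RMSprop part)
    m_hat = m / (1 - b1 ** t)             # bias corrections
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

AMSGrad differs only in keeping the element-wise maximum of all past second-moment estimates instead of the current one.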
### Losses
* Mean Squared Error (MSE)
* Mean Absolute Error (MAE)
* Cross Entropy
* Multi-Class Cross Entropy
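As a quick reference, the multi-class cross entropy is the mean negative log-likelihood of the true class; a standalone NumPy sketch of the standard formula:

```python
import numpy as np

def multiclass_cross_entropy(y_true, y_pred, eps=1e-12):
    """y_true is one-hot; y_pred holds predicted probabilities per class."""
    return -np.mean(np.sum(y_true * np.log(y_pred + eps), axis=1))

y_true = np.array([[0, 1, 0]])
y_pred = np.array([[0.1, 0.8, 0.1]])
print(multiclass_cross_entropy(y_true, y_pred))  # -log(0.8) ~ 0.223
```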
### Regularizers
* L1 Regularization
* L2 Regularization
* Dropout
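In their usual textbook forms (the repository's exact conventions may differ), L2 adds a squared-weight penalty to the loss, while inverted dropout zeroes random activations at train time and rescales the survivors:

```python
import numpy as np

rng = np.random.default_rng(0)

# L2 regularization: penalty term added to the loss.
lam = 1e-4
W = rng.normal(size=(128, 10))
l2_penalty = lam * np.sum(W ** 2)

# Inverted dropout: drop with probability p, rescale by 1/(1-p)
# so no change is needed at inference time.
p = 0.5
a = rng.normal(size=(32, 128))
mask = (rng.random(a.shape) >= p) / (1 - p)
a_train = a * mask
```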
### Activation Functions
* Sigmoid
* TanH
* ReLU
* Leaky ReLU
* Softmax
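All five are one-liners in NumPy; this sketch uses the standard definitions, independent of the repository's classes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    """Row-wise softmax, shifted by the row max for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```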
### Tools
* One-hot encoding
* Normalization of images
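Both tools have standard one-line implementations; a NumPy sketch under the usual assumptions (integer class labels, 8-bit pixel values):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Map integer labels to one-hot rows, e.g. 2 -> [0, 0, 1]."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

print(one_hot(np.array([0, 2, 1]), num_classes=3))

# Image normalization: scale 8-bit pixel values into [0, 1].
images = np.random.randint(0, 256, size=(10, 28, 28), dtype=np.uint8)
images_norm = images.astype(np.float32) / 255.0
```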