Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.


https://github.com/sameetasadullah/neural-network-implementation

Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). Different loss functions can also be chosen: cross-entropy loss, hinge loss, and mean squared error (MSE).

activation-functions adagrad adam-optimizer cross-entropy-loss gradient-descent hinge-loss jupyter-notebook leaky-relu loss-functions mean-squared-error neural-network optimizers pycharm pycharm-ide python python3 relu-activation rmsprop sigmoid-activation softmax-activation


README

# Neural Network Implementation

### Description
A `Neural Network` implemented with different `Activation Functions`, `Optimizers`, and `Loss Functions`.

### Activation Functions
- Sigmoid
- ReLU
- Leaky ReLU
- Softmax
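
For illustration, these four activations can be sketched in a few lines of NumPy. This is a minimal sketch, not the repository's actual code, and the leaky-ReLU slope `alpha=0.01` is an assumed default:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negative inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope (alpha, assumed 0.01) for x < 0
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Maps a vector of logits to a probability distribution;
    # subtracting the row max keeps the exponentials numerically stable
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)
```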

### Optimizers
- Gradient Descent
- AdaGrad
- RMSProp
- Adam
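
The parameter-update rules behind these optimizers can be summarized as below. This is an illustrative NumPy sketch with commonly used default hyperparameters (`lr`, `decay`, `beta1`, `beta2`, and `eps` are assumptions), not the repository's implementation:

```python
import numpy as np

def gradient_descent_step(w, grad, lr=0.01):
    # Plain gradient descent: step against the gradient
    return w - lr * grad

def adagrad_step(w, grad, cache, lr=0.01, eps=1e-8):
    # Accumulate squared gradients; effective learning rates shrink per parameter
    cache = cache + grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # Exponentially decaying average of squared gradients instead of a raw sum
    cache = decay * cache + (1 - decay) * grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Momentum (m) plus RMSProp-style scaling (v), with bias correction; t starts at 1
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```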

### Loss Functions
- Cross-Entropy Loss
- Hinge Loss
- Mean Squared Error (MSE)
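
As a final illustrative sketch (again not the repository's code), the three losses can be written in NumPy as follows; the classification losses assume integer class labels, and cross-entropy assumes softmax probabilities as input:

```python
import numpy as np

def cross_entropy_loss(probs, y_true):
    # probs: softmax outputs, shape (N, C); y_true: integer labels, shape (N,)
    n = probs.shape[0]
    return -np.mean(np.log(probs[np.arange(n), y_true] + 1e-12))

def hinge_loss(scores, y_true, margin=1.0):
    # Multiclass hinge loss on raw scores, shape (N, C): sum of margin violations
    n = scores.shape[0]
    correct = scores[np.arange(n), y_true][:, None]
    margins = np.maximum(0.0, scores - correct + margin)
    margins[np.arange(n), y_true] = 0.0  # no penalty for the true class itself
    return np.mean(np.sum(margins, axis=1))

def mse_loss(y_pred, y_true):
    # Mean squared error between predictions and targets
    return np.mean((y_pred - y_true) ** 2)
```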

### Contributors
- [Sameet Asadullah](https://github.com/SameetAsadullah)
- [Aysha Noor](https://github.com/ayshanoorr)