Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/sameetasadullah/neural-network-implementation
Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (Gradient Descent, AdaGrad, RMSProp, Adam). You can also choose between different loss functions: cross-entropy loss, hinge loss, and mean squared error (MSE).
activation-functions adagrad adam-optimizer cross-entropy-loss gradient-descent hinge-loss jupyter-notebook leaky-relu loss-functions mean-squared-error neural-network optimizers pycharm pycharm-ide python python3 relu-activation rmsprop sigmoid-activation softmax-activation
Last synced: 26 days ago
JSON representation
Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (Gradient Descent, AdaGrad, RMSProp, Adam). You can also choose between different loss functions: cross-entropy loss, hinge loss, and mean squared error (MSE).
- Host: GitHub
- URL: https://github.com/sameetasadullah/neural-network-implementation
- Owner: SameetAsadullah
- License: MIT
- Created: 2022-08-15T19:24:57.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2022-08-15T19:29:19.000Z (about 2 years ago)
- Last Synced: 2024-09-28T11:04:02.836Z (about 1 month ago)
- Topics: activation-functions, adagrad, adam-optimizer, cross-entropy-loss, gradient-descent, hinge-loss, jupyter-notebook, leaky-relu, loss-functions, mean-squared-error, neural-network, optimizers, pycharm, pycharm-ide, python, python3, relu-activation, rmsprop, sigmoid-activation, softmax-activation
- Language: Jupyter Notebook
- Homepage:
- Size: 5.86 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
Neural Network Implementation
### Description
`Neural Network` implemented with different `Activation Functions`, `Optimizers`, and `Loss Functions`.

### Activation Functions
- Sigmoid
- ReLU
- Leaky ReLU
- Softmax
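The README does not include the source itself, so as a rough illustration, a minimal NumPy sketch of these four activations might look like the following (function names and signatures are assumptions, not the project's actual API):

```python
import numpy as np

def sigmoid(z):
    # Squash inputs to the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Zero out negative inputs
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Like ReLU, but with a small slope for negative inputs
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    # Row-wise probabilities; subtract the max for numerical stability
    shifted = z - np.max(z, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)
```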
### Optimizers
- Gradient Descent
- AdaGrad
- RMSProp
- Adam
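As with the activations, a minimal sketch of the corresponding parameter-update rules could look like this (generic NumPy helpers written for illustration, not the repository's own code); a caller would keep the per-parameter state (`cache`, `m`, `v`, step counter `t`) between updates:

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Plain gradient descent update
    return w - lr * grad

def adagrad_step(w, grad, cache, lr=0.01, eps=1e-8):
    # Accumulate squared gradients and scale the step per parameter
    cache = cache + grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # Exponential moving average of squared gradients
    cache = decay * cache + (1 - decay) * grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Bias-corrected first and second moment estimates (t starts at 1)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```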
### Loss Functions
- Cross-Entropy Loss
- Hinge Loss
- Mean Squared Error (MSE)
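For reference, the three loss functions could be sketched in NumPy as follows (shapes and function names are assumptions for illustration only):

```python
import numpy as np

def cross_entropy_loss(probs, y_true):
    # probs: softmax outputs (N, C); y_true: one-hot labels (N, C)
    eps = 1e-12
    return -np.mean(np.sum(y_true * np.log(probs + eps), axis=1))

def hinge_loss(scores, y_true):
    # Multi-class hinge (SVM) loss; scores: (N, C); y_true: integer labels (N,)
    n = scores.shape[0]
    correct = scores[np.arange(n), y_true][:, None]
    margins = np.maximum(0.0, scores - correct + 1.0)
    margins[np.arange(n), y_true] = 0.0  # ignore the correct class
    return np.mean(np.sum(margins, axis=1))

def mse_loss(y_pred, y_true):
    # Mean squared error over all elements
    return np.mean((y_pred - y_true) ** 2)
```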
### Contributors
- [Sameet Asadullah](https://github.com/SameetAsadullah)
- [Aysha Noor](https://github.com/ayshanoorr)