https://github.com/unvercan/activation-function-comparison-pytorch
Comparison of common activation functions on the MNIST dataset using PyTorch.
- Host: GitHub
- URL: https://github.com/unvercan/activation-function-comparison-pytorch
- Owner: unvercan
- License: MIT
- Created: 2019-10-28T19:51:16.000Z (over 6 years ago)
- Default Branch: main
- Last Pushed: 2022-02-09T09:41:21.000Z (almost 4 years ago)
- Last Synced: 2025-06-13T19:03:57.694Z (8 months ago)
- Topics: activation-functions, dataset, deep-learning, image-classification, machine-learning, matplotlib, mnist, neural-network, numpy, python, pytorch, relu, sigmoid, tanh
- Language: Python
- Homepage:
- Size: 2.24 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# activation-functions-comparison-pytorch
Comparison of common activation functions on the MNIST dataset using PyTorch.
## Activation functions:
- ReLU
- Sigmoid
- Tanh
## Best result: ReLU
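Below is a minimal sketch of the kind of experiment the repository describes: training the same small PyTorch classifier on MNIST with each activation function and comparing test accuracy. The network size, optimizer, learning rate, and epoch count here are assumptions chosen for illustration, not values taken from the repository's code.

```python
# Hedged sketch: hyperparameters and model architecture are assumptions,
# not taken from the repository's implementation.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


def make_model(activation):
    # Simple MLP for 28x28 MNIST images; the hidden size (128) is an assumption.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 128),
        activation,
        nn.Linear(128, 10),
    )


def run(activation, epochs=2, device="cpu"):
    transform = transforms.ToTensor()
    train_set = datasets.MNIST("data", train=True, download=True, transform=transform)
    test_set = datasets.MNIST("data", train=False, download=True, transform=transform)
    train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
    test_loader = DataLoader(test_set, batch_size=256)

    model = make_model(activation).to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Train for a few epochs with the given activation function.
    for _ in range(epochs):
        model.train()
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()

    # Evaluate test accuracy for this activation function.
    model.eval()
    correct = 0
    with torch.no_grad():
        for images, labels in test_loader:
            images, labels = images.to(device), labels.to(device)
            correct += (model(images).argmax(dim=1) == labels).sum().item()
    return correct / len(test_set)


if __name__ == "__main__":
    for name, activation in [("ReLU", nn.ReLU()), ("Sigmoid", nn.Sigmoid()), ("Tanh", nn.Tanh())]:
        print(f"{name}: test accuracy = {run(activation):.4f}")
```

Keeping the architecture, optimizer, and training schedule identical across runs isolates the activation function as the only variable, which is the premise of the comparison.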