Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/u66u/lotus
ML framework for simple neural networks with optimal defaults
- Host: GitHub
- URL: https://github.com/u66u/lotus
- Owner: u66u
- Created: 2023-09-20T16:06:32.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-10-10T17:01:47.000Z (about 1 year ago)
- Last Synced: 2023-11-09T15:32:16.348Z (about 1 year ago)
- Topics: ai, artificial-intelligence, cpp, learning-resources, machine-learning, ml, neural-network, neural-networks, open-source, pytorch, tensorflow, tf, torch, transformers, tutorial
- Language: C++
- Homepage:
- Size: 62.5 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
- Set different ranges for different weight/bias inits; set default ranges for init functions.
- Test memory.
- Check iterators in layer.
- Implement different weight/bias inits for Xavier (see the sketch after this list):
  - The original paper by Glorot and Bengio suggests a variance of 2/(n_in + n_out), where n_in is the number of inputs and n_out is the number of outputs of the neuron (365datascience.com).
  - Some sources, such as the deeplearning.ai notes, suggest a variance of 1/n_in (deeplearning.ai).
  - Other sources, such as machinelearningmastery.com, suggest a uniform distribution in the range -sqrt(6)/sqrt(n_in + n_out) to sqrt(6)/sqrt(n_in + n_out) (machinelearningmastery.com).
- Bug: print statements are triggered even if the ranges are defined for init methods.
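To make the three Xavier variants above concrete, here is a minimal standalone C++ sketch. It is not part of the lotus codebase: the names `xavier_init` and `XavierVariant` are hypothetical, and the code only illustrates the variance/range formulas cited in the list.

```cpp
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// Hypothetical enum (not lotus API) naming the three Xavier variants above.
enum class XavierVariant {
    GlorotNormal,  // variance 2 / (n_in + n_out), per Glorot & Bengio
    FanInNormal,   // variance 1 / n_in, per the deeplearning.ai notes
    GlorotUniform  // uniform in [-sqrt(6)/sqrt(n_in + n_out), +sqrt(6)/sqrt(n_in + n_out)]
};

// Fills an n_in x n_out weight matrix (flattened) using the chosen variant.
std::vector<double> xavier_init(std::size_t n_in, std::size_t n_out,
                                XavierVariant variant, std::mt19937& rng) {
    std::vector<double> w(n_in * n_out);
    switch (variant) {
        case XavierVariant::GlorotNormal: {
            // stddev = sqrt(variance) = sqrt(2 / (n_in + n_out))
            std::normal_distribution<double> dist(0.0, std::sqrt(2.0 / (n_in + n_out)));
            for (auto& x : w) x = dist(rng);
            break;
        }
        case XavierVariant::FanInNormal: {
            // stddev = sqrt(1 / n_in)
            std::normal_distribution<double> dist(0.0, std::sqrt(1.0 / n_in));
            for (auto& x : w) x = dist(rng);
            break;
        }
        case XavierVariant::GlorotUniform: {
            // limit = sqrt(6) / sqrt(n_in + n_out)
            double limit = std::sqrt(6.0) / std::sqrt(static_cast<double>(n_in + n_out));
            std::uniform_real_distribution<double> dist(-limit, limit);
            for (auto& x : w) x = dist(rng);
            break;
        }
    }
    return w;
}
```

For example, a layer with 128 inputs and 64 outputs would be initialized with `xavier_init(128, 64, XavierVariant::GlorotUniform, rng)`, keeping activation variance roughly constant across layers, which is the motivation behind all three variants.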