Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.


https://github.com/banyc/neural_network

vanilla, simple, node-oriented, compositive, optimized, frameworkn't{torchn't, TFn't, candlen't}

cnn computational-graph data-science from-scratch genetic-algorithm gyatt machine-learning mnist neural-network rust transformer

Last synced: 2 months ago



README


# Neural Network

## MNIST

- steps:
  1. Download the MNIST dataset to:
     - TRAIN_IMAGE: `local/mnist/train-images.idx3-ubyte`
     - TRAIN_LABEL: `local/mnist/train-labels.idx1-ubyte`
     - TEST_IMAGE: `local/mnist/t10k-images.idx3-ubyte`
     - TEST_LABEL: `local/mnist/t10k-labels.idx1-ubyte`
  1. Run:
     ```sh
     cargo test --release -- --include-ignored --nocapture mnist::train
     ```
  1. Inspect the trained parameters at: `local/mnist/params.ron`
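The IDX files above use a simple binary layout: a big-endian magic number, a big-endian item count, then the raw data bytes. As a minimal sketch (not the repo's actual loader; the function name is hypothetical), a label file (`idx1-ubyte`, magic `0x0801`) could be parsed like this:

```rust
// Hedged sketch of parsing the MNIST IDX1 label format:
// 4-byte big-endian magic (0x00000801), 4-byte big-endian count,
// then one u8 label per item. Not the repo's actual API.
fn parse_idx1_labels(bytes: &[u8]) -> Result<Vec<u8>, String> {
    if bytes.len() < 8 {
        return Err("truncated header".into());
    }
    let magic = u32::from_be_bytes([bytes[0], bytes[1], bytes[2], bytes[3]]);
    if magic != 0x0000_0801 {
        return Err(format!("bad magic: {magic:#x}"));
    }
    let count = u32::from_be_bytes([bytes[4], bytes[5], bytes[6], bytes[7]]) as usize;
    let body = &bytes[8..];
    if body.len() != count {
        return Err("length mismatch".into());
    }
    Ok(body.to_vec())
}

fn main() {
    // A tiny in-memory file with two labels, 7 and 2 (values chosen for illustration).
    let data = [0, 0, 8, 1, 0, 0, 0, 2, 7, 2];
    let labels = parse_idx1_labels(&data).unwrap();
    assert_eq!(labels, vec![7, 2]);
    println!("labels = {labels:?}");
}
```

Image files (`idx3-ubyte`, magic `0x0803`) add two more big-endian u32 fields for row and column counts before the pixel bytes; the same header-then-payload pattern applies.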

## Backpropagation

- distribution of addends of $\frac{\partial G}{\partial f_1}$:
![](img/backpropagation.svg)
- a part of the computation graph
- $h_i : \mathbb{R}^{m_i} \to \mathbb{R}$
- $f_j : \mathbb{R}^{n_j} \to \mathbb{R}$
- the $h_i$ are the successors of the $f_j$
- $G$ is the outermost function, represented by the root node of the computation graph
- $w$ denotes the tunable parameters of $f_1$
- steps:
  1. nodes $h_1, h_2$ each calculate their addend $\frac{\partial G}{\partial h_i} \frac{\partial h_i}{\partial f_1}$
  1. nodes $h_1, h_2$ distribute the addends to $f_1, f_2$
  1. node $f_1$ sums the received addends to obtain $\frac{\partial G}{\partial f_1}$
  1. node $f_1$ calculates $\frac{\partial G}{\partial w}$ from $\frac{\partial G}{\partial f_1}$ via the chain rule
  1. node $f_1$ updates $w$ using $\frac{\partial G}{\partial w}$
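The steps above rest on the chain rule: $\frac{\partial G}{\partial f_1} = \sum_i \frac{\partial G}{\partial h_i} \frac{\partial h_i}{\partial f_1}$, so each successor contributes one addend and $f_1$ only has to sum what it receives. A minimal sketch of that accumulate-then-update cycle (the struct and method names are hypothetical, not the repo's actual API):

```rust
// Hedged sketch of gradient-addend accumulation at a single node f1.
// Each successor h_i pushes its addend (dG/dh_i * dh_i/df1); f1 sums them.
struct Node {
    addends: Vec<f64>, // addends of dG/df1 received from successors (steps 1-2)
    w: f64,            // tunable parameter of f1
}

impl Node {
    // Step 3: sum the received addends to obtain dG/df1.
    fn grad_wrt_output(&self) -> f64 {
        self.addends.iter().sum()
    }

    // Step 4: chain rule, dG/dw = dG/df1 * df1/dw.
    fn grad_wrt_param(&self, df1_dw: f64) -> f64 {
        self.grad_wrt_output() * df1_dw
    }

    // Step 5: plain gradient-descent update of w.
    fn update(&mut self, df1_dw: f64, lr: f64) {
        self.w -= lr * self.grad_wrt_param(df1_dw);
    }
}

fn main() {
    // Successors h1, h2 have already pushed their addends (values illustrative).
    let mut f1 = Node { addends: vec![0.5, -0.25], w: 1.0 };
    let df1_dw = 2.0; // local derivative df1/dw, assumed precomputed
    println!("dG/df1 = {}", f1.grad_wrt_output()); // prints 0.25
    f1.update(df1_dw, 0.5);
    println!("updated w = {}", f1.w); // 1.0 - 0.5 * (0.25 * 2.0) = 0.75
}
```

Summing addends lazily like this lets a node wait until every successor has reported before computing its own gradient, which matches the distribute-then-accumulate flow of the steps above.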

## References

- the repo on which mine is based -