https://github.com/sloganking/micrograd-rust
A tiny scalar-valued autograd engine and a neural net library on top of it with a PyTorch-like API
autograd neural-network
- Host: GitHub
- URL: https://github.com/sloganking/micrograd-rust
- Owner: sloganking
- Created: 2024-03-21T23:54:24.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2024-03-30T05:22:07.000Z (over 1 year ago)
- Last Synced: 2025-03-22T14:43:38.144Z (7 months ago)
- Topics: autograd, neural-network
- Language: Rust
- Homepage:
- Size: 65.4 KB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
README
# micrograd-rust
Building my own neural networks in Rust. Inspired by Andrej Karpathy's [micrograd](https://github.com/karpathy/micrograd) and the [related video](https://youtu.be/VMj-3S1tku0?si=0AJEx-81hEmTKzqf). The code was built piece by piece from the ground up by me. I also took inspiration from [rustygrad](https://github.com/Mathemmagician/rustygrad) when initially figuring out how to make the Rust borrow checker happy with a DAG.

## Installation
- You must have [graphviz](https://graphviz.org/download/) installed for the graph PNGs to be generated.

## Examples
### A simple equation
```rust
let a = Value::from(3.0);
let b = Value::from(4.0);
let c = Value::from(5.0);
let d = (a - b) * c;
d.backward();
graph::render_graph(&d).unwrap();
```
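After `backward()`, each leaf holds its gradient. For `d = (a - b) * c` the chain rule gives `∂d/∂a = c`, `∂d/∂b = -c`, and `∂d/∂c = a - b`, so with the values above the engine should report gradients of `5`, `-5`, and `-1`. A quick hand check of those numbers in plain Rust, independent of the crate's `Value` type:

```rust
// Gradients of d = (a - b) * c, written out by the chain rule:
// ∂d/∂a = c, ∂d/∂b = -c, ∂d/∂c = a - b.
fn grads(a: f64, b: f64, c: f64) -> (f64, f64, f64) {
    let _ = a; // a only enters ∂d/∂c
    (c, -c, a - b)
}

fn main() {
    let (da, db, dc) = grads(3.0, 4.0, 5.0);
    println!("∂d/∂a = {da}, ∂d/∂b = {db}, ∂d/∂c = {dc}");
}
```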
### A single neuron with 3 inputs
```rust
let inputs = vec![Value::from(1.0), Value::from(2.0), Value::from(3.0)];
let neuron = neural::Neuron::new(inputs.len().try_into().unwrap());
let out = neuron.forward(inputs);
out.backward();
graph::render_graph(&out, neuron.get_subgraph_tree().unwrap()).unwrap();
```
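Under the hood, a single neuron computes a weighted sum of its inputs plus a bias, then squashes it through an activation (micrograd uses `tanh`; this crate's `Neuron` presumably does something similar, though its exact activation is an assumption here). A minimal sketch in plain Rust with fixed weights for illustration (the crate initializes them randomly):

```rust
// Sketch of a neuron's forward pass: out = tanh(w · x + b).
// The tanh activation is assumed, following micrograd.
fn neuron_forward(weights: &[f64], bias: f64, inputs: &[f64]) -> f64 {
    let pre_activation: f64 = weights
        .iter()
        .zip(inputs)
        .map(|(w, x)| w * x)
        .sum::<f64>()
        + bias;
    pre_activation.tanh()
}

fn main() {
    // 3 inputs, 3 weights, zero bias.
    let out = neuron_forward(&[0.5, -0.25, 0.1], 0.0, &[1.0, 2.0, 3.0]);
    println!("out = {out}");
}
```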
### A [4,4,1] layer MLP
```rust
let x_inputs = vec![Value::from(1.0), Value::from(2.0), Value::from(3.0)];
let y_target = Value::from(1.0);
let mlp = neural::MLP::new(x_inputs.len().try_into().unwrap(), vec![4, 4, 1]);
let preds = mlp.forward(x_inputs);
let pred = preds[0].clone();
let loss = (pred.clone() - y_target.clone()).pow(Value::from(2.0));
loss.backward();
graph::render_graph(&loss, mlp.get_subgraph_tree().unwrap()).unwrap();
```
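The MLP example above stops at a single backward pass; actual training repeats forward → loss → `backward()` → nudge each parameter against its gradient. A hedged sketch of that loop on a single scalar weight, with the loss `(w·x − y)²` and its gradient written out by hand rather than via the crate's `Value` type:

```rust
// Gradient-descent loop on loss = (w*x - y)^2 for one weight w.
// d(loss)/dw = 2 * (w*x - y) * x; each step moves w against the gradient.
fn train(mut w: f64, x: f64, y: f64, lr: f64, steps: usize) -> f64 {
    for _ in 0..steps {
        let grad = 2.0 * (w * x - y) * x;
        w -= lr * grad;
    }
    w
}

fn main() {
    // With x = 2, y = 1, w converges toward y / x = 0.5.
    let w = train(0.0, 2.0, 1.0, 0.05, 50);
    println!("learned w ≈ {w}");
}
```

An MLP training loop would do the same thing, except the gradients come from `loss.backward()` and the update runs over every weight and bias in the network.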
