https://github.com/adelsz/oxigrad
Toy autodifferentiation engine in Rust
- Host: GitHub
- URL: https://github.com/adelsz/oxigrad
- Owner: adelsz
- License: MIT
- Created: 2024-02-01T22:22:50.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-02-01T22:25:55.000Z (over 1 year ago)
- Last Synced: 2024-11-24T17:34:14.475Z (7 months ago)
- Language: Rust
- Size: 7.42 MB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Oxigrad - toy automatic differentiation engine in Rust
This is a self-contained toy automatic differentiation engine written in Rust.
Built for learning purposes and partially inspired by [micrograd](https://github.com/karpathy/micrograd).
The API is focused on ease of use and educational clarity; performance was not a priority.

## Usage
```rust
#[test]
fn backprop_test() {
    let a = Value::new(2.0);
    let b = Value::new(1.0);
    // z = (a * b + a) * (a * b + b) = a^2 * b^2 + a^2 * b + a * b^2 + a * b
    let z = (&a * &b + &a) * (&a * &b + &b);
    backprop(&z);
    // dz/da = 2ab^2 + 2ab + b^2 + b = 2*2*1 + 2*2 + 1 + 1 = 10
    assert_eq!(a.grad().get(), 10.0);
}
```

## Basic neural network training
A basic neural-network training example lives in `src/neuron.rs`; it also saves a visualization of training progress as a GIF.