https://github.com/rudrodip/timmygrad
scalar value gradient descent optimizer
- Host: GitHub
- URL: https://github.com/rudrodip/timmygrad
- Owner: rudrodip
- Created: 2024-05-01T11:51:01.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-05-02T15:08:57.000Z (over 1 year ago)
- Last Synced: 2025-03-15T08:55:28.027Z (7 months ago)
- Topics: backpropagation, gradient-descent, multilayer-perceptron-network, neural-network, numpy, python3, pytorch
- Language: Python
- Homepage:
- Size: 327 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Timmygrad
Timmygrad is a scalar value gradient descent optimizer for Python. It is designed to be simple and easy to use, with a focus on readability and understandability rather than performance: the goal is educational, not production use.
this is timmy btw
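Before the library example, it may help to see what "scalar value gradient descent" means with no autograd involved: repeatedly nudge a number in the direction opposite its derivative. A minimal hand-derived sketch (not timmygrad code), minimizing f(x) = (x - 3)**2:

```python
# Plain gradient descent on one scalar: minimize f(x) = (x - 3)**2.
# The derivative f'(x) = 2 * (x - 3) is computed by hand here;
# timmygrad's job is to compute such derivatives automatically.
x = 0.0
alpha = 0.1  # learning rate
for _ in range(100):
    grad = 2 * (x - 3)  # hand-derived f'(x)
    x -= alpha * grad   # step downhill
# x is now very close to the minimizer at 3
```

Each step shrinks the distance to the minimum by a constant factor (here 1 - 2 * alpha = 0.8), so convergence is geometric.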
Here is a simple Linear Regression example using Timmygrad:
```python
m = Value(0.0)
c = Value(0.0)

alpha = 0.01  # learning rate
epochs = 200

for epoch in range(epochs):
    for x, y in zip(X, Y):
        # forward pass
        y_pred = m * x + c

        # compute loss
        loss = (y - y_pred) ** 2

        # backward pass
        loss.backward()

        # update weights
        m.data -= alpha * m.grad
        c.data -= alpha * c.grad

        # reset gradients
        m.grad = 0
        c.grad = 0
```
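The example relies on a `Value` class that records each operation and backpropagates gradients through the resulting graph. Timmygrad's actual implementation is not shown on this page; a minimal micrograd-style sketch supporting just the operations used above (`+`, `-`, `*`, `**`) might look like this:

```python
# Illustrative sketch of a scalar autograd Value class, NOT timmygrad's
# actual code. Each arithmetic op builds a graph node and stashes a
# closure that applies the chain rule for that op.
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # set by the op that created this node
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def __pow__(self, k):
        out = Value(self.data ** k, (self,))
        def _backward():  # d(a**k)/da = k * a**(k-1)
            self.grad += k * (self.data ** (k - 1)) * out.grad
        out._backward = _backward
        return out

    __radd__ = __add__
    __rmul__ = __mul__

    def __sub__(self, other):
        return self + (-1) * other

    def __rsub__(self, other):  # handles `float - Value`, as in (y - y_pred)
        return (-1) * self + other

    def backward(self):
        # Topologically sort the graph, then apply the chain rule
        # from the output back to every leaf.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()
```

With data such as `X = [1, 2, 3]` and `Y = [3, 5, 7]`, running the training loop above drives `m` toward 2 and `c` toward 1.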