https://github.com/null-none/pyignite
Simple deep learning library
- Host: GitHub
- URL: https://github.com/null-none/pyignite
- Owner: null-none
- License: MIT
- Created: 2025-01-01T09:03:07.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-01T09:23:20.000Z (about 1 year ago)
- Last Synced: 2025-01-01T10:29:35.071Z (about 1 year ago)
- Topics: deep-learning, machine-learning, ml, numpy, python
- Language: Python
- Homepage:
- Size: 5.86 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# pyIgnite
Simple deep learning library
### Install
```bash
pip install numpy
```
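Since the example below imports from `src.loss`, `src.optimizer`, and `src.simple_net`, the library appears to be used from a checkout of the repository rather than an installed package; presumably you clone https://github.com/null-none/pyignite and run the example from the repository root, with `numpy` as the only dependency.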
### Example
```python
import numpy as np
from src.loss import CrossEntropyLoss
from src.optimizer import SGD
from src.simple_net import SimpleNet

input = np.array(
    [
        [0.99197708, -0.77980023, -0.8391331, -0.41970686, 0.72636492],
        [0.85901409, -0.22374584, -1.95850625, -0.81685145, 0.96359871],
        [-0.42707937, -0.50053309, 0.34049477, 0.62106931, -0.76039365],
        [0.34206742, 2.15131285, 0.80851759, 0.28673013, 0.84706839],
        [-1.70231094, 0.36473216, 0.33631525, -0.92515589, -2.57602677],
    ]
)  # batch of 5 samples with 5 features each
target = np.array([[1, 0]])  # one-hot target

loss_fn = CrossEntropyLoss()
model = SimpleNet()
optim = SGD(model.parameters(), learning_rate=0.01)

for i in range(100):
    output = model(input)           # forward pass
    loss = loss_fn(output, target)  # compute cross-entropy loss
    loss.backward()                 # backpropagate gradients
    optim.step()                    # update parameters with SGD
    if i % 20 == 0:
        print(loss.loss, i)
```
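Because the loop logs the loss every 20 iterations (i = 0, 20, 40, 60, 80), the printed values should trend downward if training converges. A minimal follow-up sketch for inspecting predictions after training, assuming `model(input)` returns per-row class scores (the README does not document the output shape, so treat this as an assumption):
```python
# Hedged sketch: inspect the trained model's predictions.
# Assumes the forward pass returns class scores per input row
# (not documented in this README).
scores = model(input)
predicted = np.argmax(scores, axis=-1)  # index of the highest score
print(predicted)
```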