Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/scitator/kittylyst
A tiny Catalyst-like experiment runner framework on top of micrograd.
- Host: GitHub
- URL: https://github.com/scitator/kittylyst
- Owner: Scitator
- License: mit
- Created: 2020-09-26T16:02:15.000Z (about 4 years ago)
- Default Branch: master
- Last Pushed: 2021-01-18T16:01:58.000Z (almost 4 years ago)
- Last Synced: 2024-11-02T02:33:41.579Z (11 days ago)
- Language: Jupyter Notebook
- Homepage:
- Size: 555 KB
- Stars: 52
- Watchers: 6
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
# Kittylyst
![kitty](assets/kitty.jpg)
A tiny [Catalyst](https://github.com/catalyst-team/catalyst)-like
experiment runner framework on top of
[micrograd](https://github.com/karpathy/micrograd).

Implements `Experiment`, `Runner`, and `Callback` Catalyst-core abstractions
and has extra [PyTorch](https://github.com/pytorch)-like micrograd modules -
`MicroLoader`, `MicroCriterion`, `MicroOptimizer`, and `MicroScheduler`.

Every module is tiny, with about 100 lines of code (even this readme).
However, this is enough to make `Kittylyst` easily extendable
for any number of data sources and support multi-stage experiments,
as the demo notebook shows.

Potentially useful for educational purposes.
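The callback-driven control flow behind these abstractions can be sketched in a few lines of plain Python. This is an illustrative sketch, not Kittylyst's actual API; the class and method names here are hypothetical:

```python
# Illustrative sketch of a callback-driven runner loop (hypothetical
# names, not Kittylyst's actual API): callbacks hook into epoch
# boundaries, so behavior is extended by adding callbacks, not by
# editing the loop.

class Callback:
    def on_epoch_start(self, runner): pass
    def on_epoch_end(self, runner): pass

class LoggerCallback(Callback):
    def on_epoch_end(self, runner):
        runner.log.append(f"epoch {runner.epoch} done")

class Runner:
    def __init__(self, num_epochs, callbacks):
        self.num_epochs = num_epochs
        self.callbacks = callbacks
        self.epoch = 0
        self.log = []

    def run(self):
        for self.epoch in range(self.num_epochs):
            for cb in self.callbacks:
                cb.on_epoch_start(self)
            # ... forward/backward/optimizer step would go here ...
            for cb in self.callbacks:
                cb.on_epoch_end(self)

runner = Runner(num_epochs=2, callbacks=[LoggerCallback()])
runner.run()
print(runner.log)  # ['epoch 0 done', 'epoch 1 done']
```

The design choice mirrors Catalyst: the runner owns the loop skeleton, while all experiment-specific logic (loss, metrics, logging) lives in small, composable callbacks.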
### Installation
```bash
pip install kittylyst
```

### Example usage
```python
from micrograd.nn import MLP
import kittylyst as kt

loaders = {"train": kt.MicroLoader(...), "valid": kt.MicroLoader(...)}
model = MLP(2, [16, 16, 1])
criterion = kt.MicroCriterion()
optimizer = kt.MicroOptimizer(model)
scheduler = kt.MicroScheduler(optimizer, num_epochs=10)
experiment = kt.Experiment(
    model=model,
    criterion=criterion,
    optimizer=optimizer,
    scheduler=scheduler,
    loaders=loaders,
    num_epochs=10,
    callbacks=[
        kt.CriterionCallback(),
        kt.AccuracyCallback(),
        kt.OptimizerCallback(),
        kt.SchedulerCallback(),
        kt.LoggerCallback(),
    ],
    verbose=True,
)

kt.SupervisedRunner().run_experiment(experiment)
```

### Running an experiment
The notebook `demo.ipynb` provides a full demo of
running an `Experiment` with `SupervisedRunner`
for a binary classification task.
This is achieved by training an `MLP` from the `micrograd` module
with a simple SVM "max-margin" binary classification loss (`MicroCriterion`)
and SGD (`MicroOptimizer`) with learning rate decay (`MicroScheduler`).

As shown in the notebook,
using a 2-layer neural net with two 16-node hidden layers
we achieve the following decision boundary on the moon dataset:

![2d neuron](assets/moon_mlp.png)
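The two ingredients the demo combines can be sketched in plain Python. These helper names are hypothetical, not the actual `MicroCriterion`/`MicroScheduler` implementations:

```python
# Plain-Python sketch of the demo's loss and schedule (hypothetical
# helpers, not the Micro* classes themselves).

def hinge_loss(scores, labels):
    """SVM "max-margin" loss: penalize any margin below 1.

    labels are +1/-1, scores are raw model outputs."""
    losses = [max(0.0, 1 - y * s) for s, y in zip(scores, labels)]
    return sum(losses) / len(losses)

def decayed_lr(base_lr, epoch, num_epochs):
    """Linear learning-rate decay from base_lr toward 0."""
    return base_lr * (1 - epoch / num_epochs)

# A confidently correct prediction contributes zero loss; a correct but
# low-margin one still contributes:
print(hinge_loss([2.0, -0.5], [1, -1]))        # (0.0 + 0.5) / 2 = 0.25
print(decayed_lr(0.1, epoch=5, num_epochs=10))  # 0.05
```

Together these reproduce the demo's training signal: the hinge loss pushes the decision boundary to a wide margin, while the decaying learning rate stabilizes the later SGD steps.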
### Running codestyle
To run the codestyle check you will have to install
[catalyst-codestyle](https://github.com/catalyst-team/codestyle). Then simply:

```bash
catalyst-make-codestyle
```

### License
MIT