Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/haidousm/fine
an artificial neural network framework built from scratch using just Python and Numpy
- Host: GitHub
- URL: https://github.com/haidousm/fine
- Owner: haidousm
- Created: 2021-01-07T14:40:26.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2022-01-06T10:51:27.000Z (almost 3 years ago)
- Last Synced: 2024-10-01T16:08:18.383Z (3 months ago)
- Topics: convolutional-neural-networks, keras, neural-network, numpy, python, scratch-implementation
- Language: Python
- Homepage: https://haidousm.com/fine/
- Size: 57.5 MB
- Stars: 4
- Watchers: 1
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
README
# Fine
A Keras-like neural network framework built purely with Python and NumPy that's just that, fine.

## Table of Contents
[1- How to use](#how-to-use)
[2- Demo](#demo)
[3- Technical Specifications](#technical)

## How to use
```
git clone [email protected]:haidousm/fine.git
cd fine
python3 -m pip install -r requirements.txt
```

## Demo
### [MNIST Demo Link](https://haidousm.com/fine-mnist-demo/)
The demo was built using JavaScript for the frontend and a Flask server to serve predictions from the model.

Demo model creation & training:
```
from datasets import load_mnist
from models import Sequential
from layers import Conv2D
from layers import MaxPool2D
from layers import Flatten
from layers import Dense
from activations import ReLU
from activations import Softmax
from loss import CategoricalCrossEntropy
from models.model_utils import Categorical
from optimizers import Adam

X_train, y_train, X_test, y_test = load_mnist()

model = Sequential(
    layers=[
        Conv2D(16, (1, 3, 3)),
        ReLU(),
        Conv2D(16, (16, 3, 3)),
        ReLU(),
        MaxPool2D((2, 2)),

        Conv2D(32, (16, 3, 3)),
        ReLU(),
        Conv2D(32, (32, 3, 3)),
        ReLU(),
        MaxPool2D((2, 2)),

        Flatten(),
        Dense(1568, 64),
        ReLU(),
        Dense(64, 64),
        ReLU(),
        Dense(64, 10),
        Softmax()
    ],
    loss=CategoricalCrossEntropy(),
    optimizer=Adam(decay=1e-3),
    accuracy=Categorical()
)

model.train(X_train, y_train, epochs=5, batch_size=120, print_every=100)
model.evaluate(X_test, y_test, batch_size=120)
```
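The `Dense(1568, 64)` input size follows from the tensor shapes: MNIST images are 28×28, and each `MaxPool2D((2, 2))` halves both spatial dimensions, so (assuming the convolutions preserve spatial size) the second pool leaves 32 channels of 7×7. A quick arithmetic check (variable names here are illustrative, not from Fine):

```python
# MNIST inputs are 28x28; each MaxPool2D((2, 2)) halves both dimensions.
# Assumes the Conv2D layers preserve spatial size (e.g. same padding).
channels = 32        # output channels of the last Conv2D in the demo model
side = 28 // 2 // 2  # 28 -> 14 -> 7 after the two pooling layers
flat_features = channels * side * side
print(flat_features)  # 1568: the input size of the first Dense layer
```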
## Technical Specifications
### Layers
- [X] Dense Layer
- [X] Dropout Layer
- [X] Flatten Layer
- [X] 2D Convolutional Layer
- [X] Max Pool Layer

### Activation Functions
- [X] Rectified Linear (ReLU)
- [X] Sigmoid
- [X] Softmax
- [X] Linear

### Loss Functions
- [X] Categorical Cross Entropy
- [X] Binary Cross Entropy
- [X] Mean Squared Error

### Optimizers
- [X] Stochastic Gradient Descent (SGD) with rate decay and momentum
- [X] Adaptive Moment Estimation (ADAM)
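For reference, a few items from the checklists above have simple NumPy forms: Softmax, categorical cross entropy, and SGD with rate decay and momentum. This is a generic textbook sketch, not Fine's internal implementation; the function names and default values are illustrative:

```python
import numpy as np

def softmax(z):
    """Row-wise softmax (one of the activations listed above)."""
    shifted = z - z.max(axis=1, keepdims=True)  # subtract max for stability
    e = np.exp(shifted)
    return e / e.sum(axis=1, keepdims=True)

def categorical_cross_entropy(probs, labels):
    """Mean negative log-probability of the true class indices."""
    clipped = np.clip(probs, 1e-7, 1 - 1e-7)  # avoid log(0)
    return -np.mean(np.log(clipped[np.arange(len(labels)), labels]))

def sgd_momentum_step(w, grad, velocity, lr=0.1, decay=1e-3, momentum=0.9, step=0):
    """One SGD update with inverse-time rate decay and momentum."""
    current_lr = lr / (1.0 + decay * step)  # learning-rate decay schedule
    velocity = momentum * velocity - current_lr * grad
    return w + velocity, velocity

# Softmax + cross entropy on a single 3-class example (true class 0).
probs = softmax(np.array([[2.0, 0.5, -1.0]]))
loss = categorical_cross_entropy(probs, np.array([0]))

# Minimize f(w) = w^2 (gradient 2w) with the SGD step above.
w, v = 5.0, 0.0
for step in range(200):
    w, v = sgd_momentum_step(w, 2.0 * w, v, step=step)
# w is now close to the minimum at 0
```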