Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/alepheleven/neuralnet
Neural Network library in NumPy✨
- Host: GitHub
- URL: https://github.com/alepheleven/neuralnet
- Owner: AlephEleven
- License: MIT
- Created: 2023-01-10T02:21:01.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2023-02-06T19:52:31.000Z (about 2 years ago)
- Last Synced: 2024-11-15T04:28:23.554Z (3 months ago)
- Topics: neural-network, numpy
- Language: Python
- Homepage:
- Size: 274 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
README
# NeuralNet
Linear Neural Network library for Python created with pure NumPy
## Table of Contents
* [Features](#features)
* [Updates](#updates)
* [Examples](#examples)
* [Component Template](#component-template)
* [Setup](#setup)
* [Requirements](#requirements)

## Features
- Lightweight library for general-purpose machine learning.
- Easy-to-use sequential modelling, with 4+ components to choose from.
- Components for holding backpropagation derivatives and/or weights+bias.
- Simple implementation for easy development.
- Includes one-hot encoding, loss functions, and a training loop.

## Updates
- Added 3 more descent algorithms: SGD+Momentum, RMSProp, AdamOptimizer
- ```display``` parameter to toggle display output during the training loop

> - DataLoader: changed how training data is loaded into the model
> - Customizable batch sizes and a shuffling setting for data loading, rather than a single label/feature pair
> - Added Stochastic Gradient Descent; also abstracted the learning rate term for customizable descent algorithms
> - ```timed``` parameter for the training loop to report the current time at each epoch
> - Up-to-date examples for Iris & MNIST

## Examples
- [Iris Dataset](../main/examples/Iris.ipynb)
- [MNIST Dataset](../main/examples/MNIST.ipynb)

## Component Template
Standard templates for creating new components.
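Each template below is a ```@dataclass``` whose ```__call__``` runs the forward computation and caches the derivative needed for backpropagation. As a concrete instance of the activation pattern, here is a hypothetical ReLU component (an illustrative sketch, not one of the library's built-in components):

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class ReLU:
    # cached derivative dh/dz from the most recent forward pass
    active_dx: np.ndarray = field(default_factory=lambda: np.zeros(1))
    is_mat: bool = False

    def __call__(self, X, update=True) -> np.ndarray:
        '''
        Applies ReLU elementwise, h = max(0, z)
        Derivatives are:
        dh/dz = 1 if z > 0 else 0
        '''
        activationX = np.maximum(0, X)
        if update:
            self.active_dx = (X > 0).astype(float)
        return activationX
```

Calling ```ReLU()(np.array([-1.0, 2.0]))``` returns ```[0.0, 2.0]``` and caches ```active_dx = [0.0, 1.0]``` for the backward pass.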
### Activation function
```
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Activation:
    # default_factory avoids the mutable-default error dataclasses
    # raise for a bare np.zeros(1) default
    active_dx: np.ndarray = field(default_factory=lambda: np.zeros(1))
    is_mat: bool = False

    def __call__(self, X, update=True) -> np.ndarray:
        '''
        Applies Activation on numpy array, activation(z) = h
        Derivatives are:
        dh/dz = ACTIVATION DERIVATIVE
        '''
        # ACTIVATION CODE
        activationX = ...
        if update:
            # ACTIVATION DERIVATIVE CODE
            self.active_dx = ...
        return activationX
```

### Loss function
```
from dataclasses import dataclass

@dataclass
class CoolLoss:
    def __call__(self, y, ypred):
        '''
        Returns Cool Loss and its derivative on two numpy vectors, MATH FOR LOSS FUNCTION
        Derivatives are:
        dL/dypred = LOSS DERIVATIVE
        '''
        # LOSS CODE
        cool = ...
        # LOSS DERIVATIVE CODE
        cool_dx = ...
        return cool, cool_dx
```

## Setup
Download ```LinearNet.py``` and place it in your current directory. For Colab, drag it into ```Files```.
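The Features list mentions one-hot encoding. As a rough illustration of what that step involves, here is a minimal NumPy sketch (a hypothetical helper, not necessarily how ```LinearNet.py``` implements it):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Map integer class labels to one-hot row vectors."""
    encoded = np.zeros((len(labels), num_classes))
    # integer-array indexing sets one entry per row
    encoded[np.arange(len(labels)), labels] = 1.0
    return encoded
```

For example, ```one_hot([0, 2], 3)``` returns ```[[1, 0, 0], [0, 0, 1]]```.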
## Requirements
### Packages:
- ```numpy```
- ```dataclasses``` (standard library on Python 3.7+)
- ```datetime``` (standard library)
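The Updates section names SGD+Momentum among the added descent algorithms. As a point of reference, the textbook update rule can be sketched in plain NumPy like this (an illustrative function, not the library's actual implementation, whose internals may differ):

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    """One SGD+Momentum update: v <- beta*v + grad, then w <- w - lr*v."""
    velocity = beta * velocity + grad
    w = w - lr * velocity
    return w, velocity
```

Starting from zero velocity, a single step with a unit gradient moves each weight by ```-lr```; the velocity term then accelerates movement along directions where gradients keep pointing the same way.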