Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Neural network framework. The back-propagation algorithm is implemented with numpy, and the package supports basic activation functions, loss functions and neural architectures.
https://github.com/mathiasotnes/back-propagation
back-propagation deep-learning neural-network numpy
Last synced: 19 days ago
JSON representation
Neural network framework. The back-propagation algorithm is implemented with numpy, and the package supports basic activation functions, loss functions and neural architectures.
- Host: GitHub
- URL: https://github.com/mathiasotnes/back-propagation
- Owner: Mathiasotnes
- Created: 2024-01-24T21:00:34.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2024-11-03T23:17:56.000Z (2 months ago)
- Last Synced: 2024-12-18T07:15:19.760Z (19 days ago)
- Topics: back-propagation, deep-learning, neural-network, numpy
- Language: Python
- Homepage:
- Size: 54.7 KB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
Awesome Lists containing this project
README
Deep-Learning
=============

Implementation of simple back-propagation using numpy.
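For context, back-propagation is the chain rule applied layer by layer: the gradient of the loss with respect to each layer's output is pushed backwards to produce gradients for the weights. Below is a minimal, standalone numpy sketch of one training step for a two-layer ReLU regression network with MSE loss; it illustrates the technique this package implements and is not the package's internal code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 samples, 4 features, 1 regression target.
X = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))

# Parameters of a 4 -> 5 -> 1 network.
W1 = rng.normal(size=(4, 5)) * 0.1
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1)) * 0.1
b2 = np.zeros(1)

# Forward pass, caching intermediates for the backward pass.
z1 = X @ W1 + b1          # hidden pre-activation
a1 = np.maximum(z1, 0.0)  # ReLU
y_hat = a1 @ W2 + b2      # linear output layer
loss = np.mean((y_hat - y) ** 2)

# Backward pass: chain rule from the loss down to each parameter.
n = X.shape[0]
d_yhat = 2.0 * (y_hat - y) / n  # dL/dy_hat for MSE
dW2 = a1.T @ d_yhat             # dL/dW2
db2 = d_yhat.sum(axis=0)        # dL/db2
d_a1 = d_yhat @ W2.T            # gradient flowing into the hidden layer
d_z1 = d_a1 * (z1 > 0)          # ReLU derivative gates the gradient
dW1 = X.T @ d_z1
db1 = d_z1.sum(axis=0)

# One step of gradient descent.
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Repeating this update over many epochs is, conceptually, what a `fit` loop does.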
Installation
------------
You can install `Deep-Learning` using pip:

```bash
pip install git+https://github.com/Mathiasotnes/Deep-Learning.git
```

Usage
-----

Quickly set up a neural network with multiple layers, including a softmax output layer, using the `Deep-Learning` library.
### Example: Multi-Layer Network with Softmax Output
```python
import numpy as np
from brain_of_mathias.models import Layer, Network
from brain_of_mathias.activations import ReLU, Softmax
from brain_of_mathias.losses import MSE

# Sample data - replace with actual data
X_train = np.array([...])  # Input features
y_train = np.array([...])  # Target labels

# Define a network with the desired layers
layer1 = Layer(input_size=..., number_of_neurons=..., activation=ReLU())
layer2 = Layer(input_size=..., number_of_neurons=..., activation=ReLU())
output_layer = Layer(input_size=..., number_of_neurons=..., activation=Softmax())

# Initialize the network with the layers
network = Network([layer1, layer2, output_layer], loss_function=MSE())

# Train the network
network.fit(X_train, y_train, learning_rate=0.01, epochs=500)

# Predict (X_test is new input data, shaped like X_train)
network.predict(X_test)
```

Features
--------
- Custom activation and loss functions (see the sketch after this list).
- Extensible model architecture.
- Utilities for common operations.
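The README does not document the extension interface, so the sketch below only illustrates what a custom activation could look like if the built-in activations follow a common forward/backward convention; the `LeakyReLU` name and the `forward`/`backward` signatures are assumptions for illustration, not the package's confirmed API.

```python
import numpy as np

class LeakyReLU:
    """Hypothetical custom activation; assumes the library expects a
    forward(x) / backward(x) pair like the built-in activations."""

    def __init__(self, alpha=0.01):
        self.alpha = alpha

    def forward(self, x):
        # Pass positive values through, scale negatives by alpha.
        return np.where(x > 0, x, self.alpha * x)

    def backward(self, x):
        # Elementwise derivative of the forward pass.
        return np.where(x > 0, 1.0, self.alpha)
```

If the actual interface differs (for instance a `__call__` plus `gradient` pair), the same two pieces of logic transfer directly.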
Repo Activity
-------------
![Alt](https://repobeats.axiom.co/api/embed/20c237ee2eb3e404e339facea0ea8f99070ab15e.svg "Repobeats analytics image")