https://github.com/vtramo/neural-networks-experiments-from-scratch
This repository provides a learning and experimentation environment for understanding the fundamental concepts and details of neural networks by building them from scratch.
- Host: GitHub
- URL: https://github.com/vtramo/neural-networks-experiments-from-scratch
- Owner: vtramo
- Created: 2023-06-08T19:13:27.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-07-22T08:39:26.000Z (almost 2 years ago)
- Last Synced: 2025-01-18T03:26:10.239Z (4 months ago)
- Topics: activation-functions, backpropagation, batch-learning, classification-problem, early-stopping, gradient-based-algorithm, improved-resilient-backpropagation, irprop, kfold-cross-validation, minibatch, momentum, neural-network, numpy, online-learning, python, python3, resilient-backpropagation, rprop, softmax, stochastic-gradient-descent
- Language: Python
- Homepage:
- Size: 12.2 MB
- Stars: 1
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Neural Networks Experiments From Scratch
This repository contains code for creating and training neural networks using only basic libraries such as [NumPy](https://numpy.org/) for array manipulation and numerical calculations.
## Usage example
```python
from nnkit.core.neuronet import DenseLayer, DenseNetwork
from nnkit.core.activations import Softmax, ReLU, Sigmoid, Tanh
from nnkit.core.losses import CrossEntropySoftmax
from nnkit.datasets import mnist
from nnkit.datasets.utils import DataLabelSet, one_hot
from nnkit.training.neurotrain import NetworkTrainer
from nnkit.training.update_rules import SGD, RPropPlus, IRPropPlus, RPropMinus, IRPropMinus
from nnkit.training.stopping import GLStoppingCriterion
from nnkit.training.metrics import Accuracy, MetricsEvaluator


if __name__ == '__main__':
    # Build Network
    net = DenseNetwork(
        DenseLayer(num_inputs=784, num_neurons=256, activation_function=Tanh()),
        DenseLayer(num_neurons=10, activation_function=Softmax())
    )

    # Load data / Data pre-processing
    (train_images, train_labels), (test_images, test_labels) = mnist.load_data()
    train_images = train_images.reshape((60000, 28 * 28))
    train_images = train_images.astype('float32') / 255
    train_labels = one_hot(train_labels)
    test_images = test_images.reshape((10000, 28 * 28))
    test_images = test_images.astype('float32') / 255
    test_labels = one_hot(test_labels)

    # Training data / Validation data
    training_set = DataLabelSet(train_images, train_labels, batch_size=len(train_images), name='training')
    training_set, validation_set = training_set.split(
        split_factor=0.2,
        split_set_batch_size=len(train_images),
        split_set_name='validation'
    )

    # Train the network
    trainer = NetworkTrainer(
        net=net,
        update_rule=IRPropPlus(),
        loss_function=CrossEntropySoftmax(),
        metrics=[Accuracy()]
    )
    history = trainer.train_network(training_set, validation_set, epochs=30)

    # Test the network
    test_set = DataLabelSet(test_images, test_labels, batch_size=len(test_images), name='test')
    evaluator = MetricsEvaluator(net, metrics=[Accuracy()], loss_function=CrossEntropySoftmax())
    metrics = evaluator.compute_metrics(test_set)
    print(metrics)
```
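To illustrate what "from scratch with only NumPy" means here, below is a minimal, self-contained sketch of the forward pass that a network like the one in the usage example (784 → 256 Tanh → 10 Softmax) performs. This is not the repository's actual internal code; the function and variable names are hypothetical, and weights are random, so the outputs are untrained probabilities.

```python
# Illustrative sketch only (not nnkit's internals): a dense-layer forward pass
# with Tanh and Softmax activations, using nothing but NumPy.
import numpy as np

rng = np.random.default_rng(0)

def dense_forward(x, weights, bias, activation):
    # Affine transform followed by a non-linearity: a = f(xW + b)
    return activation(x @ weights + bias)

def softmax(z):
    # Subtract the row-wise max before exponentiating for numerical stability
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Shapes mirror the usage example: 784 inputs -> 256 hidden units -> 10 classes
W1, b1 = rng.standard_normal((784, 256)) * 0.01, np.zeros(256)
W2, b2 = rng.standard_normal((256, 10)) * 0.01, np.zeros(10)

x = rng.standard_normal((5, 784))            # a batch of 5 flattened 28x28 images
hidden = dense_forward(x, W1, b1, np.tanh)
probs = dense_forward(hidden, W2, b2, softmax)
print(probs.shape)                           # (5, 10); each row sums to 1
```

Training then amounts to backpropagating the loss gradient through these same operations and applying an update rule such as SGD or iRprop+, which is what the `nnkit.training` modules above encapsulate.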