Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dimits-ts/numpy-mlp-classifier
Implementing logistic regression and neural network models with batch stochastic gradient descent using numpy
artificial-intelligence classification-algorithm machine-learning neural-network
Last synced: about 9 hours ago
JSON representation
Implementing logistic regression and neural network models with batch stochastic gradient descent using numpy
- Host: GitHub
- URL: https://github.com/dimits-ts/numpy-mlp-classifier
- Owner: dimits-ts
- Created: 2023-01-13T10:14:27.000Z (almost 2 years ago)
- Default Branch: master
- Last Pushed: 2023-02-04T13:22:52.000Z (almost 2 years ago)
- Last Synced: 2024-04-22T02:45:09.991Z (7 months ago)
- Topics: artificial-intelligence, classification-algorithm, machine-learning, neural-network
- Language: Python
- Homepage:
- Size: 3.02 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# numpy-mlp-classifier
A project implementing a Logistic Regression classifier, a shallow MLP neural network, and a batch stochastic gradient descent variant, using numpy.

## Installation
You need to download the `numpy` and `matplotlib` libraries, as well as `tensorflow` if you wish to use the preprocessed datasets
included in this project. To do so, run the following commands in your terminal (pip must be installed):
- `pip install numpy`
- `pip install matplotlib`
- `pip install tensorflow`

Then download the project.
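Note that pip accepts several packages at once, so the three installs can also be run as a single command: `pip install numpy matplotlib tensorflow`.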
## Usage
An example of training and using a batch stochastic MLP model is provided below:
```py
from models.mlp import ShallowNetwork
from lib.load_mnist import load_data
from lib.common import get_accuracy, sigmoid, sigmoid_prime, binary_x_entropy, binary_x_entropy_prime

import numpy as np
import matplotlib.pyplot as plt

INPUT_SIZE = 784
OUTPUT_SIZE = 1
PATIENCE = 5
TOLERANCE = 1e-3

print("Loading data...")
data = load_data()

# Full documentation for the parameters, their values and their usage is provided
# in the docstring of the class's constructor. Briefly, we set the number of neurons
# for each layer of the network, the learning rate, the activation and cost functions,
# as well as the parameters used internally by the early stopping algorithm.
classifier = ShallowNetwork(input_size=INPUT_SIZE, hidden_size=25, output_size=OUTPUT_SIZE, eta=0.2,
                            patience=PATIENCE, tolerance=TOLERANCE,
                            activation_func=sigmoid, activation_func_prime=sigmoid_prime,
                            cost_func=binary_x_entropy, cost_func_prime=binary_x_entropy_prime)

print("Training classifier...")
# Includes validation data in order to avoid overfitting
epochs, val_error, train_cost_history = classifier.train(data.x_train, data.y_train, data.x_valid, data.y_valid)

# Training results
print("Mean validation loss: ", val_error)
train_labels, _ = classifier.predict(data.x_train)
print("Training accuracy: ", round(get_accuracy(train_labels, data.y_train), 3))

# Testing results
test_labels, test_error = classifier.predict(data.x_test, data.y_test)
print("Mean testing loss: ", test_error)
print("Testing accuracy: ", round(get_accuracy(test_labels, data.y_test), 3))
```

A preprocessed dataset is available by using the `load_mnist.py` file in the `lib` module.
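Since the example imports `matplotlib`, the loaded data and the returned `train_cost_history` can also be inspected visually. The snippet below is a minimal sketch, not part of the project's API: it assumes the example above has already run, that `data.x_train` holds flattened 28×28 MNIST images (consistent with `INPUT_SIZE = 784`), and that `train_cost_history` contains one cost value per epoch.

```py
# Minimal sketch: visualize one training sample and the training curve.
# Assumes `data` and `train_cost_history` exist from the example above.
import matplotlib.pyplot as plt

# Show the first training image alongside its label
plt.subplot(1, 2, 1)
plt.imshow(data.x_train[0].reshape(28, 28), cmap="gray")
plt.title(f"Label: {data.y_train[0]}")
plt.axis("off")

# Plot the per-epoch cost returned by classifier.train(...)
plt.subplot(1, 2, 2)
plt.plot(train_cost_history)
plt.xlabel("Epoch")
plt.ylabel("Training cost")
plt.title("Training curve")

plt.tight_layout()
plt.show()
```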
You can use each of the models by importing them from the `models` module. API documentation is provided in the form of docstrings in the source files.

The `run` module contains executable code that demonstrates various use cases, such as using the classifiers for hyper-parameter grid search; a sketch of the idea follows below.
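For reference, this is roughly what such a grid search could look like using the `ShallowNetwork` API from the usage example. The actual implementation lives in the `run` module; the hyper-parameter grids below are illustrative assumptions, not values taken from the project.

```py
# Minimal grid-search sketch built on the usage example above.
# The grids are illustrative assumptions, not the project's values.
best_acc, best_params = -1.0, None

for hidden_size in (15, 25, 50):    # candidate hidden-layer widths (assumed)
    for eta in (0.05, 0.2, 0.5):    # candidate learning rates (assumed)
        clf = ShallowNetwork(input_size=INPUT_SIZE, hidden_size=hidden_size,
                             output_size=OUTPUT_SIZE, eta=eta,
                             patience=PATIENCE, tolerance=TOLERANCE,
                             activation_func=sigmoid, activation_func_prime=sigmoid_prime,
                             cost_func=binary_x_entropy, cost_func_prime=binary_x_entropy_prime)
        clf.train(data.x_train, data.y_train, data.x_valid, data.y_valid)

        # Select on validation accuracy so the test set stays untouched
        valid_labels, _ = clf.predict(data.x_valid)
        acc = get_accuracy(valid_labels, data.y_valid)
        if acc > best_acc:
            best_acc, best_params = acc, (hidden_size, eta)

print(f"Best validation accuracy {best_acc:.3f} with (hidden_size, eta) = {best_params}")
```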
## Documentation
A high-level overview of the project is provided in the `documentation.pdf` file, including notes, observations, graphs, and performance
characteristics.