https://github.com/maximilianfeldthusen/simplenn

## Documentation

### simpleNN

The code implements a simple feedforward neural network in C++. The main components and their functionality are explained below:

### Class Definition

The `NeuralNetwork` class encapsulates the structure and behavior of the neural network. It includes the following (a minimal class skeleton is sketched after this list):

1. **Constructor**: `NeuralNetwork(int inputSize, int hiddenSize, int outputSize)`: Initializes the neural network with a specified number of input neurons, hidden neurons, and output neurons. It also calls the `initializeWeights()` method to set up the weights.

2. **Weights**: The network has two layers of weights:
- `weightsInputHidden`: Weights connecting the input layer to the hidden layer.
- `weightsHiddenOutput`: Weights connecting the hidden layer to the output layer.

3. **Activation Function**:
- `sigmoid(double x)`: Implements the sigmoid activation function, which maps input values to a range between 0 and 1.
- `sigmoidDerivative(double x)`: Computes the derivative of the sigmoid function, useful for backpropagation.
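
Based on the description above, a class declaration along these lines could be used; the member names and exact layout here are assumptions, and the repository's code may differ. In this sketch the derivative is written in terms of the already-activated value, a common convention in simple backpropagation code:

```cpp
#include <cmath>
#include <vector>

class NeuralNetwork {
public:
    NeuralNetwork(int inputSize, int hiddenSize, int outputSize)
        : inputSize(inputSize), hiddenSize(hiddenSize), outputSize(outputSize) {
        initializeWeights();  // fill both weight matrices with random values
    }

    std::vector<double> feedForward(const std::vector<double>& input);
    void train(const std::vector<std::vector<double>>& trainingData,
               const std::vector<double>& labels,
               int epochs, double learningRate);

private:
    int inputSize, hiddenSize, outputSize;

    // weightsInputHidden[i][j]: weight from input neuron i to hidden neuron j
    std::vector<std::vector<double>> weightsInputHidden;
    // weightsHiddenOutput[j][k]: weight from hidden neuron j to output neuron k
    std::vector<std::vector<double>> weightsHiddenOutput;

    void initializeWeights();

    // Sigmoid squashes any real value into the range (0, 1).
    static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }
    // Derivative of the sigmoid, expressed in terms of the sigmoid output s.
    static double sigmoidDerivative(double s) { return s * (1.0 - s); }
};
```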

### Methods

1. **`initializeWeights()`**:
- Initializes the weights for both layers randomly in the range [-1, 1]. This randomness helps break symmetry during training.
- Uses `std::rand()` to generate random weights.

2. **`feedForward(const std::vector<double>& input)`**:
- Implements the feedforward process, where input data is passed through the network to produce an output.
- It computes the activations for the hidden layer and then the output layer using the weights and the sigmoid function.

3. **`train(const std::vector<std::vector<double>>& trainingData, const std::vector<double>& labels, int epochs, double learningRate)`**:
- Trains the neural network on the provided training data and labels for a specified number of epochs and learning rate (a sketch of all three methods follows this list).
- The training process consists of:
- **Forward Pass**: Calculate the output from the network for the given input.
- **Backpropagation**: Compute the error at the output and hidden layers, then adjust the weights based on this error.
- **Weight Update**: Adjust the weights using the calculated errors and the learning rate.
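
Continuing the skeleton sketched earlier, the three methods could be implemented roughly as follows for a network with a single output neuron. This is an illustration of the behavior described above, not necessarily the exact code in the repository:

```cpp
#include <cstdlib>  // std::rand, RAND_MAX

// Random weights in [-1, 1] break the symmetry between neurons,
// so different hidden units can learn different features.
void NeuralNetwork::initializeWeights() {
    auto randomWeight = [] { return 2.0 * std::rand() / RAND_MAX - 1.0; };

    weightsInputHidden.assign(inputSize, std::vector<double>(hiddenSize));
    weightsHiddenOutput.assign(hiddenSize, std::vector<double>(outputSize));

    for (auto& row : weightsInputHidden)
        for (double& w : row) w = randomWeight();
    for (auto& row : weightsHiddenOutput)
        for (double& w : row) w = randomWeight();
}

// Forward pass: input -> hidden activations -> output activations.
std::vector<double> NeuralNetwork::feedForward(const std::vector<double>& input) {
    std::vector<double> hidden(hiddenSize, 0.0);
    for (int j = 0; j < hiddenSize; ++j) {
        for (int i = 0; i < inputSize; ++i)
            hidden[j] += input[i] * weightsInputHidden[i][j];
        hidden[j] = sigmoid(hidden[j]);
    }

    std::vector<double> output(outputSize, 0.0);
    for (int k = 0; k < outputSize; ++k) {
        for (int j = 0; j < hiddenSize; ++j)
            output[k] += hidden[j] * weightsHiddenOutput[j][k];
        output[k] = sigmoid(output[k]);
    }
    return output;
}

// Online training: for every sample, run a forward pass, backpropagate
// the error, and nudge the weights by the learning rate.
void NeuralNetwork::train(const std::vector<std::vector<double>>& trainingData,
                          const std::vector<double>& labels,
                          int epochs, double learningRate) {
    for (int epoch = 0; epoch < epochs; ++epoch) {
        for (std::size_t n = 0; n < trainingData.size(); ++n) {
            const std::vector<double>& input = trainingData[n];

            // Forward pass, repeated here so the hidden activations
            // are available for backpropagation.
            std::vector<double> hidden(hiddenSize, 0.0);
            for (int j = 0; j < hiddenSize; ++j) {
                for (int i = 0; i < inputSize; ++i)
                    hidden[j] += input[i] * weightsInputHidden[i][j];
                hidden[j] = sigmoid(hidden[j]);
            }
            double output = 0.0;  // single output neuron assumed
            for (int j = 0; j < hiddenSize; ++j)
                output += hidden[j] * weightsHiddenOutput[j][0];
            output = sigmoid(output);

            // Error term (delta) at the output layer.
            double outputDelta = (labels[n] - output) * sigmoidDerivative(output);

            // Error terms at the hidden layer.
            std::vector<double> hiddenDelta(hiddenSize);
            for (int j = 0; j < hiddenSize; ++j)
                hiddenDelta[j] = outputDelta * weightsHiddenOutput[j][0]
                                 * sigmoidDerivative(hidden[j]);

            // Weight updates, scaled by the learning rate.
            for (int j = 0; j < hiddenSize; ++j)
                weightsHiddenOutput[j][0] += learningRate * outputDelta * hidden[j];
            for (int i = 0; i < inputSize; ++i)
                for (int j = 0; j < hiddenSize; ++j)
                    weightsInputHidden[i][j] += learningRate * hiddenDelta[j] * input[i];
        }
    }
}
```

Updating the weights after every sample (online training) keeps the sketch short; accumulating the updates over a batch would be a straightforward variation.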

### Main Function

1. **Instantiating the Neural Network**:
- Creates an instance of `NeuralNetwork` with 3 input neurons, 5 hidden neurons, and 1 output neuron.

2. **Training Data**:
- Provides a small training dataset consisting of four input vectors and their corresponding labels. Such a dataset can represent a simple logical function (e.g., XOR).

3. **Training the Network**:
- Calls the `train` method to train the neural network using the defined training data, labels, 10,000 epochs, and a learning rate of 0.1.

4. **Testing the Network**:
- After training, the network is tested with a new input vector `{1.0, 0.0, 0.0}`.
- The output from the network is printed to the console (a sketch of this `main` function follows the list).
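
Putting the pieces together, a `main` function along these lines would reproduce the workflow described above. The training samples and labels shown here are illustrative placeholders; only the network sizes, epoch count, learning rate, and test input come from the description:

```cpp
#include <iostream>
#include <vector>

int main() {
    // 3 input neurons, 5 hidden neurons, 1 output neuron.
    NeuralNetwork nn(3, 5, 1);

    // Four training samples with three features each and one label per sample
    // (illustrative values; the repository's dataset may differ).
    std::vector<std::vector<double>> trainingData = {
        {0.0, 0.0, 1.0},
        {0.0, 1.0, 1.0},
        {1.0, 0.0, 1.0},
        {1.0, 1.0, 1.0}
    };
    std::vector<double> labels = {0.0, 1.0, 1.0, 0.0};

    // Train for 10,000 epochs with a learning rate of 0.1.
    nn.train(trainingData, labels, 10000, 0.1);

    // Query the trained network with a new input vector.
    std::vector<double> output = nn.feedForward({1.0, 0.0, 0.0});
    std::cout << "Output: " << output[0] << std::endl;

    return 0;
}
```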

### Summary

This code defines a simple feedforward neural network that can be trained on basic datasets. It demonstrates key concepts such as weight initialization, activation functions, forward propagation, and backpropagation for training the neural network. The output of the network after training can be interpreted as predictions based on the input provided.