Neural Network experiments in MATLAB
https://github.com/sunsided/neural-matlab
- Host: GitHub
- URL: https://github.com/sunsided/neural-matlab
- Owner: sunsided
- Created: 2016-08-24T08:12:45.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2016-08-26T22:10:43.000Z (over 8 years ago)
- Last Synced: 2024-11-09T08:17:28.704Z (about 2 months ago)
- Topics: artificial-intelligence, experiment, gradient-descent, matlab, neural-network
- Language: MATLAB
- Homepage:
- Size: 31.3 KB
- Stars: 2
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Neural Networks in MATLAB
Experiments with Neural Networks in MATLAB.
```matlab
% Feed-forward pass: propagate the input activation through every layer.
a = input;
for j = 1:numel(layers)
    weights = layers{j}.theta;  % weight matrix of the j-th layer (bias weights in the first column)
    sigma   = layers{j}.sigma;  % activation function of the j-th layer
    z = weights * [1; a];       % weighted input, with 1 prepended for the bias term
    a = sigma(z);               % the activation becomes the input to the next layer
end
output = a;
```
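The loop above expects `layers` to be a cell array of structs, each carrying a weight matrix `theta` (with the bias weights in its first column) and an activation function handle `sigma`. Here is a minimal sketch of such a setup; the layer sizes and the logistic activation are assumptions for illustration, not taken from the repository:

```matlab
% Hypothetical network setup matching the loop above: a 2-4-1 network.
logistic = @(z) 1 ./ (1 + exp(-z));  % assumed activation function

layers = {
    struct('theta', randn(4, 3), 'sigma', logistic)  % 2 inputs (+ bias) -> 4 hidden units
    struct('theta', randn(1, 5), 'sigma', logistic)  % 4 hidden (+ bias) -> 1 output
    };

input = [0.5; -1.2];  % an example two-component input vector
```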
Start with `ann.m`. This code implements a training example and uses the following functions:

* `feedforward.m`: Runs inputs through the neural network, producing the network's hypothesis (the predicted output).
* `feedforward_for_training.m`: Like `feedforward.m`, but produces additional output required for the backpropagation stage.
* `backpropagate.m`: Performs error backpropagation and produces the weight deltas (see the sketch after this list).
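The actual interface of `backpropagate.m` is not shown here, so the following is only a generic sketch of the technique for the layer structure above. The signature, the squared-error cost, the cached `zs`/`as` values from the forward pass, and the derivative field `dsigma` are all assumptions:

```matlab
function deltas = backpropagate(layers, zs, as, target)
% Hypothetical backpropagation sketch; assumes each layer also stores the
% derivative of its activation function in a field `dsigma`.
%   zs{j} - weighted input of layer j, as computed in the forward pass
%   as{j} - activation entering layer j (as{1} is the network input)
    L = numel(layers);
    deltas = cell(L, 1);

    % Error signal at the output layer (squared-error cost assumed).
    delta = (as{L+1} - target) .* layers{L}.dsigma(zs{L});

    for j = L:-1:1
        % Gradient with respect to the layer's weights (bias column included).
        deltas{j} = delta * [1; as{j}]';
        if j > 1
            % Propagate the error backwards, dropping the bias weights.
            w = layers{j}.theta(:, 2:end);
            delta = (w' * delta) .* layers{j-1}.dsigma(zs{j-1});
        end
    end
end
```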
Depending on the configuration, one of the following gradient descent algorithms can be used:

* `momentum_gradient_descent.m`: Regular gradient descent with a fixed learning rate, extended with a momentum term that blends in a fraction of the previous gradient.
* `accelerated_gradient_descent.m`: Uses adaptive (i.e. per-weight) delta values instead of a single learning rate and consults the gradient only to detect direction changes.

In all cases, *flat spot elimination* (FSE) can be used to aid in navigating areas of small error gradients by artificially providing an empirically set gradient direction to move along. Small values such as `0.1` might work; disable it by setting it to `0`.
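The repository's exact update rules are not reproduced here, so the following is only a minimal sketch of a momentum update combined with a sign-based FSE nudge, one plausible reading of the description above. The function name `momentum_step` and its parameters are hypothetical:

```matlab
function [theta, velocity] = momentum_step(theta, gradient, velocity, ...
                                           learning_rate, momentum, fse)
% Hypothetical momentum update with flat spot elimination (FSE); the
% repository's momentum_gradient_descent.m may differ in interface and detail.

    % FSE: nudge near-zero gradients along their sign so that flat regions
    % of the error surface still produce movement; fse = 0 disables this.
    gradient = gradient + fse * sign(gradient);

    % Blend a fraction of the previous update into the new one (momentum),
    % then take a step against the gradient.
    velocity = momentum * velocity - learning_rate * gradient;
    theta    = theta + velocity;
end
```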