Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/kinoute/l-layers-xor-neural-network
An L-layers XOR neural network using only Python and NumPy that learns to predict the XOR logic gate.
arguments deep-learning neural-network numpy python python3 script xor xor-neural-network
- Host: GitHub
- URL: https://github.com/kinoute/l-layers-xor-neural-network
- Owner: kinoute
- Created: 2019-06-16T18:08:47.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2019-06-24T13:35:17.000Z (over 5 years ago)
- Last Synced: 2024-11-09T14:45:09.358Z (3 months ago)
- Topics: arguments, deep-learning, neural-network, numpy, python, python3, script, xor, xor-neural-network
- Language: Python
- Size: 20.5 KB
- Stars: 2
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# L-Layers XOR Neural Network
An L-layers XOR neural network using only Python and NumPy that learns to predict the XOR logic gate.
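As a rough illustration of what such a network computes, here is a minimal NumPy sketch of a small XOR net with one tanh hidden layer and a sigmoid output, trained by plain gradient descent with the README's default learning rate of 1.5. This is a hedged, self-contained example, not the repository's actual `NeuralNetwork` class, whose internals may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: inputs as columns, one example per column.
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]])          # shape (2, 4)
Y = np.array([[0, 1, 1, 0]])          # shape (1, 4)

# A small 2-4-1 architecture.
W1 = rng.standard_normal((4, 2))
b1 = np.zeros((4, 1))
W2 = rng.standard_normal((1, 4))
b2 = np.zeros((1, 1))

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.5          # README default learning rate
for _ in range(10000):  # README default iteration count
    # Forward pass: tanh hidden layer, sigmoid output.
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)
    Z2 = W2 @ A1 + b2
    A2 = sigmoid(Z2)

    # Backward pass (cross-entropy + sigmoid simplifies to A2 - Y).
    m = X.shape[1]
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)   # tanh derivative
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m

    # Gradient descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# After training, thresholded predictions typically match Y.
print((A2 > 0.5).astype(int))
```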
## Script
The script was initially written as a Jupyter notebook (`.ipynb`) and later refactored into a class and a script that accepts command-line arguments for the neural network.
There are two files:
* `nn.py`: the main script, which uses our NeuralNetwork class stored in the other file;
* `nn_xor_class.py`: our NeuralNetwork class.

## Arguments
You can change the behavior of the neural network by passing arguments when running the script. For example, you can change the activation function of the hidden layers, the learning rate, etc. The arguments accepted by `nn.py` are listed below; all of them are optional.
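The `--size` argument suggests the training set is generated randomly rather than fixed. A hedged sketch of how such a generator might look (the function name and seed handling are assumptions, not the repo's code):

```python
import numpy as np

def make_xor_dataset(size, seed=0):
    """Generate `size` random binary input pairs and their XOR labels."""
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, size=(2, size))                  # one example per column
    Y = np.logical_xor(X[0], X[1]).astype(int).reshape(1, size)
    return X, Y

# README default: 1000 training examples.
X, Y = make_xor_dataset(1000)
```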
### Usage
```
usage: python nn.py [-h] [-l LAYERS] [-u UNITS] [-s SIZE] [-i ITERATIONS]
                    [-r LEARNING_RATE] [-a {sigmoid,tanH,relu,leakyRelu}]

optional arguments:
  -h, --help            show this help message and exit
  -l LAYERS, --layers LAYERS
                        Number of layers in your NN (including the output
                        layer). Default: 4.
  -u UNITS, --units UNITS
                        Number of units in each hidden layer separated by a
                        comma (including the output layer). Default: 4,2,1.
  -s SIZE, --size SIZE  How many examples should be generated in our training
                        set. Default: 1000.
  -i ITERATIONS, --iterations ITERATIONS
                        Choose the number of iterations we want. Default:
                        10000.
  -r LEARNING_RATE, --learning-rate LEARNING_RATE
                        Pick a learning rate for your neural network.
                        Default: 1.5.
  -a {sigmoid,tanH,relu,leakyRelu}, --activation {sigmoid,tanH,relu,leakyRelu}
                        Activation function for your hidden layers. The
                        output layer will always be a sigmoid. Default:
                        "tanH".
```
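For reference, the flags above map naturally onto Python's `argparse`. The following is a hypothetical reconstruction of the parser based solely on the usage text; the actual `nn.py` may define it differently:

```python
import argparse

parser = argparse.ArgumentParser(description="Train an L-layers XOR neural network.")
parser.add_argument("-l", "--layers", type=int, default=4,
                    help="Number of layers in the NN (including the output layer).")
parser.add_argument("-u", "--units", type=str, default="4,2,1",
                    help="Units per hidden layer, comma-separated (including the output layer).")
parser.add_argument("-s", "--size", type=int, default=1000,
                    help="Number of training examples to generate.")
parser.add_argument("-i", "--iterations", type=int, default=10000,
                    help="Number of training iterations.")
parser.add_argument("-r", "--learning-rate", type=float, default=1.5,
                    help="Learning rate for gradient descent.")
parser.add_argument("-a", "--activation", default="tanH",
                    choices=["sigmoid", "tanH", "relu", "leakyRelu"],
                    help="Hidden-layer activation (the output layer is always a sigmoid).")

# Example invocation: override the layer count and units.
args = parser.parse_args(["-l", "3", "-u", "8,1"])
units = [int(u) for u in args.units.split(",")]
print(args.layers, units, args.learning_rate, args.activation)
```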