Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/philipperemy/tensorflow-maxout
TensorFlow implementation of Maxout Networks (https://arxiv.org/abs/1302.4389)
maxout maxout-networks tensorflow tensorflow-maxout
- Host: GitHub
- URL: https://github.com/philipperemy/tensorflow-maxout
- Owner: philipperemy
- License: mit
- Created: 2017-01-04T05:42:12.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2018-12-19T07:31:57.000Z (about 6 years ago)
- Last Synced: 2024-11-02T02:33:03.271Z (2 months ago)
- Topics: maxout, maxout-networks, tensorflow, tensorflow-maxout
- Language: Python
- Homepage:
- Size: 76.2 KB
- Stars: 56
- Watchers: 2
- Forks: 17
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Maxout Networks (Ian Goodfellow, Yoshua Bengio - 2013)
TensorFlow implementation of Maxout Networks (https://arxiv.org/abs/1302.4389).
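For reference, a maxout hidden unit as defined in the paper takes the maximum over $k$ affine feature maps (notation follows arXiv:1302.4389):

```math
h_i(x) = \max_{j \in [1, k]} z_{ij}, \qquad z_{ij} = x^\top W_{\cdot ij} + b_{ij}, \qquad W \in \mathbb{R}^{d \times m \times k},\ b \in \mathbb{R}^{m \times k}
```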
## How to run the MNIST experiment?

```bash
# Make sure TensorFlow is installed.
git clone git@github.com:philipperemy/tensorflow-maxout.git maxout && cd maxout
python mnist_maxout_example.py MAXOUT  # Pick one of: LINEAR, RELU, MAXOUT.
```

## How to integrate it in your code
It's two lines of code. Sorry I can't make it shorter.
```python
import tensorflow as tf

from maxout import max_out

# Hidden pre-activations; max_out keeps the max within each group of units.
y = tf.matmul(x, W1) + b1
t = max_out(y, num_units=50)
```
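The repository's `maxout.py` itself isn't reproduced in this README. As a hedged sketch, a TensorFlow 1.x maxout op typically follows a reshape-and-reduce pattern like the one below (group the channel axis into `num_units` groups and keep each group's maximum). This is an assumption about the implementation, not necessarily the repository's exact code:

```python
import tensorflow as tf


def max_out(inputs, num_units):
    """Split the last dimension into `num_units` groups and keep each group's max."""
    shape = inputs.get_shape().as_list()
    num_channels = shape[-1]
    if num_channels % num_units != 0:
        raise ValueError('num_units ({}) must divide the channel count ({})'
                         .format(num_units, num_channels))
    new_shape = shape[:-1] + [num_units, num_channels // num_units]
    # tf.reshape needs -1 for unknown dims (assumes only the batch dim is None).
    new_shape = [-1 if s is None else s for s in new_shape]
    return tf.reduce_max(tf.reshape(inputs, new_shape), axis=-1)
```

For example, with `num_units=2`, one input row `[1., -2., 3., 0.5]` is grouped as `[1., -2.]` and `[3., 0.5]`, producing `[1., 3.]`.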
## Some Results on the MNIST dataset

These results are not meant to reproduce the paper's numbers; the point is to show how to use the maxout non-linearity in a TensorFlow graph.
### Loss
As expected, Maxout outperforms Sigmoid and ReLU on this run. Having a hidden layer with a non-linearity also yields a smaller loss than the purely linear model.
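One intuition, noted in the paper, for why maxout should do at least as well as ReLU: a maxout unit with $k = 2$ pieces recovers a ReLU exactly when one of its linear pieces is pinned to zero, i.e. with $W_{\cdot i2} = 0$ and $b_{i2} = 0$:

```math
h_i(x) = \max(z_{i1}, z_{i2}) = \max\left(w^\top x + b,\ 0\right) = \mathrm{ReLU}\left(w^\top x + b\right)
```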
### Accuracy
| Model | Accuracy (100 epochs) |
| ------------- |:-------------:|
| MLP Hidden MaxOut | 0.9730 |
| MLP Hidden ReLU | 0.9704 |
| MLP Hidden Sigmoid | 0.9353 |
| MLP Linear | 0.9214 |
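As a closing sketch, this is roughly what a one-hidden-layer TF 1.x classifier behind such a comparison could look like. It is not the repository's `mnist_maxout_example.py`: the layer sizes (100 pre-activation units maxed down to 50, i.e. k = 2), optimizer, learning rate, and step count are illustrative assumptions:

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

from maxout import max_out  # the repository's maxout op

mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])
labels = tf.placeholder(tf.float32, [None, 10])

# Hidden layer: 100 pre-activations reduced to 50 maxout units (k = 2).
W1 = tf.Variable(tf.truncated_normal([784, 100], stddev=0.1))
b1 = tf.Variable(tf.zeros([100]))
h = max_out(tf.matmul(x, W1) + b1, num_units=50)

# Linear output layer on top of the maxout features.
W2 = tf.Variable(tf.truncated_normal([50, 10], stddev=0.1))
b2 = tf.Variable(tf.zeros([10]))
logits = tf.matmul(h, W2) + b2

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

correct = tf.equal(tf.argmax(logits, 1), tf.argmax(labels, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        batch_x, batch_y = mnist.train.next_batch(100)
        sess.run(train_step, {x: batch_x, labels: batch_y})
    print(sess.run(accuracy, {x: mnist.test.images, labels: mnist.test.labels}))
```

Swapping `max_out(...)` for `tf.nn.relu(...)` or `tf.sigmoid(...)` (with `W1` sized `[784, 50]`, since those activations don't reduce the unit count) would give the other rows of the table.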