Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/Bartzi/visual-backprop-mxnet
Implementation of Visual Backprop for MXNet
convolutional-neural-networks deep-learning mxnet visualization
Last synced: 3 months ago
- Host: GitHub
- URL: https://github.com/Bartzi/visual-backprop-mxnet
- Owner: Bartzi
- License: gpl-3.0
- Created: 2017-06-26T15:08:04.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-07-24T12:41:53.000Z (over 6 years ago)
- Last Synced: 2024-06-12T05:08:28.608Z (5 months ago)
- Topics: convolutional-neural-networks, deep-learning, mxnet, visualization
- Language: Python
- Size: 25.4 KB
- Stars: 8
- Watchers: 2
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-MXNet - VisualBackProp
README
# Implementation of VisualBackProp for MXNet
This repo contains an implementation of [VisualBackProp](https://arxiv.org/abs/1611.05418) for [MXNet](http://mxnet.io/).
# Requirements
0. Make sure to use **Python 3** (the code creating the symbol for VisualBackProp should also work with Python 2)
1. Install MXNet as shown [here](http://mxnet.io/get_started/install.html). Make sure that you have at least `MXNet v0.10.0`!
2. Install further requirements by issuing `pip install -r requirements.txt`
3. If you want to use the `show_progress.py` script, you also need to install tkinter (in case the script does not work right from the start)
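A quick way to verify the MXNet requirement from step 1 (a minimal check, not part of this repo):

```python
import mxnet as mx
from distutils.version import LooseVersion

# VisualBackProp needs at least MXNet v0.10.0 (see step 1 above).
assert LooseVersion(mx.__version__) >= LooseVersion('0.10.0'), \
    'found MXNet {}, but at least 0.10.0 is required'.format(mx.__version__)
```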
# Usage

The file `vis_backprop_test.py` contains a sample that trains a model on MNIST and performs VisualBackProp on every forward pass.
In order to use the script, you have to do the following:
1. start the script `show_progress.py` by issuing `python show_progress.py`
2. start the training of the network by issuing `python vis_backprop_test.py`
- if you want to change some options, you can get a list of all supported options by adding `-h` to the `python` command.

# Adding VisualBackProp to your own code
Adding VisualBackProp to your own code is quite easy. You have to perform the following steps (a sketch follows the list):
1. adapt your `symbol` definition by adding a call to `insights.build_visual_backprop_symbol` after the activation of the convolutional layer you want to visualize. ([see this line](https://github.com/Bartzi/visual-backprop-mxnet/blob/master/vis_backprop_test.py#L29))
2. keep the returned visualization `symbol` for the visualization pass.
3. If you want to use VisualBackProp during training, create an instance of the `VisualBackpropPlotter` class, providing the IP address and port of the visualization endpoint; the visualization then runs after every training step. ([see this line](https://github.com/Bartzi/visual-backprop-mxnet/blob/master/vis_backprop_test.py#L71))
4. get one sample image for which you want to visualize the convolution ([see this line](https://github.com/Bartzi/visual-backprop-mxnet/blob/master/vis_backprop_test.py#L67)).
5. add a new `batch_end_callback` to your model, by calling the method `get_callback` of the created `VisualBackpropPlotter` object ([see this line](https://github.com/Bartzi/visual-backprop-mxnet/blob/master/vis_backprop_test.py#L72)).
6. start the `show_progress.py` tool, which you can find in the `utils` directory, by issuing the following command: `python show_progress.py`. (This tool is the visualization endpoint.)
7. Sit back and enjoy!
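A condensed sketch of steps 1 to 5. The import paths and the exact signatures of `build_visual_backprop_symbol`, `VisualBackpropPlotter`, and `get_callback` are assumptions here; `vis_backprop_test.py` has the real ones:

```python
import mxnet as mx

import insights  # provides build_visual_backprop_symbol
# The import path and the constructor/callback signatures below are guesses;
# see vis_backprop_test.py for the actual ones.
from insights import VisualBackpropPlotter


def build_network():
    data = mx.sym.Variable('data')
    conv = mx.sym.Convolution(data, kernel=(5, 5), num_filter=32)
    relu = mx.sym.Activation(conv, act_type='relu')
    # Step 1: attach the VisualBackProp branch right after the activation
    # of the convolutional layer you want to visualize.
    visualization = insights.build_visual_backprop_symbol(relu)
    pool = mx.sym.Pooling(relu, kernel=(2, 2), stride=(2, 2), pool_type='max')
    fc = mx.sym.FullyConnected(mx.sym.Flatten(pool), num_hidden=10)
    # Step 2: return the visualization symbol alongside the network.
    return mx.sym.SoftmaxOutput(fc, name='softmax'), visualization


net, visualization = build_network()

# Step 3: one plotter per training run, pointing at the show_progress.py endpoint.
plotter = VisualBackpropPlotter(ip='127.0.0.1', port=1337)

# Steps 4 and 5: pick one sample image (shape: batch x channels x height x width)
# and register the callback; the argument names here are illustrative.
# model.fit(train_iter,
#           batch_end_callback=plotter.get_callback(visualization, sample_image))
```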
# How does it work?

`insights.build_visual_backprop_symbol` adds a new subgraph to the computational graph that performs the necessary operations for VisualBackProp (see the paper for more details).
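Conceptually, that subgraph implements the VisualBackProp scheme from the paper: average each layer's feature maps across channels, then propagate the deepest average back to the input resolution, upsampling and multiplying layer by layer. The sketch below is a simplification, not the repo's actual helper: it assumes one ReLU output per scale, that every stage halves the spatial resolution, and it uses nearest-neighbour upsampling where the paper uses deconvolutions matched to each layer:

```python
import mxnet as mx

def visual_backprop_sketch(relu_outputs):
    """relu_outputs: activation symbols from shallow to deep, each of shape
    (batch, channels, height, width)."""
    # Average each layer's feature maps across the channel axis.
    averaged = [mx.sym.mean(out, axis=1, keepdims=True) for out in relu_outputs]

    # Walk from the deepest layer back to the shallowest: upsample the running
    # mask to the previous layer's resolution and multiply pointwise.
    mask = averaged[-1]
    for avg in reversed(averaged[:-1]):
        mask = mx.sym.UpSampling(mask, scale=2, sample_type='nearest', num_args=1)
        mask = mask * avg

    # Upsample once more to input resolution and block gradients, so the
    # visualization branch never influences training.
    mask = mx.sym.UpSampling(mask, scale=2, sample_type='nearest', num_args=1)
    return mx.sym.BlockGrad(mask, name='visualization')
```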
During training, MXNet calls the callback implemented in `VisualBackpropPlotter` at the end of each batch. This callback copies the current parameters and performs a forward pass with the given input data. After this forward pass, the output generated by the VisualBackProp branch is extracted and converted into an image. Together with the original image, this image is sent to the visualization endpoint.

You can also use this implementation during testing by performing the exact same steps as the `VisualBackpropPlotter`; you just don't need to send the resulting image to the endpoint, but can save it to disk instead.
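Conceptually, the callback's work (and the testing variant that saves to disk) boils down to the following sketch, built on the standard `mx.mod.Module` API; `render_visualization` and its arguments are hypothetical names, not the repo's actual code:

```python
import numpy as np
import mxnet as mx


def render_visualization(train_module, visualization, sample_image):
    # `visualization` is the symbol returned by
    # insights.build_visual_backprop_symbol (step 2 above); `sample_image`
    # is a numpy array of shape (1, channels, height, width). This function
    # and its signature are hypothetical.
    vis_module = mx.mod.Module(visualization, data_names=('data',), label_names=None)
    vis_module.bind(data_shapes=[('data', sample_image.shape)], for_training=False)

    # Copy the current parameters from the training module ...
    arg_params, aux_params = train_module.get_params()
    vis_module.set_params(arg_params, aux_params, allow_missing=True)

    # ... perform a forward pass with the given input data ...
    vis_module.forward(mx.io.DataBatch(data=[mx.nd.array(sample_image)]))
    mask = vis_module.get_outputs()[0].asnumpy()[0, 0]

    # ... and normalize the output of the VisualBackProp branch into an
    # 8-bit grayscale image.
    mask = (255 * (mask - mask.min()) / (np.ptp(mask) + 1e-8)).astype(np.uint8)
    # During training this image is sent to the endpoint; when testing you
    # can simply save it to disk instead.
    return mask
```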
# License
This code is licensed under the GPLv3 license.