https://github.com/gugarosa/dropout_rbm
📄 Official implementation regarding the paper "Fine-Tuning Dropout Regularization in Energy-Based Deep Learning".
- Host: GitHub
- URL: https://github.com/gugarosa/dropout_rbm
- Owner: gugarosa
- License: MIT
- Created: 2021-03-21T14:34:47.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2022-10-06T12:59:03.000Z (about 3 years ago)
- Last Synced: 2024-10-18T07:39:50.650Z (about 1 year ago)
- Topics: dropout, fine-tuning, implementation, paper, rbm
- Language: Python
- Size: 16.6 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Fine-Tuning Dropout Regularization in Energy-Based Deep Learning
*This repository holds all the necessary code to reproduce the experiments described in the paper "Fine-Tuning Dropout Regularization in Energy-Based Deep Learning".*
---
## References
If our work is useful to you, please cite us:
```BibTeX
@inproceedings{deRosa:21,
  author={de Rosa, Gustavo H. and Roder, Mateus and Papa, Jo{\~a}o P.},
  editor={Tavares, Jo{\~a}o Manuel R. S. and Papa, Jo{\~a}o Paulo and Gonz{\'a}lez Hidalgo, Manuel},
  title={Fine-Tuning Dropout Regularization in Energy-Based Deep Learning},
  booktitle={Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications},
  year={2021},
  publisher={Springer International Publishing},
  address={Cham},
  pages={99--108},
  isbn={978-3-030-93420-0}
}
```

---
## Structure
* `models`: Holds the output history and model files.
* `utils`
* `loader.py`: Utility to load datasets and split them into training, validation and testing sets (see the sketch after this list);
* `objects.py`: Wraps objects instantiation for command line usage;
* `opt.py`: Wraps the optimization pipeline;
* `target.py`: Wraps the optimization target.
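
As an illustration of the splitting step, here is a minimal sketch of a loader-style utility built on `torch.utils.data.random_split`; the function name and default ratios are hypothetical, not the actual contents of `loader.py`:

```Python
import torch
from torch.utils.data import random_split

def split_dataset(dataset, val_ratio=0.1, test_ratio=0.1, seed=42):
    # Derives the absolute split sizes from the requested ratios
    n_val = int(len(dataset) * val_ratio)
    n_test = int(len(dataset) * test_ratio)
    n_train = len(dataset) - n_val - n_test

    # A fixed generator keeps the split reproducible across runs
    generator = torch.Generator().manual_seed(seed)
    return random_split(dataset, [n_train, n_val, n_test], generator=generator)
```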
---

## Package Guidelines
### Installation
Install all the required packages using:
```Bash
pip install -r requirements.txt
```

*If you encounter any problems with the automatic installation of the [learnergy](https://github.com/gugarosa/learnergy) package, contact us.*
### Data configuration
To run the experiments, you can use `torchvision` to load pre-implemented datasets.
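
For instance, a minimal sketch that loads MNIST (an illustrative choice; the dataset and the `./data` folder are assumptions, not fixed by the repository):

```Python
import torchvision

# Downloads MNIST to ./data and converts the images to [0, 1] tensors
transform = torchvision.transforms.ToTensor()
train_set = torchvision.datasets.MNIST(root='./data', train=True, download=True, transform=transform)
test_set = torchvision.datasets.MNIST(root='./data', train=False, download=True, transform=transform)
```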
---
## Usage
### Model Optimization
The experiment is conducted by optimizing an architecture and post-evaluating it. To accomplish such a step, one needs to use the following script:
```Bash
python optimization.py -h
```

*Note that `-h` invokes the script helper, which assists users in employing the appropriate parameters.*
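
To give a feel for what is being optimized, here is a hypothetical sketch of a target function that maps a candidate dropout probability to a validation reconstruction error. It assumes that learnergy exposes a `DropoutRBM` whose constructor takes a `dropout` argument and whose `fit`/`reconstruct` methods return mean squared errors; it is not the actual contents of `target.py`:

```Python
from learnergy.models.bernoulli import DropoutRBM

def reconstruction_target(dropout_rate, train_set, val_set):
    # Builds a Bernoulli RBM whose hidden units are dropped with the candidate rate
    model = DropoutRBM(n_visible=784, n_hidden=128, steps=1,
                       learning_rate=0.1, dropout=dropout_rate)

    # Trains for a few epochs, then measures how well the validation set is reconstructed
    model.fit(train_set, batch_size=128, epochs=5)
    mse, _ = model.reconstruct(val_set)

    # A lower reconstruction error indicates a better dropout rate
    return mse
```

An optimizer would then search the interval [0, 1] for the dropout rate that minimizes this target.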
### Test Reconstruction
Afterward, with the optimized Dropout parameter in hand, one can perform the final reconstruction over the testing set, as follows:
```Bash
python test_reconstruction.py -h
```
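
Conceptually, this final step boils down to something like the following sketch, which re-trains a model with the best dropout rate and evaluates it on the unseen testing set (again assuming the learnergy `DropoutRBM` interface and an illustrative dataset; the actual logic lives in `test_reconstruction.py`):

```Python
import torchvision
from learnergy.models.bernoulli import DropoutRBM

# Hypothetical value: the dropout rate found by the optimization step
best_dropout = 0.5

# Illustrative dataset choice, as in the data configuration above
transform = torchvision.transforms.ToTensor()
train_set = torchvision.datasets.MNIST(root='./data', train=True, download=True, transform=transform)
test_set = torchvision.datasets.MNIST(root='./data', train=False, download=True, transform=transform)

# Re-trains with the optimized dropout rate and reconstructs the testing set
model = DropoutRBM(n_visible=784, n_hidden=128, dropout=best_dropout)
model.fit(train_set, batch_size=128, epochs=5)
test_mse, _ = model.reconstruct(test_set)
print(f'Testing reconstruction MSE: {test_mse}')
```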
### Bash Script
Instead of invoking every script to conduct the experiments, it is also possible to use the provided shell script, as follows:
```Bash
./pipeline.sh
```

Such a script will conduct every step needed to accomplish the experimentation used throughout the paper. Furthermore, one can change any input argument defined in the script.
---
## Support
We do our best, but it is inevitable that we make mistakes. If you ever need to report a bug or a problem, or just want to talk to us, please do so! We will be available at this repository.
---