Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dyanni3/vae_oversampler
oversampling minority class using variational autoencoder
Last synced: about 1 month ago
- Host: GitHub
- URL: https://github.com/dyanni3/vae_oversampler
- Owner: dyanni3
- License: mit
- Created: 2019-07-26T20:06:46.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2019-07-29T20:23:27.000Z (over 5 years ago)
- Last Synced: 2024-12-15T17:44:38.311Z (about 1 month ago)
- Language: Python
- Size: 437 KB
- Stars: 0
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
Awesome Lists containing this project
README
# vae_oversampler
vae_oversampler provides an API similar to imblearn for oversampling the minority class of a dataset. Under the hood it uses Keras to build a variational autoencoder that learns the underlying data distribution, then samples from that distribution to generate synthetic minority examples.
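To make the idea concrete, here is a minimal, simplified sketch of the generative step. It fits a single Gaussian to the minority class and samples synthetic points from it; the actual package replaces this with a variational autoencoder that can learn a much richer distribution. All names here are illustrative, not the package's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def oversample_minority(X_min, n_needed, rng):
    # Simplified stand-in for the VAE's generative step: fit a
    # multivariate Gaussian to the minority class and draw synthetic
    # samples from it.
    mean = X_min.mean(axis=0)
    cov = np.cov(X_min, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_needed)

# Toy imbalanced setting: 10 minority samples, 90 synthetic ones needed
# to balance a 100-sample majority class.
X_minority = rng.normal(loc=5.0, scale=1.0, size=(10, 3))
X_synthetic = oversample_minority(X_minority, n_needed=90, rng=rng)
print(X_synthetic.shape)  # (90, 3)
```

The synthetic rows can then be appended to the training set alongside the original minority examples.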
### Tech
vae_oversampler uses a number of open source projects to work properly:
* [keras] - for deep learning, to build the variational autoencoder
* [sklearn] - primarily to standard-scale your data (optional)
* [numpy] - for numerical methods

And of course vae_oversampler itself is open source with a [public repository][vae_oversampler] on GitHub.
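The optional standard-scaling step mentioned above (zero mean, unit variance per feature, as sklearn's `StandardScaler` computes it) can be sketched in plain NumPy:

```python
import numpy as np

def standard_scale(X):
    # Per-feature standardization: subtract the mean and divide by the
    # standard deviation, guarding against constant (zero-variance) columns.
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0
    return (X - mean) / std

X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
X_scaled = standard_scale(X)
```

Scaling before fitting the autoencoder keeps features on comparable ranges, which generally makes training more stable.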
### Installation
vae_oversampler requires Keras to run. Install the dependencies, then install the package with pip:
```sh
$ pip install vae_oversampler
```

### Todos
- Write Tests
- Comply with PEP8
- Better error handling
- Add more options for how many samples to generate
- Get Travis working properly (with TensorFlow)

License
----

MIT
[//]: # (These are reference links used in the body of this note and get stripped out when the markdown processor does its job. There is no need to format nicely because it shouldn't be seen. Thanks SO - http://stackoverflow.com/questions/4823468/store-comments-in-markdown-syntax)
[vae_oversampler]: https://github.com/dyanni3/vae_oversampler
[keras]: https://keras.io
[sklearn]: https://scikit-learn.org
[numpy]: https://numpy.org