# Dissimilarity Mixture Autoencoder for Deep Clustering
TensorFlow implementation of the Dissimilarity Mixture Autoencoder:
* Juan S. Lara and Fabio A. González, ["Dissimilarity Mixture Autoencoder for Deep Clustering"](https://arxiv.org/abs/2006.08177), arXiv preprint arXiv:2006.08177 (2020).
## Abstract
The dissimilarity mixture autoencoder (DMAE) is a neural network model for feature-based clustering that incorporates a flexible dissimilarity function and can be integrated into any kind of deep learning architecture. It internally represents a dissimilarity mixture model (DMM) that extends classical methods like Bregman clustering to any convex and differentiable dissimilarity function, through the reinterpretation of probabilistic notions as neural network components. Likewise, it leverages unsupervised representation learning, allowing simultaneous learning of the clusters and the neural network's parameters. Experimental evaluation was performed on image and text clustering benchmark datasets, showing that DMAE is competitive in terms of unsupervised classification accuracy and normalized mutual information.
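As one concrete reading of the mixture (a hedged sketch in generic notation, not the paper's exact formulation): given a convex, differentiable dissimilarity $d$, cluster parameters $\boldsymbol{\theta}_k$, and mixing weights $\pi_k$, the soft assignment of a point $\mathbf{x}$ to cluster $k$ can be written as a softmax over negative dissimilarities:
```latex
% Hedged sketch of a dissimilarity-based soft assignment; the symbol
% names (alpha, pi_k, theta_k) are generic, not taken from the paper.
s_k(\mathbf{x}) =
  \frac{\pi_k \exp\left(-\alpha\, d(\mathbf{x}, \boldsymbol{\theta}_k)\right)}
       {\sum_{j=1}^{K} \pi_j \exp\left(-\alpha\, d(\mathbf{x}, \boldsymbol{\theta}_j)\right)},
  \qquad \alpha > 0.
```
For the squared Euclidean distance this recovers the familiar Gaussian-mixture responsibilities, which is the sense in which Bregman clustering appears as a special case.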
## Usage and Documentation
You can check the official `dmae` [documentation](https://dmae.readthedocs.io/en/latest/index.html).
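The documentation describes the actual layers and signatures; as a taste of the mechanism only, the following is a minimal from-scratch Keras sketch of a dissimilarity-based encoder (the class name `SoftAssignment` and the Euclidean choice of $d$ are illustrative assumptions, not the `dmae` API):
```python
import tensorflow as tf

class SoftAssignment(tf.keras.layers.Layer):
    """Illustrative DMAE-style encoder: soft cluster assignments from a
    differentiable dissimilarity (squared Euclidean in this sketch)."""

    def __init__(self, n_clusters, alpha=1.0, **kwargs):
        super().__init__(**kwargs)
        self.n_clusters = n_clusters
        self.alpha = alpha  # softmax inverse temperature

    def build(self, input_shape):
        # Trainable cluster parameters (centroids, for the Euclidean case).
        self.theta = self.add_weight(
            name="theta",
            shape=(self.n_clusters, int(input_shape[-1])),
            initializer="random_normal",
            trainable=True,
        )

    def call(self, x):
        # Pairwise squared Euclidean dissimilarities d(x_i, theta_k).
        d = tf.reduce_sum(tf.square(tf.expand_dims(x, 1) - self.theta), axis=-1)
        # Soft assignments: softmax over negative (scaled) dissimilarities.
        return tf.nn.softmax(-self.alpha * d, axis=-1)
```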
## Gallery and Examples
* Deep architecture
* Clustering examples
* Probabilistic interpretations
These examples and the paper replication experiments can be found in the [examples](https://github.com/juselara1/dmae/tree/main/examples) folder.
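To illustrate the "any architecture" claim from the abstract (again a sketch under the same assumptions, reusing the illustrative `SoftAssignment` layer above, not the library's API): decoding the assignments back through the cluster parameters yields a small autoencoder that trains end-to-end with a plain reconstruction loss on synthetic data.
```python
import tensorflow as tf
# Reuses the illustrative SoftAssignment layer sketched above.

inputs = tf.keras.Input(shape=(2,))
encoder = SoftAssignment(n_clusters=3, alpha=10.0)
s = encoder(inputs)                    # (batch, 3) soft assignments
x_hat = tf.matmul(s, encoder.theta)    # decode: assignment-weighted centroids

model = tf.keras.Model(inputs, x_hat)
model.compile(optimizer="adam", loss="mse")

# Synthetic data: three well-separated blobs (for illustration only).
centers = tf.constant([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]])
x = tf.repeat(centers, 100, axis=0) + tf.random.normal((300, 2))
model.fit(x, x, epochs=10, verbose=0)
```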
## Installation
You can install `dmae` from PyPI using `pip`, by building from source, or by pulling a preconfigured Docker image.
### PyPI
To install `dmae` using `pip`, run:
```sh
pip install dmae
```
*(optional) If you have an environment with the NVIDIA drivers and CUDA, you can instead run:*
```sh
pip install dmae-gpu
```
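Either way, a quick import check confirms the package is visible to the interpreter (a simple sanity check, assuming a standard Python environment):
```sh
python -c "import dmae"
```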
### Source
You can clone this repository:
```sh
git clone https://github.com/juselara1/dmae.git
```
Install the requirements:
```sh
pip install -r requirements.txt
```
*(optional) If you have an environment with the NVIDIA drivers and CUDA, you can instead run:*
```sh
pip install -r requirements-gpu.txt
```
Finally, you can install `dmae` via setuptools:
```sh
pip install --no-deps .
```
### Docker
You can pull a preconfigured Docker image with `dmae` from Docker Hub:
```sh
docker pull juselara/dmae:latest
```
*(optional) If you have an environment with the NVIDIA drivers installed, you can instead run:*
```sh
docker pull juselara/dmae:latest-gpu
```
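Once pulled, the image can be run interactively; the invocation below is only a sketch (the mount point `/workspace` and the `bash` entry command are assumptions, not documented properties of the image):
```sh
# Illustrative: interactive shell with the current directory mounted.
docker run --rm -it -v "$(pwd)":/workspace juselara/dmae:latest bash
```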
## Citation
```bibtex
@misc{lara2020dissimilarity,
    title={Dissimilarity Mixture Autoencoder for Deep Clustering},
    author={Juan S. Lara and Fabio A. González},
    year={2020},
    eprint={2006.08177},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```