Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Simple Keras implementation of Triplet-Center Loss on the MNIST dataset
- Host: GitHub
- URL: https://github.com/popcornell/keras-triplet-center-loss
- Owner: popcornell
- License: gpl-3.0
- Created: 2019-05-28T08:51:31.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2019-07-03T17:45:08.000Z (over 5 years ago)
- Last Synced: 2023-03-06T02:39:02.519Z (over 1 year ago)
- Topics: center-loss, keras, machine-learning, mnist, tensorflow, triplet-loss
- Language: Python
- Homepage:
- Size: 4.34 MB
- Stars: 42
- Watchers: 5
- Forks: 11
- Open Issues: 4
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# keras-triplet-center-loss
A simple Keras implementation of Triplet-Center Loss on the MNIST dataset.
For reference, this repository also includes implementations of two similar losses, Center Loss and Triplet Loss.
The Center-Loss implementation is from **shamangary**: https://github.com/shamangary/Keras-MNIST-center-loss-with-visualization
The Triplet-Loss implementation is from **KinWaiCheuk**: https://github.com/KinWaiCheuk/Triplet-net-keras
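For orientation, here is a rough TensorFlow sketch of those two reference losses in their usual textbook form; this is illustrative only, and the bundled implementations linked above differ in their details:

```python
import tensorflow as tf

def center_loss(embeddings, labels, centers):
    """Center loss: pull each embedding towards the center of its own class."""
    own_centers = tf.gather(centers, labels)   # (batch, dim) center of each sample's class
    return 0.5 * tf.reduce_mean(tf.reduce_sum(tf.square(embeddings - own_centers), axis=-1))

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet loss: keep the anchor closer to the positive than to the negative by a margin."""
    d_pos = tf.reduce_sum(tf.square(anchor - positive), axis=-1)
    d_neg = tf.reduce_sum(tf.square(anchor - negative), axis=-1)
    return tf.reduce_mean(tf.maximum(d_pos - d_neg + margin, 0.0))
```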
------

### Triplet-Center Loss

Triplet-Center Loss was introduced by He et al. in https://arxiv.org/abs/1803.06189.
It is a "hybrid" of Center Loss and Triplet Loss that maximises inter-class distance while minimizing intra-class distance.
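Concretely, for each sample the distance to its own class center should be smaller, by a margin, than the distance to the nearest center of any other class. A rough TensorFlow sketch of this formulation (illustrative, not the repository's exact code):

```python
import tensorflow as tf

def triplet_center_loss(embeddings, labels, centers, margin=1.0):
    """Sketch of the triplet-center loss of He et al. (2018).

    embeddings: (batch, dim) float tensor of network features
    labels:     (batch,) integer class ids
    centers:    (num_classes, dim) trainable class centers
    """
    # Squared Euclidean distance from every embedding to every class center.
    diff = tf.expand_dims(embeddings, 1) - tf.expand_dims(centers, 0)   # (batch, classes, dim)
    dists = tf.reduce_sum(tf.square(diff), axis=-1)                     # (batch, classes)

    num_classes = tf.shape(centers)[0]
    same_class = tf.one_hot(labels, num_classes)                        # 1.0 at each sample's class

    # Distance to the sample's own center vs. the nearest center of any other class.
    pos = tf.reduce_sum(dists * same_class, axis=-1)
    neg = tf.reduce_min(dists + same_class * 1e9, axis=-1)

    # Hinge: the own center must be at least `margin` closer than any other center.
    return tf.reduce_mean(tf.maximum(pos + margin - neg, 0.0))
```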
### Details

This repository shows a simple implementation on MNIST or, alternatively, Fashion MNIST.
Running **main.py** sequentially starts 4 training routines with 4 different losses (a minimal sketch of how such a joint loss can be wired into a Keras model follows the list):
* Categorical Crossentropy only
* Center-loss + Categorical Crossentropy
* Triplet-loss + Categorical Crossentropy
* Triplet-Center loss + Categorical Crossentropy
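A minimal, hypothetical sketch of how an auxiliary embedding loss such as the triplet-center term can be trained jointly with categorical crossentropy in Keras; the architecture, layer names, 2-D embedding size and 0.1 weight are assumptions, and `triplet_center_loss` refers to the sketch above, not to the code in **main.py**:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES, EMB_DIM = 10, 2   # a 2-D embedding keeps the visualizations easy to read

class TripletCenterLayer(layers.Layer):
    """Owns the class centers and adds the triplet-center term via add_loss (illustrative)."""
    def __init__(self, num_classes, emb_dim, margin=1.0, weight=0.1, **kwargs):
        super().__init__(**kwargs)
        self.margin, self.weight = margin, weight
        self.centers = self.add_weight(name='centers', shape=(num_classes, emb_dim),
                                       initializer='zeros', trainable=True)

    def call(self, inputs):
        embeddings, onehot = inputs
        labels = tf.argmax(onehot, axis=-1)
        self.add_loss(self.weight *
                      triplet_center_loss(embeddings, labels, self.centers, self.margin))
        return embeddings

images = layers.Input((28, 28, 1))
onehot = layers.Input((NUM_CLASSES,))                  # one-hot labels fed as a second input
x = layers.Dense(128, activation='relu')(layers.Flatten()(images))
emb = layers.Dense(EMB_DIM, name='embeddings')(x)
emb = TripletCenterLayer(NUM_CLASSES, EMB_DIM)([emb, onehot])
probs = layers.Dense(NUM_CLASSES, activation='softmax')(emb)

model = Model([images, onehot], probs)
model.compile('adam', loss='categorical_crossentropy', metrics=['accuracy'])
# model.fit([x_train, y_train_onehot], y_train_onehot, epochs=10, batch_size=128)
```

Feeding the one-hot labels as a second input lets the layer compute the auxiliary term inside the graph via `add_loss`, while `compile` only sees the crossentropy on the softmax output.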
The results of those models, including TensorBoard summaries, are written to the **runs** folder.
t-SNE is also run on the embeddings to visualize how the network's internal representation changes as the loss changes (a rough sketch of this step is given after the plots).

----
##### triplet-center loss, T-SNE on internal representation (Train Data):
![Image of Triplet_Center_Loss](https://github.com/popcornell/keras-triplet-center-loss/blob/master/runs/triplet_center_loss/Samples%20from%20Train%20Data%2C%20triplet_center_loss.png)
---
##### Center loss, T-SNE on internal representation (Train Data):
![Image of _Center_Loss](https://github.com/popcornell/keras-triplet-center-loss/blob/master/runs/center_loss/Samples%20from%20Train%20Data%2C%20center_loss.png)
----
##### Triplet loss, T-SNE on internal representation (Train Data):
![Image of _Triplet_Loss](https://github.com/popcornell/keras-triplet-center-loss/blob/master/runs/triplet_loss/Samples%20from%20Train%20Data%2C%20triplet_loss.png)
As can be seen, the triplet-center loss maximises inter-class distance like the Triplet Loss, while keeping the Center Loss's property of minimizing intra-class distance.
Another advantage of the Triplet-Center loss is that, unlike the Triplet Loss, it does not require advanced batching and triplet-mining techniques.
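The t-SNE projections above can be reproduced roughly as follows, assuming the trained two-input `model` and the `embeddings` layer name from the earlier sketch; the subset size and plot title are illustrative:

```python
import matplotlib.pyplot as plt
import tensorflow as tf
from sklearn.manifold import TSNE
from tensorflow.keras import Model

# Load a subset of MNIST and one-hot encode the labels (as in the training sketch above).
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_sub = x_train[:2000, ..., None].astype('float32') / 255.0
y_sub = tf.keras.utils.to_categorical(y_train[:2000], 10)

# Extract the embeddings from the trained model of the earlier sketch.
embedding_model = Model(model.inputs, model.get_layer('embeddings').output)
feats = embedding_model.predict([x_sub, y_sub], batch_size=256)

# Project to 2-D with t-SNE and colour the points by class.
proj = TSNE(n_components=2).fit_transform(feats)
plt.scatter(proj[:, 0], proj[:, 1], c=y_train[:2000], cmap='tab10', s=5)
plt.title('Samples from Train Data, t-SNE of embeddings')
plt.show()
```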