Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/danaugrs/go-tsne
t-Distributed Stochastic Neighbor Embedding (t-SNE) in Go
3d data-science dimensionality-reduction go machine-learning tsne unsupervised-learning visualization
t-Distributed Stochastic Neighbor Embedding (t-SNE) in Go
- Host: GitHub
- URL: https://github.com/danaugrs/go-tsne
- Owner: danaugrs
- License: bsd-3-clause
- Created: 2018-09-03T21:27:53.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2023-12-10T17:52:11.000Z (about 1 year ago)
- Last Synced: 2024-06-21T18:52:29.080Z (7 months ago)
- Topics: 3d, data-science, dimensionality-reduction, go, machine-learning, tsne, unsupervised-learning, visualization
- Language: Go
- Homepage:
- Size: 90.2 MB
- Stars: 203
- Watchers: 12
- Forks: 24
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-golang-repositories - go-tsne - t-Distributed Stochastic Neighbor Embedding (t-SNE) in Go (Repositories)
README
# go-tsne
A Go implementation of [t-Distributed Stochastic Neighbor Embedding (t-SNE)](https://lvdmaaten.github.io/tsne/), a prize-winning technique for dimensionality reduction particularly well suited for visualizing high-dimensional datasets.
### Usage
Import this library:
```Go
import "github.com/danaugrs/go-tsne/tsne"
```
Create the TSNE object:
```Go
t := tsne.NewTSNE(2, 300, 100, 300, true)
```
The parameters are:
* Number of output dimensions
* Perplexity
* Learning rate
* Max number of iterations
* Verbosity

There are two ways to start the t-SNE embedding optimization. The regular way is to provide an `n` by `d` matrix where each row is a datapoint and each column is a dimension:
```Go
Y := t.EmbedData(X, nil)
```
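As a concrete starting point, here is a minimal, self-contained sketch (not taken from the README) that embeds a small random dataset. It assumes gonum's `mat` package for the input matrix, which is the matrix type used in the step-function example further down.
```Go
package main

import (
	"fmt"
	"math/rand"

	"github.com/danaugrs/go-tsne/tsne"
	"gonum.org/v1/gonum/mat"
)

func main() {
	// Toy dataset: 50 random datapoints with 10 dimensions each.
	n, d := 50, 10
	data := make([]float64, n*d)
	for i := range data {
		data[i] = rand.Float64()
	}
	X := mat.NewDense(n, d, data)

	// 2 output dimensions, perplexity 10, learning rate 100, 300 iterations, no verbosity.
	t := tsne.NewTSNE(2, 10, 100, 300, false)

	// Y has one row per input datapoint and 2 columns (the embedded coordinates).
	Y := t.EmbedData(X, nil)
	rows, cols := Y.Dims()
	fmt.Printf("embedding: %dx%d, first point: (%.3f, %.3f)\n", rows, cols, Y.At(0, 0), Y.At(0, 1))
}
```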
The alternative is to provide a distance matrix directly:
```Go
Y := t.EmbedDistances(D, nil)
```
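For illustration, a pairwise Euclidean distance matrix could be built from the same `X` as above with a small helper like the one below (a hypothetical sketch, not part of go-tsne, requiring the `math` and gonum `mat` imports; whether the library expects plain or squared distances is not stated in this README, so check the package documentation).
```Go
// distanceMatrix is a hypothetical helper: it returns the n x n matrix of
// Euclidean distances between the rows of X.
func distanceMatrix(X *mat.Dense) *mat.Dense {
	n, d := X.Dims()
	D := mat.NewDense(n, n, nil)
	for i := 0; i < n; i++ {
		for j := i + 1; j < n; j++ {
			var sum float64
			for k := 0; k < d; k++ {
				diff := X.At(i, k) - X.At(j, k)
				sum += diff * diff
			}
			dist := math.Sqrt(sum)
			D.Set(i, j, dist)
			D.Set(j, i, dist)
		}
	}
	return D
}
```
With such a helper in place, `Y := t.EmbedDistances(distanceMatrix(X), nil)` would produce an embedding just as `EmbedData` does.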
In either case, the returned matrix `Y` will contain the final embedding.

For more fine-grained control, a step function can be provided to either method:
```Go
Y := t.EmbedData(X, func(iter int, divergence float64, embedding mat.Matrix) bool {
fmt.Printf("Iteration %d: divergence is %v\n", iter, divergence)
return false
})
```
The step function has access to the iteration, the current divergence, and the embedding optimized so far. Return `true` to halt the optimization early, as sketched below.
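For instance, here is a minimal early-stopping sketch (not part of the library's README): it assumes the same `t` and `X` as above, plus the `math` and gonum `mat` imports, and stops once the divergence improvement falls below a hypothetical tolerance after the first 100 iterations.
```Go
prev := math.Inf(1)
Y := t.EmbedData(X, func(iter int, divergence float64, embedding mat.Matrix) bool {
	improvement := prev - divergence
	prev = divergence
	// Returning true halts the optimization; the iter > 100 guard skips the
	// early phase, where the divergence can still fluctuate.
	return iter > 100 && improvement < 1e-6
})
```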
### Examples
Two examples are provided - `mnist2d` and `mnist3d`. They both use the same data - a subset of [MNIST](http://yann.lecun.com/exdb/mnist/) with 2500 handwritten digits. `mnist2d` generates plots throughout the optimization process, and `mnist3d` shows the optimization happening in real-time, in 3D. `mnist3d` depends on [G3N](https://github.com/g3n/engine).
To run an example, `cd` to the example's directory, build it, and execute it, e.g.:
```
cd examples/mnist2d
go build
./mnist2d
```

### Support
I hope you enjoy using and learning from go-tsne as much as I enjoyed writing it.

If you come across any issues, please [report them](https://github.com/danaugrs/go-tsne/issues).