https://github.com/fwcd/mini-ml
Computation graphs, automatic differentiation and machine learning for Kotlin
- Host: GitHub
- URL: https://github.com/fwcd/mini-ml
- Owner: fwcd
- License: mit
- Created: 2019-02-19T20:34:50.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2019-08-06T00:01:25.000Z (almost 7 years ago)
- Last Synced: 2026-02-15T16:40:08.651Z (3 months ago)
- Topics: autograd, computation-graph, gradient, machine-learning, tensor
- Language: Kotlin
- Homepage:
- Size: 226 KB
- Stars: 4
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# MiniML
A lightweight computation graph library for machine learning written in pure Kotlin.
MiniML optimizes mathematical expressions by backpropagating gradients through a dynamically created expression graph.
## Example
Training a linear model:
```kotlin
import java.io.File

val x = placeholder(zeros())
val w = variable(randoms(-10.0, 10.0))
val b = variable(randoms(-10.0, 10.0))

val expected = x * 3.0
val output = (w * x) + b
val cost = (expected - output).square()
val learningRate = 0.0001

val writer = File("cost.dat")
    .also { it.createNewFile() }
    .printWriter()

for (i in 0 until 2000) {
    x.value = randoms(-10.0, 10.0)
    val currentCost = cost.forward()
    cost.clearGradients()
    cost.backward()
    w.apply(-w.gradient!! * learningRate)
    b.apply(-b.gradient!! * learningRate)
    writer.println(currentCost)
}

writer.close()
```
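The forward/backward pattern in the loop above can be illustrated with a minimal scalar reverse-mode autodiff, a sketch of the general idea rather than MiniML's actual implementation (the `Node` class and its fields are hypothetical names, not part of the library):

```kotlin
// Minimal scalar reverse-mode autodiff sketch (not MiniML's API).
// Each node records its children together with the local derivative
// of its value with respect to each child.
class Node(
    val value: Double,
    private val children: List<Pair<Node, Double>> = emptyList()
) {
    var gradient = 0.0

    // d(a + b)/da = 1, d(a + b)/db = 1
    operator fun plus(other: Node) =
        Node(value + other.value, listOf(this to 1.0, other to 1.0))

    // d(a * b)/da = b, d(a * b)/db = a
    operator fun times(other: Node) =
        Node(value * other.value, listOf(this to other.value, other to this.value))

    // Propagate the incoming gradient to all children via the chain rule.
    fun backward(seed: Double = 1.0) {
        gradient += seed
        for ((child, localGrad) in children) {
            child.backward(seed * localGrad)
        }
    }
}
```

For `y = (w * x) + b` with `w = 2`, `x = 3`, `b = 1`, calling `y.backward()` accumulates `w.gradient == 3.0` (since dy/dw = x), `x.gradient == 2.0`, and `b.gradient == 1.0`, which is exactly the information the training loop uses to update `w` and `b`.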

In addition to operator overloading, MiniML also supports modern Kotlin features such as destructuring declarations:
```kotlin
val (rowA, rowB) = matrixOf(
    rowOf(2.0, 5.3, 7.8),
    rowOf(0.0, 0.9, 1.1)
)
val (cellA, cellB, cellC) = rowA
```
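Destructuring like this works because Kotlin desugars `val (a, b, c) = row` into calls to `component1()`, `component2()`, and `component3()`. A minimal sketch of how a row type could opt in (the `Row` class here is hypothetical, not MiniML's actual class):

```kotlin
// Hypothetical row type; destructuring is enabled by providing
// operator functions componentN() for each position.
class Row(private val cells: List<Double>) {
    operator fun component1() = cells[0]
    operator fun component2() = cells[1]
    operator fun component3() = cells[2]
}

val (a, b, c) = Row(listOf(2.0, 5.3, 7.8))
// a, b, and c now hold the three cell values.
```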