Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/retraigo/la-classy
Machine Learning Module for Single Layer Perceptron ML models, written in Rust for Typescript.
classification deno machine-learning regression rust typescript
Last synced: 2 months ago
- Host: GitHub
- URL: https://github.com/retraigo/la-classy
- Owner: retraigo
- License: mit
- Created: 2023-05-23T05:30:17.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-04-01T06:10:58.000Z (9 months ago)
- Last Synced: 2024-08-04T00:05:37.261Z (4 months ago)
- Topics: classification, deno, machine-learning, regression, rust, typescript
- Language: Rust
- Homepage:
- Size: 713 KB
- Stars: 3
- Watchers: 2
- Forks: 1
- Open Issues: 2
- Metadata Files:
  - Readme: README.md
  - Funding: .github/FUNDING.yml
  - License: LICENSE
Awesome Lists containing this project
- awesome-deno - classy-lala - Single-layer perceptrons for supervised learning tasks. (Modules / Machine learning)
README
# La Classy
Single Layer Perceptron (SLP) library for Deno.
This library is written in TypeScript and Rust, and uses Deno's FFI to call into the Rust core.
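To give a sense of what "uses FFI" means here, below is a rough sketch of the general Deno FFI mechanism the library builds on. It is not La Classy's actual binding layer: the library path and the `predict_slp` symbol are hypothetical names used only for illustration.

```ts
// Illustrative sketch of Deno FFI, the mechanism La Classy builds on.
// "./libclassy.so" and the "predict_slp" symbol are hypothetical names,
// not the library's real exports. Run with --allow-ffi.
const lib = Deno.dlopen("./libclassy.so", {
  predict_slp: {
    parameters: ["buffer", "usize"], // pointer to an f64 feature buffer, its length
    result: "f64",                   // a single prediction
  },
});

const features = new Float64Array([1.0, 2.0, 3.0]);
const prediction = lib.symbols.predict_slp(features, features.length);
console.log(prediction);

lib.close();
```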
## Why Classy?
- It's fast.
- It gives you some freedom to experiment with different combinations of loss functions, activation functions, etc.
- It's easy to use.

## Features
- Optimization Algorithms:
  - Gradient Descent
  - Stochastic Average Gradients
  - Ordinary Least Squares
- Optimizers for updating weights:
  - RMSProp
  - ADAM
- Schedulers for learning rate:
  - One-cycle Scheduler
  - Decay
- Regularization
- Activation Functions:
  - Linear (regression, SVM, etc.)
  - Sigmoid (logistic regression)
  - Softmax (multinomial logistic regression)
  - Tanh (it's just there)
- Loss Functions:
  - Mean Squared Error (regression)
  - Mean Absolute Error (regression)
  - Cross-Entropy (multinomial classification)
  - Binary Cross-Entropy / Logistic Loss (binary classification)
  - Hinge Loss (binary classification, SVM)

## Quick Example
### Regression
```ts
import { Matrix } from "jsr:@lala/[email protected]";
import {
  GradientDescentSolver,
  adamOptimizer,
  huber,
} from "jsr:@lala/[email protected]";

const x = [100, 23, 53, 56, 12, 98, 75];
const y = x.map((a) => [a * 6 + 13, a * 4 + 2]);

const solver = new GradientDescentSolver({
  // Huber loss is a mix of MSE and MAE
  loss: huber(),
  // ADAM optimizer with 1 + 1 input for intercept, 2 outputs.
  optimizer: adamOptimizer(2, 2),
});

// Train for 700 epochs in 2 minibatches
solver.train(
  new Matrix(
    x.map((n) => [n]),
    "f32"
  ),
  new Matrix(y, "f32"),
  { silent: false, fit_intercept: true, epochs: 700, n_batches: 2 }
);

const res = solver.predict(
  new Matrix(
    x.map((n) => [n]),
    "f32"
  )
);

for (let i = 0; i < res.nRows; i += 1) {
  console.log(Array.from(res.row(i)), y[i]);
}
```

There are other examples in [retraigo/deno-ml](https://github.com/retraigo/deno-ml).
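As a quick sanity check on the predictions above, the first two loss functions from the feature list (Mean Squared Error and Mean Absolute Error) can be computed by hand. This is just a sketch: it assumes the `res` and `y` variables from the previous snippet are in scope and uses only the `nRows` and `row(i)` accessors already shown there.

```ts
// Hand-rolled MSE and MAE over the predictions from the example above.
// Assumes `res` (the predicted Matrix) and `y` (the targets) are in scope.
let squared = 0;
let absolute = 0;
let count = 0;

for (let i = 0; i < res.nRows; i += 1) {
  const predicted = Array.from(res.row(i));
  for (let j = 0; j < predicted.length; j += 1) {
    const error = predicted[j] - y[i][j];
    squared += error * error;
    absolute += Math.abs(error);
    count += 1;
  }
}

console.log("MSE:", squared / count);
console.log("MAE:", absolute / count);
```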
## Documentation
[JSR](https://jsr.io/@lala/classy)
## Maintainers
Pranev ([retraigo](https://github.com/retraigo))
Discord: [Kuro's ~~Chaos Abyss~~ Graveyard](https://discord.gg/A69vvdK)