Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/denosaurs/netsaur
Powerful machine learning, accelerated by WebGPU
- Host: GitHub
- URL: https://github.com/denosaurs/netsaur
- Owner: denosaurs
- License: mit
- Created: 2021-05-31T16:24:05.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2024-03-04T20:11:54.000Z (8 months ago)
- Last Synced: 2024-04-14T01:00:47.901Z (7 months ago)
- Topics: ai, artificial-intelligence, deep-learning, deep-neural-networks, deno, edge-computing, gpu-acceleration, gpu-computing, hacktoberfest, machine-learning, ml, neural-network, rust, safetensors, serverless, typescript, wasm, webassembly, webgpu
- Language: Rust
- Homepage: https://deno.land/x/netsaur
- Size: 26.6 MB
- Stars: 192
- Watchers: 7
- Forks: 5
- Open Issues: 6
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
Awesome Lists containing this project
- awesome-deno - netsaur - Powerful machine learning, accelerated by WebGPU (Modules / Machine learning)
README
Netsaur
## Powerful Machine Learning library for Deno
## Installation
There is no installation step required. You can simply import the library and
you're good to go :)
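For example, a single import is all it takes; Deno resolves and caches the module from JSR on first run. The named exports below are the same ones used in the QuickStart:

```typescript
// No install step: Deno fetches and caches the module on first use.
import { CPU, Sequential, setupBackend } from "jsr:@denosaurs/netsaur";
```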
## Features

- Lightweight and easy-to-use neural network library for
  [Deno](https://deno.com).
- Blazingly fast and efficient.
- Provides a simple API for creating and training neural networks.
- Can run on both the CPU and the GPU (WIP).
- Allows you to simply run the code without downloading any prior dependencies.
- Perfect for serverless environments.
- Allows you to quickly build and deploy machine learning models for a variety
of applications with just a few lines of code.
- Suitable for both beginners and experienced machine learning practitioners.

### Backends
- [CPU](./src/backends/cpu/) - Native backend written in Rust.
- [WASM](./src/backends/wasm/) - WebAssembly backend written in Rust.
- [GPU](./src/backends/gpu/) (TODO)
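Because the CPU backend relies on native bindings while the WASM backend runs anywhere, you may want to choose a backend at startup. A minimal sketch, assuming you prefer the native backend under the Deno CLI and fall back to WASM elsewhere (the runtime check is an illustrative assumption, not part of the netsaur API):

```typescript
import { CPU, setupBackend, WASM } from "jsr:@denosaurs/netsaur";

// Prefer the fast native CPU backend when running under the Deno CLI;
// fall back to the portable WASM backend elsewhere (e.g. edge runtimes).
// This runtime check is an illustrative assumption, not netsaur API.
const isDenoCli = typeof Deno !== "undefined" && Boolean(Deno.version?.deno);
await setupBackend(isDenoCli ? CPU : WASM);
```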
### Examples

- XOR ([CPU](./examples/xor_cpu.ts), [WASM](./examples/xor_wasm.ts))
- Linear Regression ([CPU](./examples/linear_cpu.ts),
  [WASM](./examples/linear_wasm.ts))
- Filters ([CPU](./examples/filters/conv.ts),
  [WASM](./examples/filters/conv_wasm.ts))
- MNIST ([CPU](./examples/mnist), [WASM](./examples/mnist))

### Maintainers
- Dean Srebnik ([@load1n9](https://github.com/load1n9))
- CarrotzRule ([@carrotzrule123](https://github.com/CarrotzRule123))
- Pranev ([@retraigo](https://github.com/retraigo))

### QuickStart
This example shows how to train a neural network to predict the output of the
XOR function using our speedy CPU backend written in
[Rust](https://www.rust-lang.org/).

```typescript
import {
  Cost,
  CPU,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor1D,
  tensor2D,
} from "jsr:@denosaurs/netsaur";

/**
 * Setup the CPU backend. This backend is fast but doesn't work on the Edge.
 */
await setupBackend(CPU);

/**
 * Creates a sequential neural network.
 */
const net = new Sequential({
  /**
   * The number of minibatches is set to 4 and the input size is set to 2.
   */
  size: [4, 2],

  /**
   * The silent option is set to true, which means that the network will not
   * output any logs during training.
   */
  silent: true,

  /**
   * Defines the layers of the neural network for the XOR function example.
   * The neural network has two input neurons and one output neuron.
   * The layers are defined as follows:
   * - A dense layer with 3 neurons.
   * - A sigmoid activation layer.
   * - A dense layer with 1 neuron.
   * - A sigmoid activation layer.
   */
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],

  /**
   * The cost function used for training the network is the mean squared error (MSE).
   */
  cost: Cost.MSE,
});

/**
 * Train the network on the given data.
 */
net.train(
  [
    {
      inputs: tensor2D([
        [0, 0],
        [1, 0],
        [0, 1],
        [1, 1],
      ]),
      outputs: tensor2D([[0], [1], [1], [0]]),
    },
  ],
  /**
   * The number of iterations is set to 10000.
   */
  10000,
);

/**
 * Predict the output of the XOR function for the given inputs.
 */
const out1 = (await net.predict(tensor1D([0, 0]))).data;
console.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);

const out2 = (await net.predict(tensor1D([1, 0]))).data;
console.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);

const out3 = (await net.predict(tensor1D([0, 1]))).data;
console.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);

const out4 = (await net.predict(tensor1D([1, 1]))).data;
console.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);
```
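Saved as a script, this runs directly with the Deno CLI, e.g. `deno run -A examples/xor_cpu.ts` from the repository root; `-A` grants all permissions, which is the simplest choice since the native CPU backend loads a Rust library via FFI.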
#### Use the WASM Backend

By changing the CPU backend to the WASM backend we sacrifice some speed, but
this allows us to run on the edge.

```typescript
import {
  Cost,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor1D,
  tensor2D,
  WASM,
} from "jsr:@denosaurs/netsaur";

/**
 * Setup the WASM backend. This backend is slower than the CPU backend but
 * works on the Edge.
 */
await setupBackend(WASM);

/**
 * Creates a sequential neural network.
 */
const net = new Sequential({
  /**
   * The number of minibatches is set to 4 and the input size is set to 2.
   */
  size: [4, 2],

  /**
   * The silent option is set to true, which means that the network will not
   * output any logs during training.
   */
  silent: true,

  /**
   * Defines the layers of the neural network for the XOR function example.
   * The neural network has two input neurons and one output neuron.
   * The layers are defined as follows:
   * - A dense layer with 3 neurons.
   * - A sigmoid activation layer.
   * - A dense layer with 1 neuron.
   * - A sigmoid activation layer.
   */
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],

  /**
   * The cost function used for training the network is the mean squared error (MSE).
   */
  cost: Cost.MSE,
});

/**
 * Train the network on the given data.
 */
net.train(
  [
    {
      inputs: tensor2D([
        [0, 0],
        [1, 0],
        [0, 1],
        [1, 1],
      ]),
      outputs: tensor2D([[0], [1], [1], [0]]),
    },
  ],
  /**
   * The number of iterations is set to 10000.
   */
  10000,
);

/**
 * Predict the output of the XOR function for the given inputs.
 */
const out1 = (await net.predict(tensor1D([0, 0]))).data;
console.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);

const out2 = (await net.predict(tensor1D([1, 0]))).data;
console.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);

const out3 = (await net.predict(tensor1D([0, 1]))).data;
console.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);

const out4 = (await net.predict(tensor1D([1, 1]))).data;
console.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);
```

### Documentation
The full documentation for Netsaur can be found
[here](https://deno.land/x/netsaur/mod.ts).

### License
Netsaur is licensed under the MIT License.