{"id":13800941,"url":"https://github.com/denosaurs/netsaur","last_synced_at":"2025-04-04T05:03:38.459Z","repository":{"id":43655354,"uuid":"372567011","full_name":"denosaurs/netsaur","owner":"denosaurs","description":"Powerful Machine Learning library with GPU, CPU and WASM backends","archived":false,"fork":false,"pushed_at":"2024-10-02T17:09:04.000Z","size":153361,"stargazers_count":228,"open_issues_count":6,"forks_count":4,"subscribers_count":8,"default_branch":"main","last_synced_at":"2024-10-30T01:22:24.765Z","etag":null,"topics":["ai","artificial-intelligence","deep-learning","deep-neural-networks","deno","edge-computing","gpu-acceleration","gpu-computing","hacktoberfest","machine-learning","ml","neural-network","rust","safetensors","serverless","typescript","wasm","webassembly","webgpu"],"latest_commit_sha":null,"homepage":"https://netsaur.deno.dev/","language":"Rust","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/denosaurs.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-05-31T16:24:05.000Z","updated_at":"2024-10-22T09:18:43.000Z","dependencies_parsed_at":"2023-01-23T01:46:06.757Z","dependency_job_id":"a69e8367-8e35-4f7b-b463-8b997fc0295c","html_url":"https://github.com/denosaurs/netsaur","commit_stats":{"total_commits":141,"total_committers":7,"mean_commits":"20.142857142857142","dds":0.5035460992907801,"last_synced_commit":"c1efc3e2df6e2aaf4a1672590a404143203885a6"},"previous_names":[],"tags_count":31,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/ho
sts/GitHub/repositories/denosaurs%2Fnetsaur","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/denosaurs%2Fnetsaur/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/denosaurs%2Fnetsaur/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/denosaurs%2Fnetsaur/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/denosaurs","download_url":"https://codeload.github.com/denosaurs/netsaur/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247122167,"owners_count":20887219,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","artificial-intelligence","deep-learning","deep-neural-networks","deno","edge-computing","gpu-acceleration","gpu-computing","hacktoberfest","machine-learning","ml","neural-network","rust","safetensors","serverless","typescript","wasm","webassembly","webgpu"],"created_at":"2024-08-04T00:01:17.816Z","updated_at":"2025-04-04T05:03:38.439Z","avatar_url":"https://github.com/denosaurs.png","language":"Rust","readme":"\u003cp align=\"center\"\u003e\n \u003cimg src=\"https://raw.githubusercontent.com/denosaurs/netsaur/main/assets/netsaur.svg\" width=\"80rem\" /\u003e\n \u003cbr/\u003e\n \u003ch1 align=\"center\"\u003eNetsaur\u003c/h1\u003e\n\u003c/p\u003e\n\u003cbr/\u003e\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://github.com/denosaurs/netsaur/stargazers\"\u003e\n    \u003cimg alt=\"netsaur stars\" 
src=\"https://img.shields.io/github/stars/denosaurs/netsaur?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABwAAAAcCAYAAAEFCu8CAAAABGdBTUEAALGPC/xhBQAAADhlWElmTU0AKgAAAAgAAYdpAAQAAAABAAAAGgAAAAAAAqACAAQAAAABAAAAHKADAAQAAAABAAAAHAAAAABHddaYAAABxElEQVRIDe2Wv04CQRDGAQuoTKQ2ITyADZWVJZWV+gJYWBNqKh/C16CRBlprWxsTE2NJfABNOH9z7Gzm2Nv7A8TCOMnHzs1838ze3e4ejUbMkiRZS64lP1x8MjTFr2DQE6Gl2nI+7POARXAmdbas44ku8eLGhU9UckRliX6qxM9sQvz0vrcVaaKJKdsSNO7LOtK1kvcbaXVRu4LMz9kgKoYwBq/KLBi/yC2DQgSnBaLMQ88Tx7Q3AVkDKHpgBdoak5HrCSjuaAW/6zOz+u/Q3ZfcVrhliuaPYCAqsSJekIO/TlWbn2BveAH5JZBVUWayusZW2ClTuPzMi6xTIp5abuBHxHLcZSyzkxHF1uNJRrV9gXBhOl7h6wFW/FqcaGILEmsDWfg9G//3858Az0lWaHhm5dP3i9JoDtTm+1UrUdMl72OZv10itfx3zOYpLAv/FPQNLvFj35Bnco/gzeCD72H6b4JYaDTpgidwaJOa3bCji5BsgYcDdJUamSMi2lQTCEbgu0Zz4Y5UX3tE3K/RTKny3qNWdst3UWU8sYtmU40py2Go9o5zC460l/guJjm1leZrjaiH4B4cVxUK12mGVTV/j/cDqcFClUX01ZEAAAAASUVORK5CYII=\" /\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://github.com/denosaurs/netsaur/releases/latest/\"\u003e\n    \u003cimg alt=\"netsaur releases\" src=\"https://img.shields.io/github/v/release/denosaurs/netsaur?logo=github\" /\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://github.com/denosaurs/netsaur/blob/master/LICENSE\"\u003e\n    \u003cimg alt=\"netsaur License\" 
src=\"https://img.shields.io/github/license/denosaurs/netsaur?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABwAAAAcCAYAAAEFCu8CAAAABGdBTUEAALGPC/xhBQAAADhlWElmTU0AKgAAAAgAAYdpAAQAAAABAAAAGgAAAAAAAqACAAQAAAABAAAAHKADAAQAAAABAAAAHAAAAABHddaYAAAC5UlEQVRIDd2WPWtVQRCGby5pVASLiGghQSxyG8Ui2KWwCfkH9olY2JneQkiR0oCIxH/gB+qVFDYBIWBAbAIRSbCRpLXwIxLiPT7vnNm9e87ZxJtUwYH3zO47Mzv7Mbv3tlo5KYriGtgAJ81OY1ENdG/YI4boFEOI911BXgY/pdtwGuAtXpvmB1tAXHDnUolE5urkPOQo6MqA3pXWmJJL4Bb4rQ7yEYfxsjnIF29NJIoNC6e5fxOL/qN+9KCz7AaLpN8zI415N2i2EptpGrkRIjGeAuvR6IY1hSFLFUOug9Ms2M7ZxIUNytm1mnME186sdI2BOCwAyQMg54ugzSmKmwbPwSbolKH+hbAtQdsOoF+BsF3anUVwBdiOWRidFZDKTTrKEAJTm3GVrGkHzw/uPZbyx7DNNLfB7KGmRsCcr+/gjaiPSpAOTyX9qG4L/XBDdWXDDf1M+wtQ5fwCOtcb4Dto6VpLmzByB6gqdHbTItGSJdAGqibJQhmRfCF7IN4beSF2G9CqnGXQrxofXU+EykllNeoczRgYytDKMubDIRK0g5MF8rE69cGu0u9nlUcqaUZ41W0qK2nGcSzr4D2wV9U9wxp1rnpxn8agXAOHMQ9cy9kbHM7ngY4gFb03TxrO/yfBUifTtXt78jCrjY/jgEFnMn45LuNWUtknuu7NSm7D3QEn3HbatV1Q2jvgIRf1sfODKQaeymxZoMLlTqsq1LF+HvaTqQOzEzUCfni0/eNIA+DfuE3KEtbsegckGmMktTXacnBHPVe687ugkpT+axCkkhBSyRSjWI2xf1KMMVmYiQdWksK9BEFiQoiYLIlvJA3/zeTzCejP0RbB6YPbhZuB+0pR3KcdX0LaJtju0ZgBL8Bd+sbz2QIaU2OfBX3BaQLsgZysQtrk0M8Sh1A0w3DyyYnGnAiZ4gqZ/TvI2A8OGd1YIbF7+F3P+B6dYpYdsJNZgrjO0UdOIhmom0nwL0pnfnzkL1803jAoKhvyAAAAAElFTkSuQmCC\" /\u003e\n  \u003c/a\u003e\n \u003c/p\u003e\n\u003chr/\u003e\n\n## Powerful Machine Learning library for Deno\n\n## Installation\n\nThere is no installation step required. 
You can simply import the library and\nyou're good to go :)\n\n## Features\n\n- Lightweight and easy-to-use neural network library for\n  [Deno](https://deno.com).\n- Blazingly fast and efficient.\n- Provides a simple API for creating and training neural networks.\n- Can run on both the CPU and the GPU (WIP).\n- Allows you to simply run the code without downloading any prior dependencies.\n- Perfect for serverless environments.\n- Allows you to quickly build and deploy machine learning models for a variety\n  of applications with just a few lines of code.\n- Suitable for both beginners and experienced machine learning practitioners.\n\n### Backends\n\n- [CPU](./src/backends/cpu/) - Native backend written in Rust.\n- [WASM](./src/backends/wasm/) - WebAssembly backend written in Rust.\n- [GPU](./src/backends/gpu/) (TODO)\n\n### Examples\n\n- XOR ([CPU](./examples/xor_cpu.ts), [WASM](./examples/xor_wasm.ts))\n- Linear Regression ([CPU](./examples/linear_cpu.ts),\n  [WASM](./examples/linear_wasm.ts))\n- Filters ([CPU](./examples/filters/conv.ts),\n  [WASM](./examples/filters/conv_wasm.ts))\n- MNIST ([CPU](./examples/mnist), [WASM](./examples/mnist))\n\n### Maintainers\n\n- Dean Srebnik ([@load1n9](https://github.com/load1n9))\n- CarrotzRule ([@carrotzrule123](https://github.com/CarrotzRule123))\n- Pranev ([@retraigo](https://github.com/retraigo))\n\n### QuickStart\n\nThis example shows how to train a neural network to predict the output of the\nXOR function using our speedy CPU backend written in\n[Rust](https://www.rust-lang.org/).\n\n```typescript\nimport {\n  Cost,\n  CPU,\n  DenseLayer,\n  Sequential,\n  setupBackend,\n  SigmoidLayer,\n  tensor1D,\n  tensor2D,\n} from \"jsr:@denosaurs/netsaur\";\n\n/**\n * Set up the CPU backend. 
This backend is fast but doesn't work on the Edge.\n */\nawait setupBackend(CPU);\n\n/**\n * Creates a sequential neural network.\n */\nconst net = new Sequential({\n  /**\n   * The minibatch size is set to 4 and the input size is set to 2.\n   */\n  size: [4, 2],\n\n  /**\n   * The silent option is set to true, which means that the network will not output any logs during training.\n   */\n  silent: true,\n\n  /**\n   * Defines the layers of a neural network in the XOR function example.\n   * The neural network has two input neurons and one output neuron.\n   * The layers are defined as follows:\n   * - A dense layer with 3 neurons.\n   * - A sigmoid activation layer.\n   * - A dense layer with 1 neuron.\n   * - A sigmoid activation layer.\n   */\n  layers: [\n    DenseLayer({ size: [3] }),\n    SigmoidLayer(),\n    DenseLayer({ size: [1] }),\n    SigmoidLayer(),\n  ],\n\n  /**\n   * The cost function used for training the network is the mean squared error (MSE).\n   */\n  cost: Cost.MSE,\n});\n\n/**\n * Train the network on the given data.\n */\nnet.train(\n  [\n    {\n      inputs: tensor2D([\n        [0, 0],\n        [1, 0],\n        [0, 1],\n        [1, 1],\n      ]),\n      outputs: tensor2D([[0], [1], [1], [0]]),\n    },\n  ],\n  /**\n   * The number of iterations is set to 10000.\n   */\n  10000,\n);\n\n/**\n * Predict the output of the XOR function for the given inputs.\n */\nconst out1 = (await net.predict(tensor1D([0, 0]))).data;\nconsole.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);\n\nconst out2 = (await net.predict(tensor1D([1, 0]))).data;\nconsole.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);\n\nconst out3 = (await net.predict(tensor1D([0, 1]))).data;\nconsole.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);\n\nconst out4 = (await net.predict(tensor1D([1, 1]))).data;\nconsole.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);\n```\n\n#### Use the WASM Backend\n\nBy changing the CPU backend to the WASM backend, we sacrifice some speed 
but this\nallows us to run on the Edge.\n\n```typescript\nimport {\n  Cost,\n  DenseLayer,\n  Sequential,\n  setupBackend,\n  SigmoidLayer,\n  tensor1D,\n  tensor2D,\n  WASM,\n} from \"jsr:@denosaurs/netsaur\";\n\n/**\n * Set up the WASM backend. This backend is slower than the CPU backend but works on the Edge.\n */\nawait setupBackend(WASM);\n\n/**\n * Creates a sequential neural network.\n */\nconst net = new Sequential({\n  /**\n   * The minibatch size is set to 4 and the input size is set to 2.\n   */\n  size: [4, 2],\n\n  /**\n   * The silent option is set to true, which means that the network will not output any logs during training.\n   */\n  silent: true,\n\n  /**\n   * Defines the layers of a neural network in the XOR function example.\n   * The neural network has two input neurons and one output neuron.\n   * The layers are defined as follows:\n   * - A dense layer with 3 neurons.\n   * - A sigmoid activation layer.\n   * - A dense layer with 1 neuron.\n   * - A sigmoid activation layer.\n   */\n  layers: [\n    DenseLayer({ size: [3] }),\n    SigmoidLayer(),\n    DenseLayer({ size: [1] }),\n    SigmoidLayer(),\n  ],\n\n  /**\n   * The cost function used for training the network is the mean squared error (MSE).\n   */\n  cost: Cost.MSE,\n});\n\n/**\n * Train the network on the given data.\n */\nnet.train(\n  [\n    {\n      inputs: tensor2D([\n        [0, 0],\n        [1, 0],\n        [0, 1],\n        [1, 1],\n      ]),\n      outputs: tensor2D([[0], [1], [1], [0]]),\n    },\n  ],\n  /**\n   * The number of iterations is set to 10000.\n   */\n  10000,\n);\n\n/**\n * Predict the output of the XOR function for the given inputs.\n */\nconst out1 = (await net.predict(tensor1D([0, 0]))).data;\nconsole.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);\n\nconst out2 = (await net.predict(tensor1D([1, 0]))).data;\nconsole.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);\n\nconst out3 = (await net.predict(tensor1D([0, 1]))).data;\nconsole.log(`0 xor 1 = 
${out3[0]} (should be close to 1)`);\n\nconst out4 = (await net.predict(tensor1D([1, 1]))).data;\nconsole.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);\n```\n\n### Documentation\n\nThe full documentation for Netsaur can be found\n[here](https://deno.land/x/netsaur/mod.ts).\n\n### License\n\nNetsaur is licensed under the MIT License.\n","funding_links":[],"categories":["Rust","Modules"],"sub_categories":["Machine learning"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdenosaurs%2Fnetsaur","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdenosaurs%2Fnetsaur","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdenosaurs%2Fnetsaur/lists"}