Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mattzcarey/llama.js
Run LLMs (llama, mamba, nemo, mistral) at native speeds from JavaScript/TypeScript.
- Host: GitHub
- URL: https://github.com/mattzcarey/llama.js
- Owner: mattzcarey
- Created: 2024-08-29T14:02:04.000Z (28 days ago)
- Default Branch: main
- Last Pushed: 2024-08-29T14:03:29.000Z (28 days ago)
- Last Synced: 2024-09-22T06:02:13.524Z (5 days ago)
- Topics: ai, bun, javascript, js, llama, llamacpp, llms, machinelearning, pytorch, tensorflowjs, ts, typescript, zig
- Language: TypeScript
- Homepage:
- Size: 3.91 KB
- Stars: 5
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# llama.js
> An experiment to run llama.cpp through a JavaScript runtime at near-native speeds
## Installation
- Clone the repo recursively
- Install zig and add it to your PATH
- `cd` into `llama.cpp.zig`
- Download a model
```bash
huggingface-cli download NousResearch/Hermes-2-Pro-Mistral-7B-GGUF Hermes-2-Pro-Mistral-7B.Q4_0.gguf --local-dir models
```
- Run the model with llama.cpp.zig
```bash
zig build run-simple -Doptimize=ReleaseFast -- --model_path "./models/Hermes-2-Pro-Mistral-7B.Q4_0.gguf" --prompt "Hello! I am AI, and here are the 10 things I like to think about:"
```
- Build the library (zig bindings)
```bash
zig build
```

## Usage
- `cd` back to the repo root and run `index.ts`:
```bash
bun run index.ts
```
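The repository does not document what `index.ts` contains. As a hypothetical illustration only (not the project's actual bindings), one simple way to drive the zig-built runner from TypeScript is to launch it as a child process. `buildLlamaArgs` is an invented helper; the binary, flag names (`--model_path`, `--prompt`), and model path all mirror the `zig build run-simple` invocation shown above.

```typescript
import { spawnSync } from "node:child_process";

// Assemble the CLI arguments for the llama.cpp.zig "simple" runner,
// mirroring the `zig build run-simple` command shown above.
function buildLlamaArgs(modelPath: string, prompt: string): string[] {
  return [
    "build", "run-simple", "-Doptimize=ReleaseFast", "--",
    "--model_path", modelPath,
    "--prompt", prompt,
  ];
}

const args = buildLlamaArgs(
  "./models/Hermes-2-Pro-Mistral-7B.Q4_0.gguf",
  "Hello! I am AI, and here are the 10 things I like to think about:",
);

// Run zig as a child process (requires zig on PATH and the model on disk).
// If zig is missing, spawnSync reports an error rather than throwing.
const result = spawnSync("zig", args, { cwd: "llama.cpp.zig", encoding: "utf8" });
if (result.status === 0) {
  console.log(result.stdout);
} else if (result.error) {
  console.error("failed to launch zig:", result.error.message);
}
```

A subprocess keeps the JS side trivial at the cost of per-call startup; the project's zig bindings presumably avoid that by linking llama.cpp into the runtime directly.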