https://github.com/iamrajhans/llm-inference-js
gemma2 local-llm web
- Host: GitHub
- URL: https://github.com/iamrajhans/llm-inference-js
- Owner: iamrajhans
- License: MIT
- Created: 2024-11-11T10:50:00.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2024-11-26T11:21:49.000Z (5 months ago)
- Last Synced: 2025-02-03T07:45:21.891Z (3 months ago)
- Topics: gemma2, local-llm, web
- Language: JavaScript
- Homepage:
- Size: 7.81 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# llm-inference-js
### Prerequisite
A browser with WebGPU support (e.g. a recent version of Chrome or Edge)
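Before loading the model, the page can verify that WebGPU is actually available. A minimal sketch — the `navigator.gpu` feature check is the standard WebGPU detection path, but the function name here is illustrative, not from this repo:

```javascript
// Check whether the current environment can run WebGPU inference.
async function hasWebGPU() {
  // navigator.gpu is only defined in WebGPU-capable browsers.
  if (typeof navigator === "undefined" || !navigator.gpu) return false;
  // requestAdapter() resolves to null when no suitable GPU is available.
  const adapter = await navigator.gpu.requestAdapter();
  return adapter !== null;
}

hasWebGPU().then((ok) =>
  console.log(ok ? "WebGPU available" : "WebGPU not available")
);
```

If this logs "WebGPU not available", the model will not load; switch to a WebGPU-enabled browser before continuing.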
### Requirement
Download the Gemma 2 2B model from [here](https://www.kaggle.com/models/google/gemma-2/tfLite/gemma2-2b-it-gpu-int8)
### Getting Started
Run a local development server:
```bash
python -m http.server 8000
```

Open [http://localhost:8000](http://localhost:8000) in your browser to see the result.