Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
candle_demo_openchat_35
https://github.com/rustai-solutions/candle_demo_openchat_35
- Host: GitHub
- URL: https://github.com/rustai-solutions/candle_demo_openchat_35
- Owner: rustai-solutions
- Created: 2023-11-29T08:41:52.000Z (about 1 year ago)
- Default Branch: master
- Last Pushed: 2023-11-30T10:02:25.000Z (about 1 year ago)
- Last Synced: 2024-08-03T01:17:12.497Z (4 months ago)
- Language: Rust
- Size: 16.6 KB
- Stars: 11
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-llm-and-aigc - rustai-solutions/candle_demo_openchat_35 : candle_demo_openchat_35. (Summary)
- awesome-cuda-and-hpc - rustai-solutions/candle_demo_openchat_35 : candle_demo_openchat_35. (Frameworks)
- awesome-rust-list - rustai-solutions/candle_demo_openchat_35 : candle_demo_openchat_35. (Machine Learning)
README
## Rust Candle Demo
An interactive command-line tool demonstrating how to use Hugging Face's Rust-based [Candle ML framework](https://github.com/huggingface/candle) to run an LLM.
By default, this demo uses a quantized version of the OpenChat LLM: https://huggingface.co/TheBloke/openchat_3.5-GGUF.
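At the heart of such a demo is a token-generation loop in which the model's raw logits are turned into a probability distribution before the next token is sampled; the `--temperature` option below controls how sharp that distribution is. As a minimal std-only sketch (hypothetical helper name, not this demo's actual code):

```rust
// Illustrative sketch of the sampling step in a candle-style generation loop.
// Logits are divided by `temperature` before softmax: values below 1.0 sharpen
// the distribution (more deterministic), values above 1.0 flatten it.

/// Convert raw logits into probabilities, scaled by `temperature`.
fn softmax_with_temperature(logits: &[f32], temperature: f32) -> Vec<f32> {
    let scaled: Vec<f32> = logits.iter().map(|&l| l / temperature).collect();
    // Subtract the max before exponentiating, for numerical stability.
    let max = scaled.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = scaled.iter().map(|&l| (l - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    let logits = [2.0_f32, 1.0, 0.1];
    // At this demo's default temperature of 0.8 the top token dominates more
    // than at temperature 2.0, where the distribution is flatter.
    let sharp = softmax_with_temperature(&logits, 0.8);
    let flat = softmax_with_temperature(&logits, 2.0);
    println!("t=0.8 -> {:?}", sharp);
    println!("t=2.0 -> {:?}", flat);
}
```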
### Prepare
Make sure you have the Hugging Face CLI installed; if not, install it:
```
pip install -U "huggingface_hub[cli]"
```
Then download the model file together with the original OpenChat `tokenizer.json` file:
```
mkdir hf_hub
HF_HUB_ENABLE_HF_TRANSFER=1 HF_ENDPOINT=https://hf-mirror.com huggingface-cli download TheBloke/openchat_3.5-GGUF openchat_3.5.Q8_0.gguf --local-dir hf_hub
HF_HUB_ENABLE_HF_TRANSFER=1 HF_ENDPOINT=https://hf-mirror.com huggingface-cli download openchat/openchat_3.5 tokenizer.json --local-dir hf_hub
```
### Run
There are two examples here:
- **simple**: all parameters are hardcoded to keep the code as simple as possible; edit the model and `tokenizer.json` paths yourself, then run:
```
cargo run --release --bin simple
```
- **cli**: pass parameters on the command line:
```
cargo run --release --bin cli -- --model=xxxxxxx --tokenizer=xxxx
```
Use `--help` to see which parameters can be configured:
```
$ cargo run --release --bin cli -- --help
Finished release [optimized] target(s) in 0.04s
Running `target/release/cli --help`
avx: false, neon: false, simd128: false, f16c: false
Usage: cli [OPTIONS]

Options:
--tokenizer [default: ../hf_hub/openchat_3.5_tokenizer.json]
--model [default: ../hf_hub/openchat_3.5.Q8_0.gguf]
-n, --sample-len [default: 1000]
--temperature [default: 0.8]
--seed [default: 299792458]
--repeat-penalty [default: 1.1]
--repeat-last-n [default: 64]
--gqa [default: 8]
-h, --help Print help
-V, --version Print version
```
### License
None.
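The `--repeat-penalty` and `--repeat-last-n` options shown by `--help` above commonly follow the llama.cpp-style convention: logits of token ids seen in the last `repeat-last-n` generated positions are pushed toward zero, discouraging repetition. A minimal std-only sketch (hypothetical helper names, not this demo's actual code):

```rust
// Illustrative sketch of how a repeat penalty is typically applied to logits.
// Dividing positive logits by the penalty and multiplying negative ones both
// reduce the token's probability after softmax.

/// Penalize logits of recently generated token ids, llama.cpp-style.
fn apply_repeat_penalty(logits: &mut [f32], penalty: f32, recent_tokens: &[u32]) {
    for &tok in recent_tokens {
        if let Some(logit) = logits.get_mut(tok as usize) {
            if *logit >= 0.0 {
                *logit /= penalty;
            } else {
                *logit *= penalty;
            }
        }
    }
}

fn main() {
    let mut logits = vec![3.0_f32, -1.0, 0.5, 2.0];
    let context: Vec<u32> = vec![0, 3, 1]; // token ids generated so far
    let repeat_last_n = 2; // this demo defaults to 64
    // Only the last `repeat_last_n` tokens of the context are penalized.
    let recent = &context[context.len().saturating_sub(repeat_last_n)..];
    apply_repeat_penalty(&mut logits, 1.1, recent); // default penalty is 1.1
    println!("penalized logits: {:?}", logits);
}
```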
### Feedback
Feel free to submit issues to this repository.