Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/developer239/llama-chat-cmake-example
- Host: GitHub
- URL: https://github.com/developer239/llama-chat-cmake-example
- Owner: developer239
- Created: 2024-07-27T14:54:43.000Z (4 months ago)
- Default Branch: master
- Last Pushed: 2024-09-26T22:22:05.000Z (about 2 months ago)
- Last Synced: 2024-11-09T15:16:55.485Z (11 days ago)
- Language: C++
- Size: 8.79 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# LlamaChat CMake Example
This project demonstrates how to use the [LlamaChat](https://github.com/developer239/llama-chat) library in a C++ application. It shows how to initialize the model, set up the context, and run multiple queries using the LlamaChat interface.
## Project Structure
```
llamachat-example/
│
├── CMakeLists.txt
├── src/
│   └── main.cpp
├── models/
│   └── Meta-Llama-3.1-8B-Instruct-Q3_K_S.gguf
└── externals/
    └── llama-chat/
```

## Setup
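The layout above implies a top-level `CMakeLists.txt` that builds the vendored submodule and links it into the example. A minimal sketch is shown below; the library target name `llama-chat` is an assumption, so check the submodule's own `CMakeLists.txt` for the actual target it exports.

```cmake
cmake_minimum_required(VERSION 3.14)
project(llamachat_example CXX)

set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

# Build the vendored LlamaChat library from the submodule path shown above.
add_subdirectory(externals/llama-chat)

add_executable(llamachat_example src/main.cpp)

# "llama-chat" is an assumed target name, not confirmed from the library.
target_link_libraries(llamachat_example PRIVATE llama-chat)
```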
1. Clone this repository:
```
git clone https://github.com/your-username/llamachat-example.git
cd llamachat-example
```
2. Initialize and update the LlamaChat submodule:
```
git submodule update --init --recursive
```
3. Place your language model file (e.g., Meta-Llama-3.1-8B-Instruct-Q3_K_S.gguf) in the `models/` directory.
4. Create a build directory and run CMake:
```
mkdir build
cd build
cmake ..
```
5. Build the project:
```
cmake --build .
```

## Usage
After building the project, you can run the example application:
```
./llamachat_example
```
The application will initialize the LlamaChat model, set a system prompt, and then ask a series of predefined questions. The responses will be printed to the console.