https://github.com/niansa/libjustlm
Super easy to use library for doing LLaMA/GPT-J stuff! - Mirror of: https://gitlab.com/niansa/libjustlm
ai cpp17 cpp20 gpt-j llama llama2 llm llm-inference mpt python wrapper-library
- Host: GitHub
- URL: https://github.com/niansa/libjustlm
- Owner: niansa
- License: MIT
- Created: 2023-07-13T14:43:35.000Z (almost 2 years ago)
- Default Branch: master
- Last Pushed: 2024-03-25T00:24:24.000Z (about 1 year ago)
- Last Synced: 2025-02-24T09:38:16.550Z (3 months ago)
- Topics: ai, cpp17, cpp20, gpt-j, llama, llama2, llm, llm-inference, mpt, python, wrapper-library
- Language: C++
- Homepage:
- Size: 398 KB
- Stars: 2
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# JustLM
Super easy to use library for doing LLaMA/GPT-J/MPT stuff!

## Overview
This library implements an easy-to-use interface to LLaMA, GPT-J and MPT, with optional Python bindings. Context scrolling is automatic and supports a top window bar (a section at the start of the context that stays in place while the rest scrolls).
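As a rough illustration of the scrolling idea (not libjustlm's actual API, which is defined in the headers under `include/`), the sketch below drops the oldest tokens once the context window is full while pinning a fixed number of tokens at the top. All names and parameters here are hypothetical.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Conceptual sketch of context scrolling with a pinned "top window bar".
// These names are illustrative only; they do not come from libjustlm's headers.
std::vector<int> scroll_context(const std::vector<int>& tokens,
                                std::size_t n_ctx,     // context window size
                                std::size_t n_keep) {  // tokens pinned at the top
    if (tokens.size() <= n_ctx) return tokens;
    std::vector<int> out(tokens.begin(), tokens.begin() + n_keep);  // keep the bar
    // Fill the rest of the window with the most recent tokens.
    out.insert(out.end(), tokens.end() - (n_ctx - n_keep), tokens.end());
    return out;
}

int main() {
    std::vector<int> tokens;
    for (int i = 0; i < 20; ++i) tokens.push_back(i);
    for (int t : scroll_context(tokens, /*n_ctx=*/8, /*n_keep=*/3))
        std::cout << t << ' ';  // prints: 0 1 2 15 16 17 18 19
    std::cout << '\n';
}
```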
Additionally, "pooling" is implemented to support keeping `x` inference instances in RAM and automatically moving the least recently used ones to disk, ready for later retrieval.
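The pooling behavior can be sketched roughly as follows. The `Instance` type and its serialize/deserialize hooks are hypothetical stand-ins, not libjustlm's real types; the sketch only illustrates the keep-`x`-in-RAM, evict-least-recently-used-to-disk idea.

```cpp
#include <cstddef>
#include <fstream>
#include <iostream>
#include <list>
#include <memory>
#include <string>
#include <unordered_map>

// Stand-in for an inference instance; in the real library this would hold the
// model context. serialize()/deserialize() are hypothetical hooks.
struct Instance {
    std::string state;
    void serialize(const std::string& path) const { std::ofstream(path) << state; }
    void deserialize(const std::string& path) {
        std::ifstream in(path);
        std::getline(in, state);  // leaves state empty if the file is missing
    }
};

// LRU pool: at most `capacity` instances stay in RAM; the least recently used
// one is written to disk when room is needed.
class InstancePool {
    std::size_t capacity;
    std::list<std::string> lru;  // front = most recently used
    std::unordered_map<std::string, std::unique_ptr<Instance>> in_ram;

public:
    explicit InstancePool(std::size_t cap) : capacity(cap) {}

    Instance& get(const std::string& id) {
        lru.remove(id);
        lru.push_front(id);
        auto it = in_ram.find(id);
        if (it == in_ram.end()) {  // not in RAM: restore from disk or create fresh
            auto inst = std::make_unique<Instance>();
            inst->deserialize(id + ".pool");
            it = in_ram.emplace(id, std::move(inst)).first;
            evict_if_needed();
        }
        return *it->second;
    }

private:
    void evict_if_needed() {
        while (in_ram.size() > capacity) {
            const std::string victim = lru.back();  // least recently used
            lru.pop_back();
            in_ram[victim]->serialize(victim + ".pool");  // move state to disk
            in_ram.erase(victim);
        }
    }
};

int main() {
    InstancePool pool(2);
    pool.get("a").state = "hello from a";
    pool.get("b").state = "hello from b";
    pool.get("c").state = "hello from c";     // evicts "a" to a.pool
    std::cout << pool.get("a").state << '\n'; // restored from disk: "hello from a"
}
```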
## Documentation
Literally, just read the 2 header files in `include/`! The interface couldn't be simpler.

## Credits
Thanks to *Georgi Gerganov (ggerganov)* for writing the `ggml` and `llama.cpp` libraries, both of which are extremely important parts of this project!
Also thanks to *Nomic AI* for their substantial help in driving this project forward.