Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/donderom/llm4s
Scala 3 bindings for llama.cpp
- Host: GitHub
- URL: https://github.com/donderom/llm4s
- Owner: donderom
- License: apache-2.0
- Created: 2023-06-26T18:41:24.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-05-30T17:27:08.000Z (7 months ago)
- Last Synced: 2024-10-29T09:00:30.913Z (2 months ago)
- Topics: llama, llm, nlp, scala
- Language: Scala
- Homepage:
- Size: 114 KB
- Stars: 48
- Watchers: 4
- Forks: 5
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
Awesome Lists containing this project
README
## llm4s
![Sonatype Nexus (Releases)](https://img.shields.io/nexus/r/com.donderom/llm4s_3?server=https%3A%2F%2Fs01.oss.sonatype.org&style=flat&color=dbf1ff)
*Experimental* Scala 3 bindings for [llama.cpp](https://github.com/ggerganov/llama.cpp) using [Slinc](https://github.com/scala-interop/slinc).
### Setup
Add `llm4s` to your `build.sbt`:
```scala
libraryDependencies += "com.donderom" %% "llm4s" % "0.11.0"
```

For JDK 17, add a `.jvmopts` file in the project root:
```
--add-modules=jdk.incubator.foreign
--enable-native-access=ALL-UNNAMED
```
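If you launch through sbt with forking enabled, the same flags can alternatively be passed via `javaOptions` (a minimal sketch, not part of the upstream instructions; the keys are standard sbt settings):

```scala
// build.sbt sketch: forward the incubator/native-access flags to forked JVMs
Compile / run / fork := true
Compile / run / javaOptions ++= Seq(
  "--add-modules=jdk.incubator.foreign",
  "--enable-native-access=ALL-UNNAMED"
)
```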
Version compatibility:

| llm4s | Scala | JDK    | llama.cpp (commit hash) |
|------:|------:|-------:|------------------------:|
| 0.11+ | 3.3.0 | 17, 19 | 229ffff (May 8, 2024)   |

Older versions:

| llm4s | Scala     | JDK    | llama.cpp (commit hash) |
|------:|----------:|-------:|------------------------:|
| 0.10+ | 3.3.0     | 17, 19 | 49e7cb5 (Jul 31, 2023)  |
| 0.6+  | ---       | ---    | 49e7cb5 (Jul 31, 2023)  |
| 0.4+  | ---       | ---    | 70d26ac (Jul 23, 2023)  |
| 0.3+  | ---       | ---    | a6803ca (Jul 14, 2023)  |
| 0.1+  | 3.3.0-RC3 | 17, 19 | 447ccbe (Jun 25, 2023)  |

### Usage
```scala
import java.nio.file.Paths
import com.donderom.llm4s.*

// Path to the llama.cpp shared library
System.load("llama.cpp/libllama.so")

// Path to the model supported by llama.cpp
val model = Paths.get("models/llama-7b-v2/llama-2-7b.Q4_K_M.gguf")
val prompt = "Large Language Model is"
```
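Note that `System.load` expects an absolute file name, so if the shared library sits at a relative location you can resolve it first (a small sketch, not from the original README):

```scala
// Sketch: System.load requires an absolute path, so resolve the relative one first
System.load(Paths.get("llama.cpp/libllama.so").toAbsolutePath.toString)
```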
#### Completion

```scala
val llm = Llm(model)

// To print generation as it goes
llm(prompt).foreach: stream =>
  stream.foreach: token =>
    print(token)

// Or build a string
llm(prompt).foreach(stream => println(stream.mkString))

llm.close()
```
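Because the model wraps native resources, it can help to guard generation with `try`/`finally` so `close()` always runs; a minimal sketch using only the calls shown above:

```scala
// Sketch: release the native handle even if generation fails
val llm = Llm(model)
try
  llm(prompt).foreach(stream => println(stream.mkString))
finally
  llm.close()
```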
#### Embeddings

```scala
val llm = Llm(model)

llm.embeddings(prompt).foreach: embeddings =>
  embeddings.foreach: embd =>
    print(embd)
    print(' ')

llm.close()
```
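As a small follow-up sketch (not from the README), embeddings of two prompts can be compared with cosine similarity. This assumes an open `llm` handle as above and that `embeddings` yields a numeric collection per prompt, as the `foreach` example suggests; the element type is a guess, so values are converted via `toDouble`:

```scala
// Sketch: cosine similarity between the embeddings of two prompts
def cosine(a: Seq[Double], b: Seq[Double]): Double =
  val dot = a.lazyZip(b).map(_ * _).sum
  val normA = math.sqrt(a.map(x => x * x).sum)
  val normB = math.sqrt(b.map(x => x * x).sum)
  dot / (normA * normB)

llm.embeddings("Llamas are camelids").foreach: a =>
  llm.embeddings("The llama is a domesticated camelid").foreach: b =>
    println(cosine(a.toSeq.map(_.toDouble), b.toSeq.map(_.toDouble)))
```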