# llm4s

Scala 3 bindings for llama.cpp 🦙
Experimental Scala 3 bindings for llama.cpp using Slinc.
Add `llm4s` to your `build.sbt`:

```scala
libraryDependencies += "com.donderom" %% "llm4s" % "0.12.0-b4599"
```
For JDK 17, add a `.jvmopts` file in the project root:

```
--add-modules=jdk.incubator.foreign
--enable-native-access=ALL-UNNAMED
```
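As a convenience, the two options above can be written to `.jvmopts` from the shell (a minimal sketch; adjust the path if your project root differs):

```shell
# Create .jvmopts in the project root, one JVM option per line
cat > .jvmopts <<'EOF'
--add-modules=jdk.incubator.foreign
--enable-native-access=ALL-UNNAMED
EOF
```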
- Scala: 3.3.0
- JDK: 17 or 19
- llama.cpp: the version suffix refers to the latest supported llama.cpp release (e.g. version `0.12.0-b4599` means that it supports the b4599 release)
## Older versions

| llm4s | Scala | JDK | llama.cpp (commit hash) |
|---|---|---|---|
| 0.11+ | 3.3.0 | 17, 19 | 229ffff (May 8, 2024) |
| 0.10+ | 3.3.0 | 17, 19 | 49e7cb5 (Jul 31, 2023) |
| 0.6+ | 3.3.0-RC3 | --- | 49e7cb5 (Jul 31, 2023) |
| 0.4+ | 3.3.0-RC3 | --- | 70d26ac (Jul 23, 2023) |
| 0.3+ | 3.3.0-RC3 | --- | a6803ca (Jul 14, 2023) |
| 0.1+ | 3.3.0-RC3 | 17, 19 | 447ccbe (Jun 25, 2023) |
```scala
import java.nio.file.Paths

import com.donderom.llm4s.*

// Path to the llama.cpp shared library
System.load("./build/bin/libllama.dylib")
// Path to the model supported by llama.cpp
val model = Paths.get("Llama-3.2-3B-Instruct-Q6_K.gguf")
val prompt = "What is LLM?"
```

Completion:

```scala
val llm = Llm(model)

// To print generation as it goes
llm(prompt).foreach: stream =>
  stream.foreach: token =>
    print(token)

// Or build a string
llm(prompt).foreach(stream => println(stream.mkString))

llm.close()
```

Embeddings:

```scala
val llm = Llm(model)

llm.embeddings(prompt).foreach: embeddings =>
  embeddings.foreach: embd =>
    print(embd)
    print(' ')

llm.close()
```
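llm4s itself does not ship a similarity helper; as a sketch, the embedding values could be collected into an `Array[Float]` and compared with plain cosine similarity (the `cosine` helper below is hypothetical, not part of the library):

```scala
// Hypothetical helper (not part of llm4s): cosine similarity between two
// embedding vectors, e.g. to compare the embeddings of two prompts.
def cosine(a: Array[Float], b: Array[Float]): Double =
  require(a.length == b.length, "vectors must have the same dimension")
  val dot   = a.lazyZip(b).map((x, y) => x.toDouble * y.toDouble).sum
  val normA = math.sqrt(a.map(x => x.toDouble * x.toDouble).sum)
  val normB = math.sqrt(b.map(x => x.toDouble * x.toDouble).sum)
  dot / (normA * normB)
```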
Self-contained Scala CLI example, `Run.scala`:

```scala
//> using scala 3.3.0
//> using jvm adoptium:17
//> using java-opt --add-modules=jdk.incubator.foreign
//> using java-opt --enable-native-access=ALL-UNNAMED
//> using dep com.donderom::llm4s:0.12.0-b4599

import com.donderom.llm4s.Llm
import java.nio.file.Paths
import scala.util.Using

object Main extends App:
  System.load("./build/bin/libllama.dylib")

  val model = Paths.get("Llama-3.2-3B-Instruct-Q6_K.gguf")
  val prompt = "What is LLM?"

  Using(Llm(model)): llm =>          // llm : com.donderom.llm4s.Llm
    llm(prompt).foreach: stream =>   // stream : LazyList[String]
      stream.foreach: token =>       // token : String
        print(token)
```
```shell
scala-cli Run.scala
```