Home
Georgi Gerganov edited this page on Jul 31, 2025 · 10 revisions
Welcome to the llama.cpp wiki!
Arch Linux (AUR):

```sh
yay -S llama.cpp
yay -S llama.cpp-cuda
yay -S llama.cpp-hip
yay -S llama.cpp-vulkan
```
Nix:

```sh
nix run github:ggerganov/llama.cpp
nix run 'github:ggerganov/llama.cpp#opencl'
```

Or as a NixOS system package:

```nix
{ config, pkgs, ... }:
{
  nixpkgs.config.packageOverrides = pkgs: {
    llama-cpp = (builtins.getFlake "github:ggerganov/llama.cpp").packages.${builtins.currentSystem}.default;
  };
  environment.systemPackages = with pkgs; [ llama-cpp ];
}
```
Termux: wait for https://github.com/termux/termux-packages/pull/17457, then install with either package manager:

```sh
apt install llama-cpp
pacman -S llama-cpp
```
To build and install a Debian package:

```sh
git clone --depth=1 https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -D...
cmake --build build
cd build
cpack -G DEB
dpkg -i *.deb
```

To build and install an RPM package:

```sh
git clone --depth=1 https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -D...
cmake --build build
cd build
cpack -G RPM
rpm -i *.rpm
```
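The `-D...` in the package recipes above stands for backend options passed at configure time. A minimal sketch of filling it in, assuming `GGML_CUDA` as the flag name for an NVIDIA build (verify the exact option names against the repository's build documentation for your hardware):

```shell
# Assumed example: select the CUDA backend at configure time.
# GGML_CUDA=ON is the flag name assumed here; Vulkan, HIP, Metal, etc.
# each have their own -D options.
CMAKE_FLAGS="-DGGML_CUDA=ON"

# Assemble the two-step configure + build invocation.
CONFIGURE="cmake -B build ${CMAKE_FLAGS}"
BUILD="cmake --build build --config Release -j"

echo "${CONFIGURE}"
echo "${BUILD}"
```

Running these two commands in the repository root produces a `build/` tree that the `cpack` steps above can package.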
Useful information for users that doesn't fit into the README:
- Home
- Feature Matrix
- GGML Tips & Tricks
- Chat Templating
- Metadata Override
- HuggingFace Model Card Metadata Interoperability Consideration
Information useful for maintainers and developers that does not fit into code comments.
Click on a badge to jump to its workflow. This general view of all the actions is here so that we can notice more quickly if, and where, main-branch automation is broken.