
llamacpp

Here are 702 public repositories matching this topic...

Jan is an open source alternative to ChatGPT that runs 100% offline on your computer.

  • Updated Feb 19, 2026
  • TypeScript
khoj

Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (gpt, claude, gemini, llama, qwen, mistral). Get started - free.

  • Updated Jan 6, 2026
  • Python

Unified framework for building enterprise RAG pipelines with small, specialized models

  • Updated Feb 19, 2026
  • Python

A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device. New: Code Llama support!

  • Updated Apr 23, 2024
  • TypeScript

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.

  • Updated Feb 20, 2026
  • C++
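The entry above runs GGUF model files; GGUF is the single-file model format used by llama.cpp and frontends built on it. As a sketch of what "run a GGUF model" starts from, here is a minimal Python check of a file's GGUF header, assuming the publicly documented layout (4-byte magic `GGUF`, then little-endian version, tensor count, and metadata key/value count); the helper name and the synthetic bytes are illustrative, not taken from any listed project:

```python
import struct

GGUF_MAGIC = b"GGUF"  # first four bytes of every GGUF file


def parse_gguf_header(data: bytes) -> dict:
    """Parse the fixed-size prefix of a GGUF file.

    Assumed layout (little-endian): 4-byte magic, uint32 version,
    uint64 tensor count, uint64 metadata key/value count.
    """
    if len(data) < 24 or data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    version, n_tensors, n_kv = struct.unpack_from("<IQQ", data, 4)
    return {"version": version, "tensors": n_tensors, "metadata_kv": n_kv}


# Synthetic header for illustration (not a real model file):
header = GGUF_MAGIC + struct.pack("<IQQ", 3, 291, 24)
print(parse_gguf_header(header))
# → {'version': 3, 'tensors': 291, 'metadata_kv': 24}
```

A real loader would go on to read the metadata key/value pairs (architecture, tokenizer, quantization) that follow this prefix.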

Swap GPT for any LLM by changing a single line of code. Xinference lets you run open-source, speech, and multimodal models on cloud, on-prem, or your laptop — all through one unified, production-ready inference API.

  • Updated Feb 20, 2026
  • Python
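The "changing a single line of code" claim above rests on the OpenAI-compatible API pattern shared by several projects in this list: a local server exposes the same `/v1/chat/completions` route, so only the base URL (and model name) differ from the hosted service. A hedged Python sketch of that pattern, where the local URL and model names are placeholder assumptions rather than endpoints of any specific project:

```python
import json


def chat_request(base_url: str, model: str, user_msg: str) -> tuple[str, str]:
    """Build an OpenAI-style chat-completions request.

    Swapping a hosted GPT for a locally served model changes only
    base_url (and the model name); the payload shape stays the same.
    """
    url = base_url.rstrip("/") + "/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_msg}],
    })
    return url, body


# Hosted service vs. a local OpenAI-compatible server (example values):
cloud = chat_request("https://api.openai.com", "gpt-4o-mini", "hi")
local = chat_request("http://localhost:9997", "llama-3-8b", "hi")
```

In practice the only edit in client code is the base-URL line passed to whatever HTTP or SDK client sends this request.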
runanywhere-sdks

Private & local AI personal knowledge management app for high entropy people.

  • Updated May 13, 2025
  • JavaScript
serge

A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.

  • Updated Nov 21, 2025
  • Svelte

Low-latency AI inference engine for mobile devices & wearables

  • Updated Feb 20, 2026
  • C

Your agent in your terminal, equipped with local tools: writes code, uses the terminal, browses the web. Make your own persistent autonomous agent on top!

  • Updated Feb 20, 2026
  • Python
spark-nlp

⚡ Python-free Rust inference server — OpenAI-API compatible. GGUF + SafeTensors, hot model swap, auto-discovery, single binary. FREE now, FREE forever.

  • Updated Jan 16, 2026
  • Rust

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.

  • Updated Aug 7, 2025
  • TypeScript

A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.

  • Updated Feb 17, 2026
  • C#

AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.

  • Updated Feb 20, 2026
  • Python
lsp-ai

LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.

  • Updated Jan 7, 2025
  • Rust

Local AI API Platform

  • Updated Jul 4, 2025
  • C++

RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.

  • Updated Feb 20, 2026
  • Python

Reliable model swapping for any local OpenAI/Anthropic compatible server - llama.cpp, vllm, etc

  • Updated Feb 20, 2026
  • Go

Improve this page

Add a description, image, and links to the llamacpp topic page so that developers can more easily learn about it.

Curate this topic

Add this topic to your repo

To associate your repository with the llamacpp topic, visit your repo's landing page and select "manage topics."

Learn more

