
llamacpp

Here are 632 public repositories matching this topic...

Jan is an open source alternative to ChatGPT that runs 100% offline on your computer.

  • Updated Dec 17, 2025
  • TypeScript
khoj

Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (gpt, claude, gemini, llama, qwen, mistral). Get started - free.

  • Updated Dec 8, 2025
  • Python

Unified framework for building enterprise RAG pipelines with small, specialized models

  • Updated Jul 24, 2025
  • Python

A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device. New: Code Llama support!

  • Updated Apr 23, 2024
  • TypeScript

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.

  • Updated Dec 17, 2025
  • C++
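
Single-file GGUF runners in this vein typically expose a KoboldAI-compatible HTTP API alongside the UI. Below is a rough sketch of calling such an endpoint from Python; the port (5001), the /api/v1/generate route, the prompt, and the sampling values are all assumptions for illustration, not details taken from this listing.

```python
import requests

# Assumption: a local instance listening on port 5001 and exposing the
# KoboldAI-compatible text-generation route. Adjust URL to your setup.
URL = "http://localhost:5001/api/v1/generate"

payload = {
    "prompt": "Explain in one paragraph what a GGUF model file is.\n",
    "max_length": 120,    # number of tokens to generate
    "temperature": 0.7,
}

r = requests.post(URL, json=payload, timeout=300)
# KoboldAI-style responses return generated text under results[0].text.
print(r.json()["results"][0]["text"])
```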

Swap GPT for any LLM by changing a single line of code. Xinference lets you run open-source, speech, and multimodal models on cloud, on-prem, or your laptop — all through one unified, production-ready inference API.

  • Updated Dec 17, 2025
  • Python
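
To illustrate the "single line of code" claim above: Xinference serves an OpenAI-compatible endpoint, so an existing OpenAI client can be repointed at it by changing only the base URL. A minimal sketch follows, assuming a local server on Xinference's usual default port (9997) and a placeholder model name; both are assumptions, not details from this listing.

```python
from openai import OpenAI

# Point the standard OpenAI client at a local Xinference server instead of api.openai.com.
# Port 9997 is assumed here; use whatever address your Xinference instance reports.
client = OpenAI(base_url="http://localhost:9997/v1", api_key="not-needed-locally")

resp = client.chat.completions.create(
    model="my-launched-model",  # placeholder: the model name/UID you launched in Xinference
    messages=[{"role": "user", "content": "Give me one sentence about llama.cpp."}],
)
print(resp.choices[0].message.content)
```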

Private & local AI personal knowledge management app for high entropy people.

  • Updated May 13, 2025
  • JavaScript
serge

A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.

  • Updated Nov 21, 2025
  • Svelte

Your agent in your terminal, equipped with local tools: it writes code, uses the terminal, browses the web, and supports vision.

  • Updated Dec 17, 2025
  • Python
spark-nlp

Kernels & AI inference engine for mobile devices.

  • Updated Dec 17, 2025
  • C++
runanywhere-sdks

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.

  • Updated Aug 7, 2025
  • TypeScript

⚡ Python-free Rust inference server — OpenAI-API compatible. GGUF + SafeTensors, hot model swap, auto-discovery, single binary. FREE now, FREE forever.

  • Updated Dec 17, 2025
  • Rust
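
Since the server above advertises an OpenAI-compatible API with auto-discovery, a client can first list the models the server has found and then chat with one of them. The sketch below assumes the standard /v1 route layout and a local instance on port 8080; the host, port, and prompt are assumptions, not values taken from the project.

```python
import requests

BASE = "http://localhost:8080/v1"  # assumed host/port; use whatever address the server binds to

# OpenAI-compatible servers list available models at /v1/models; with auto-discovery,
# this should include the GGUF/SafeTensors files the server picked up on disk.
models = requests.get(f"{BASE}/models", timeout=30).json()
model_id = models["data"][0]["id"]

# Standard OpenAI-style chat completion request against the local server.
resp = requests.post(
    f"{BASE}/chat/completions",
    json={
        "model": model_id,
        "messages": [{"role": "user", "content": "What is a GGUF file?"}],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```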

A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.

  • Updated Dec 1, 2025
  • C#

AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.

  • Updated Dec 16, 2025
  • Python
lsp-ai

LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.

  • Updated Jan 7, 2025
  • Rust

Local AI API Platform

  • Updated Jul 4, 2025
  • C++

RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.

  • Updated Dec 17, 2025
  • Python
maid

Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.

  • Updated Jul 28, 2025
  • Dart

Improve this page

Add a description, image, and links to the llamacpp topic page so that developers can more easily learn about it.

Curate this topic

Add this topic to your repo

To associate your repository with the llamacpp topic, visit your repo's landing page and select "manage topics."

Learn more

