ollama-api
Here are 478 public repositories matching this topic...
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
- Updated
Aug 7, 2025 - TypeScript
⚡ Python-free Rust inference server — OpenAI-API compatible. GGUF + SafeTensors, hot model swap, auto-discovery, single binary. FREE now, FREE forever.
- Updated
Dec 17, 2025 - Rust
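For context, here is a minimal sketch of what querying an OpenAI-compatible local server like the one above can look like, using the official openai Python client. The base URL, port, and model name are assumptions and depend entirely on how the server is configured.

```python
# Minimal sketch: chat completion against a local OpenAI-compatible server.
# Base URL, port, and model name are assumptions; adjust to your deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed local server address
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama-3-8b-instruct",          # hypothetical model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```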
🦙 Local and online AI hub
- Updated
Dec 16, 2025 - Python
The easiest way to use Ollama in .NET
- Updated
Dec 5, 2025 - C#
Chat app for Android that supports answers from multiple LLMs at once. A bring-your-own-API-key AI client that supports OpenAI, Anthropic, Google, and Ollama. Designed with Material 3 & Compose.
- Updated
Jul 24, 2025 - Kotlin
✨ AI interface for tinkerers (Ollama, Haystack RAG, Python)
- Updated
Sep 11, 2025 - Python
Assistant: an AI helper plugin for KOReader that lets you interact with AI language models (Claude, GPT-4, Gemini, DeepSeek, Ollama, etc.) while reading.
- Updated
Nov 28, 2025 - Lua
A Ruby gem for interacting with Ollama's API, which lets you run open-source LLMs (large language models) locally.
- Updated
Jul 21, 2024 - Ruby
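Client libraries like this one typically wrap Ollama's HTTP API. Below is a minimal sketch of the underlying call in Python, assuming Ollama is running locally on its default port (11434) and that a model named llama3 has already been pulled; both are assumptions about your setup.

```python
# Minimal sketch of a raw call to Ollama's /api/generate endpoint.
# Assumes a local Ollama server on the default port and a pulled "llama3" model.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # assumed model name
        "prompt": "Explain what a GGUF file is in one sentence.",
        "stream": False,     # ask for a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```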
Add AI capabilities to your file system using the Ollama, Groq, OpenAI, and other APIs.
- Updated
Jan 4, 2025 - TypeScript
ThunderAI is a Thunderbird Addon that uses the capabilities of ChatGPT, Gemini, Claude or Ollama to enhance email management.
- Updated
Dec 14, 2025 - JavaScript
This is a PHP library for Ollama. Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine. It acts as a bridge between the complexities of LLM technology and the desire for an accessible and customizable AI experience.
- Updated
Jun 12, 2025 - PHP
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
- Updated
Dec 15, 2025 - MATLAB
Ollama load-balancing server | A high-performance, easy-to-configure open-source load balancer optimized for Ollama workloads. It helps improve application availability and response times while making efficient use of system resources.
- Updated
Nov 6, 2025
Nginx proxy server in a Docker container to authenticate and proxy requests to Ollama from the public internet via a Cloudflare Tunnel.
- Updated
Sep 4, 2025 - Shell
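For illustration only, a client request through an authenticating reverse proxy in front of Ollama might look like the sketch below. The hostname and credentials are hypothetical, and the actual authentication scheme depends on how the proxy is configured (this example assumes HTTP basic auth).

```python
# Hypothetical example: listing models on an Ollama instance exposed behind
# an authenticating reverse proxy. Hostname and credentials are placeholders.
import requests

resp = requests.get(
    "https://ollama.example.com/api/tags",  # hypothetical public hostname
    auth=("alice", "s3cret"),               # HTTP basic auth, if the proxy uses it
    timeout=30,
)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])
```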
A versatile multi-modal chat application that lets users develop custom agents, create images, leverage visual recognition, and engage in voice interactions. It integrates seamlessly with local LLMs and commercial models such as OpenAI, Gemini, Perplexity, and Claude, and lets you converse with uploaded documents and websites.
- Updated
Sep 4, 2024 - C#
Context-Engine: MCP retrieval stack for AI coding assistants. Hybrid code search (dense + lexical + reranker), ReFRAG micro-chunking, local LLM prompt enhancement, and dual SSE/RMCP endpoints. One command deploys Qdrant-powered indexing for Cursor, Windsurf, Roo, Cline, Codex, and any MCP client.
- Updated
Dec 18, 2025 - Python