open-source-llm
Here are 24 public repositories matching this topic...
Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud.
Updated Jul 14, 2025 - Python
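Because servers like this expose an OpenAI-compatible API, an existing OpenAI client can usually be pointed at them by overriding the base URL. A minimal sketch in Python, assuming a deployment reachable at http://localhost:8000/v1 and a model named "deepseek-r1" (the URL, API key, and model name are placeholders, not details from the repository):

```python
# Minimal sketch: query an OpenAI-compatible endpoint with the official OpenAI SDK.
# The base_url, api_key, and model name below are assumptions; substitute the
# values your deployment actually exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local/cloud endpoint
    api_key="not-needed-for-local",       # many local servers ignore the key
)

response = client.chat.completions.create(
    model="deepseek-r1",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize what an open-source LLM is."}],
)
print(response.choices[0].message.content)
```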
Running Llama 2 and other open-source LLMs locally on CPU for document Q&A.
Updated Nov 6, 2023 - Python
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inference.
Updated Jan 19, 2025 - Jupyter Notebook
LLM (Large Language Model) fine-tuning
Updated Apr 1, 2025 - Jupyter Notebook
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
Updated Jul 18, 2025 - Go
LLMs and Machine Learning done easily
Updated Jun 21, 2025 - Python
A list of LLM tools & projects
Updated Jul 8, 2025
This is a PHP library for Ollama. Ollama is an open-source project that provides a powerful, user-friendly platform for running LLMs on your local machine, bridging the complexity of LLM technology and an accessible, customizable AI experience.
Updated Jun 12, 2025 - PHP
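A library like this wraps Ollama's local HTTP API, so the underlying request can be illustrated language-agnostically. A minimal sketch in Python against Ollama's documented /api/generate endpoint, assuming Ollama is running on its default port and the "llama3" model has already been pulled (both assumptions, not details from the library):

```python
# Minimal sketch: call a locally running Ollama server directly over HTTP.
# Assumes Ollama is listening on the default port 11434 and that the
# "llama3" model has already been pulled (`ollama pull llama3`).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Explain retrieval-augmented generation in one sentence.",
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```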
Run open-source/open-weight LLMs locally with OpenAI-compatible APIs.
Updated Jul 2, 2025 - Rust
Samples showing how to build industry solutions that leverage generative AI capabilities on top of SAP BTP, integrated with SAP S/4HANA Cloud.
Updated Jun 12, 2025 - HTML
EmbeddedLLM: an API server for embedded-device deployment. Currently supports CUDA, OpenVINO, IpexLLM, DirectML, and CPU backends.
Updated Oct 6, 2024 - Python
Fine-tune open-source language models (LLMs) on e-commerce data, leveraging Amazon sales data. Showcases a tailored solution for enhanced language understanding and generation, with a focus on custom e-commerce datasets.
Updated Feb 8, 2024 - Jupyter Notebook
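As an illustration of what such a fine-tuning pipeline typically looks like, here is a hedged sketch using Hugging Face transformers, peft, and datasets; the base model, LoRA hyperparameters, and the "ecommerce.jsonl" file (records with a "text" field) are all assumptions, not details taken from the repository:

```python
# Sketch of LoRA fine-tuning an open-source LLM on an e-commerce text dataset.
# Model name, hyperparameters, and the data file are placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_model = "meta-llama/Llama-2-7b-hf"           # assumed base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token         # Llama has no pad token by default

model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto")
model = get_peft_model(model, LoraConfig(         # train small adapter matrices only
    r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"))

dataset = load_dataset("json", data_files="ecommerce.jsonl")["train"]  # assumed file
dataset = dataset.map(lambda b: tokenizer(b["text"], truncation=True, max_length=512),
                      batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```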
Read your local files and answer your queries
Updated Sep 3, 2024 - Python
Multi-agent workflows with Llama3: A private on-device multi-agent framework
Updated Jun 5, 2024 - Python
LocalPrompt is an AI-powered tool designed to refine and optimize AI prompts, helping users run locally hosted AI models like Mistral-7B for privacy and efficiency. Ideal for developers seeking to run LLMs locally without external APIs.
Updated Mar 23, 2025
In this project, we leverage Weaviate, a vector database, to power our retrieval-augmented generation (RAG) application. Weaviate enables efficient vector similarity search, which is crucial for building effective RAG systems. Additionally, we use a local language model (LLM) and local embedding models.
Updated Jun 3, 2024 - Jupyter Notebook
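The retrieval step of a RAG setup like this can be sketched as: embed the user query locally, then ask Weaviate for the nearest stored chunks. The sketch below uses the v3 weaviate-client query API and a local sentence-transformers embedder; the "Document" class name, its "text" property, and a Weaviate instance at localhost:8080 with data already ingested are all assumptions:

```python
# Sketch of the retrieval step in a local RAG pipeline backed by Weaviate.
# Assumes Weaviate runs at localhost:8080 and already contains a "Document"
# class with a "text" property and vectors from the same embedding model.
import weaviate
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")   # local embedding model
client = weaviate.Client("http://localhost:8080")    # v3-style client

query = "What is our refund policy?"
query_vec = embedder.encode(query).tolist()

result = (
    client.query
    .get("Document", ["text"])                # hypothetical class/property names
    .with_near_vector({"vector": query_vec})  # vector similarity search
    .with_limit(3)
    .do()
)

# The retrieved chunks would then be stuffed into the prompt of a local LLM.
for doc in result["data"]["Get"]["Document"]:
    print(doc["text"])
```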
This project contains the code and documentation for an autonomous AI agent that classifies, enriches, and scores inbound business leads. It is built with a FastAPI backend, a LangGraph agent workflow powered by a local Ollama LLM, and a Streamlit frontend for demonstration.
Updated Jun 25, 2025 - Python
*the-stix-intern*: a minimalistic framework for the automated extraction of CTI (cyber threat intelligence) from unstructured text.
Updated May 15, 2025 - HTML
LocalPrompt is an AI-powered tool designed to refine and optimize AI prompts, helping users run locally hosted AI models like Mistral-7B for privacy and efficiency. Ideal for developers seeking to run LLMs locally without external APIs.
Updated Feb 10, 2025 - Python
Gittxt is an AI-focused CLI and plugin tool for extracting, filtering, and packaging text from GitHub repos. Build LLM-compatible datasets, prep code for prompt engineering, and power AI workflows with structured .txt, .json, .md, or .zip outputs.
Updated May 16, 2025 - Python