TabbyML/tabby: Self-hosted AI coding assistant
Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:
- Self-contained, with no need for a DBMS or cloud service.
- OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
- Supports consumer-grade GPUs.
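As a sketch of what the OpenAPI integration point looks like, the snippet below constructs a request for Tabby's code-completion endpoint. The `/v1/completions` path and the `language`/`segments` body fields are assumptions based on a typical Tabby deployment; confirm them against your own server's Swagger UI before relying on them.

```python
import json
from urllib.request import Request

# Assumed endpoint and schema -- verify against your Tabby server's
# OpenAPI docs before use.
TABBY_URL = "http://localhost:8080/v1/completions"

def build_completion_request(prefix: str, suffix: str = "",
                             language: str = "python") -> Request:
    """Construct (but do not send) an HTTP request asking Tabby to
    complete the code between `prefix` and `suffix`."""
    body = {
        "language": language,
        "segments": {"prefix": prefix, "suffix": suffix},
    }
    return Request(
        TABBY_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("def fibonacci(n):\n    ")
print(req.full_url, req.get_method())
```

Sending the constructed request with `urllib.request.urlopen(req)` against a running server would return the completion choices as JSON; any HTTP client or generated OpenAPI binding works the same way.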
- 07/02/2025 v0.30 supports indexing GitLab Merge Requests as context!
- 05/25/2025 💡 Interested in joining the Agent private preview? DM us on X for early waitlist approval! 🎫
- 05/20/2025 Enhance Tabby with your own documentation 📃 through REST APIs in v0.29! 🎉
- 05/01/2025 v0.28 transforms Answer Engine messages into persistent, shareable Pages.
- 03/31/2025 v0.27 released with a richer `@` menu in the chat side panel.
- 02/05/2025 LDAP authentication and better notifications for background jobs are coming in Tabby v0.24.0! ✨
- 02/04/2025 VSCode 1.20.0 upgrade! @-mention files to add them as chat context, and edit inline with a new right-click option!
Archived
- 01/10/2025 Tabby v0.23.0 featuring an enhanced code browser experience and chat side panel improvements!
- 12/24/2024 Introducing the Notification Box in Tabby v0.22.0!
- 12/06/2024 Llamafile deployment integration and an enhanced Answer Engine user experience are coming in Tabby v0.21.0! 🚀
- 11/10/2024 Switching between different backend chat models is supported in the Answer Engine with Tabby v0.20.0!
- 10/30/2024 Tabby v0.19.0 featuring recent shared threads on the main page to improve their discoverability.
- 07/09/2024 🎉 Announcing Codestral integration in Tabby!
- 07/05/2024 Tabby v0.13.0 introduces Answer Engine, a central knowledge engine for internal engineering teams. It seamlessly integrates with the dev team's internal data, delivering reliable and precise answers to empower developers.
- 06/13/2024 VSCode 1.7 marks a significant milestone with a versatile chat experience throughout your coding workflow. Come and try the latest chat in the side panel and editing via chat command!
- 06/10/2024 Latest 📃 blog post drop on enhanced code context understanding in Tabby!
- 06/06/2024 The Tabby v0.12.0 release brings 🔗 seamless integrations (GitLab SSO, self-hosted GitHub/GitLab, etc.), ⚙️ flexible configurations (HTTP API integration), and 🌐 expanded capabilities (repo context in the Code Browser)!
- 05/22/2024 Tabby VSCode 1.6 comes with multiple choices in inline completion, and auto-generated commit messages 🐱💻!
- 05/11/2024 v0.11.0 brings significant enterprise upgrades, including 📊 storage usage stats, 🔗 GitHub & GitLab integration, a 📋 Activities page, and the long-awaited 🤖 Ask Tabby feature!
- 04/22/2024 v0.10.0 released, featuring the latest Reports tab with team-wise analytics for Tabby usage.
- 04/19/2024 📣 Tabby now incorporates locally relevant snippets (declarations from the local LSP, and recently modified code) for code completion!
- 04/17/2024 The CodeGemma and CodeQwen model series have now been added to the official registry!
- 03/20/2024 v0.9 released, highlighting a full-featured admin UI.
- 12/23/2023 Seamlessly deploy Tabby on any cloud with SkyServe 🛫 from SkyPilot.
- 12/15/2023 v0.7.0 released with team management and secured access!
- 11/27/2023 v0.6.0 released!
- 11/09/2023 v0.5.5 released! With a redesign of the UI + performance improvements.
- 10/24/2023 ⛳️ Major updates for Tabby IDE plugins across VSCode/Vim/IntelliJ!
- 10/15/2023 RAG-based code completion is enabled by default in v0.3.0 🎉! Check out the blog post explaining how Tabby utilizes repo-level context to get even smarter!
- 10/04/2023 Check out the model directory for the latest models supported by Tabby.
- 09/18/2023 Apple M1/M2 Metal inference support has landed in v0.1.1!
- 08/31/2023 Tabby's first stable release, v0.0.1 🥳.
- 08/28/2023 Experimental support for CodeLlama 7B.
- 08/24/2023 Tabby is now on the JetBrains Marketplace!
You can find our documentation here.
The easiest way to start a Tabby server is by using the following Docker command:

```shell
docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct
```
For additional options (e.g., inference type, parallelism), please refer to the documentation page.
Full guide at CONTRIBUTING.md.

```shell
git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby
```

If you have already cloned the repository, you can run `git submodule update --recursive --init` to fetch all submodules.
Set up the Rust environment by following this tutorial.
Install the required dependencies:
```shell
# For MacOS
brew install protobuf

# For Ubuntu / Debian
apt install protobuf-compiler libopenblas-dev
```
- Install useful tools:
```shell
# For Ubuntu
apt install make sqlite3 graphviz
```
- Now, you can build Tabby by running the `cargo build` command.
... and don't forget to submit a Pull Request!
- 🎤 Twitter / X - engage with TabbyML for all things possible
- 📚 LinkedIn - follow for the latest from the community
- 💌 Newsletter - subscribe to unlock Tabby insights and secrets