# CopilotChat.nvim

Chat with GitHub Copilot in Neovim
CopilotChat.nvim brings GitHub Copilot Chat capabilities directly into Neovim with a focus on transparency and user control.
- 🤖 Multiple AI Models - GitHub Copilot (including GPT-4o, Gemini 2.5 Pro, Claude 4 Sonnet, Claude 3.7 Sonnet, Claude 3.5 Sonnet, o3-mini, o4-mini) + custom providers (Ollama, Mistral.ai). The exact list of available models depends on your GitHub Copilot settings and the models provided by GitHub's API.
- 🔧 Tool Calling - The LLM can call workspace functions (file reading, git operations, search) with your explicit approval
- 🔒 Privacy First - Only shares what you explicitly request - no background data collection
- 📝 Interactive Chat - Interactive UI with completion, diffs, and quickfix integration
- 🎯 Smart Prompts - Composable templates and sticky prompts for consistent context
- ⚡ Token Efficient - Resource replacement prevents duplicate context; history management via tiktoken counting
- 🔗 Scriptable - Comprehensive Lua API for automation and headless mode operation
- 🔌 Extensible - Custom functions and providers, plus integrations like mcphub.nvim
Requirements:

- Neovim 0.10.0+
- curl 8.0.0+
- Copilot chat in the IDE enabled in your GitHub settings
- plenary.nvim
Warning
For Neovim < 0.11.0, add `noinsert` or `noselect` to your `completeopt`, otherwise chat autocompletion will not work. For the best autocompletion experience, also add `popup` to your `completeopt` (even on Neovim 0.11.0+).
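The warning above translates to a one-line setting in your init.lua; this is a minimal sketch, and the exact mix of values is a matter of preference:

```lua
-- Completion options recommended for chat autocompletion
-- ('popup' requires Neovim 0.11.0+; drop it on older versions)
vim.opt.completeopt = { 'menuone', 'noinsert', 'popup' }
```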
Optional dependencies:

- tiktoken_core - For accurate token counting
  - Arch Linux: Install `luajit-tiktoken-bin` or `lua51-tiktoken-bin` from the AUR
  - Via luarocks: `sudo luarocks install --lua-version 5.1 tiktoken_core`
  - Manual: Download from the lua-tiktoken releases and save as `tiktoken_core.so` in your Lua path
- git - For git diff context features
- ripgrep - For improved search performance
- lynx - For improved URL context features
For various plugin pickers to work correctly, you need to replace `vim.ui.select` with your desired picker (the default `vim.ui.select` is very basic). Here are some examples:

- fzf-lua - call `require('fzf-lua').register_ui_select()`
- telescope - set up the `telescope-ui-select.nvim` plugin
- snacks.picker - enable the `ui_select` config
- mini.pick - set `vim.ui.select = require('mini.pick').ui_select`
Via lazy.nvim:

```lua
return {
  {
    "CopilotC-Nvim/CopilotChat.nvim",
    dependencies = {
      { "nvim-lua/plenary.nvim", branch = "master" },
    },
    build = "make tiktoken",
    opts = {
      -- See Configuration section for options
    },
  },
}
```
Via vim-plug:

```vim
call plug#begin()
Plug 'nvim-lua/plenary.nvim'
Plug 'CopilotC-Nvim/CopilotChat.nvim'
call plug#end()

lua << EOF
require("CopilotChat").setup()
EOF
```
- Resources (`#<name>`) - Add specific content (files, git diffs, URLs) to your prompt
- Tools (`@<name>`) - Give the LLM access to functions it can call with your approval
- Sticky Prompts (`> <text>`) - Persist context across a single chat session
- Models (`$<model>`) - Specify which AI model to use for the chat
- Prompts (`/PromptName`) - Use predefined prompt templates for common tasks
```
# Add specific file to context
#file:src/main.lua

# Give LLM access to workspace tools
@copilot What files are in this project?

# Sticky prompt that persists
> #buffer:active
> You are a helpful coding assistant
```
When you use `@copilot`, the LLM can call functions like `bash`, `edit`, `file`, `glob`, `grep`, `gitdiff`, etc. You'll see the proposed function call and can approve/reject it before execution.
| Command | Description |
|---|---|
| `:CopilotChat <input>?` | Open chat with optional input |
| `:CopilotChatOpen` | Open chat window |
| `:CopilotChatClose` | Close chat window |
| `:CopilotChatToggle` | Toggle chat window |
| `:CopilotChatStop` | Stop current output |
| `:CopilotChatReset` | Reset chat window |
| `:CopilotChatSave <name>?` | Save chat history |
| `:CopilotChatLoad <name>?` | Load chat history |
| `:CopilotChatPrompts` | View/select prompt templates |
| `:CopilotChatModels` | View/select available models |
| `:CopilotChat<PromptName>` | Use specific prompt template |
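These commands are easy to wire into keymaps; a minimal sketch (the `<leader>` bindings here are arbitrary choices, not plugin defaults):

```lua
-- Hypothetical keymaps for the commands above; pick bindings that suit you
vim.keymap.set('n', '<leader>cc', '<cmd>CopilotChatToggle<cr>', { desc = 'Toggle Copilot Chat' })
vim.keymap.set('n', '<leader>cp', '<cmd>CopilotChatPrompts<cr>', { desc = 'Select prompt template' })
```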
| Insert | Normal | Action |
|---|---|---|
| `<Tab>` | - | Trigger/accept completion menu for tokens |
| `<C-c>` | `q` | Close the chat window |
| `<C-l>` | `<C-l>` | Reset and clear the chat window |
| `<C-s>` | `<CR>` | Submit the current prompt |
| - | `grr` | Toggle sticky prompt for line under cursor |
| `<C-y>` | `<C-y>` | Accept nearest diff |
| - | `gj` | Jump to section of nearest diff |
| - | `gqa` | Add all answers from chat to quickfix list |
| - | `gqd` | Add all diffs from chat to quickfix list |
| - | `gy` | Yank nearest diff to register |
| - | `gd` | Show diff between source and nearest diff |
| - | `gc` | Show info about current chat |
| - | `gh` | Show help message |
Warning
Some plugins (e.g. copilot.vim) may also map common keys like `<Tab>` in insert mode.
To avoid conflicts, disable Copilot's default `<Tab>` mapping with:

```lua
vim.g.copilot_no_tab_map = true
vim.keymap.set('i', '<S-Tab>', 'copilot#Accept("\\<S-Tab>")', { expr = true, replace_keycodes = false })
```
You can also customize CopilotChat keymaps in your config.
All predefined functions belong to the `copilot` group.
| Function | Type | Description | Example Usage |
|---|---|---|---|
| `bash` | Tool | Executes a bash command and returns output | `@copilot` only |
| `buffer` | Resource | Retrieves content from buffer(s) with diagnostics | `#buffer:active` |
| `clipboard` | Resource | Provides access to system clipboard content | `#clipboard` |
| `edit` | Tool | Applies a unified diff to a file | `@copilot` only |
| `file` | Resource | Reads content from a specified file path | `#file:path/to/file` |
| `gitdiff` | Resource | Retrieves git diff information | `#gitdiff:staged` |
| `glob` | Resource | Lists filenames matching a pattern in workspace | `#glob:**/*.lua` |
| `grep` | Resource | Searches for a pattern across files in workspace | `#grep:TODO` |
| `selection` | Resource | Includes the current visual selection with diagnostics | `#selection` |
| `url` | Resource | Fetches content from a specified URL | `#url:https://...` |
Type Legend:

- Resource: Can be used manually via `#function` syntax
- Tool: Can only be called by the LLM via `@copilot` (for safety/complexity reasons)
| Prompt | Description |
|---|---|
| `Explain` | Write detailed explanation of selected code as paragraphs |
| `Review` | Comprehensive code review with line-specific issue reporting |
| `Fix` | Identify problems and rewrite code with fixes and explanation |
| `Optimize` | Improve performance and readability with optimization strategy |
| `Docs` | Add documentation comments to selected code |
| `Tests` | Generate tests for selected code |
| `Commit` | Generate commit message with commitizen convention from staged changes |
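Each template is also exposed as its own `:CopilotChat<PromptName>` command, so binding one is a one-liner; a sketch with an arbitrary mapping:

```lua
-- Hypothetical binding: generate a commit message from staged changes
vim.keymap.set('n', '<leader>cm', '<cmd>CopilotChatCommit<cr>', { desc = 'Copilot: commit message' })
```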
For all available configuration options, see `lua/CopilotChat/config.lua`.
Most users only need to configure a few options:
```lua
{
  model = 'gpt-4.1',        -- AI model to use
  temperature = 0.1,        -- Lower = focused, higher = creative
  window = {
    layout = 'vertical',    -- 'vertical', 'horizontal', 'float'
    width = 0.5,            -- 50% of screen width
  },
  auto_insert_mode = true,  -- Enter insert mode when opening
}
```

```lua
{
  window = {
    layout = 'float',
    width = 80,             -- Fixed width in columns
    height = 20,            -- Fixed height in rows
    border = 'rounded',     -- 'single', 'double', 'rounded', 'solid'
    title = '🤖 AI Assistant',
    zindex = 100,           -- Ensure window stays on top
  },
  headers = {
    user = '👤 You',
    assistant = '🤖 Copilot',
    tool = '🔧 Tool',
  },
  separator = '━━',
  auto_fold = true,         -- Automatically folds non-assistant messages
}
```

```lua
-- Auto-command to customize chat buffer behavior
vim.api.nvim_create_autocmd('BufEnter', {
  pattern = 'copilot-*',
  callback = function()
    vim.opt_local.relativenumber = false
    vim.opt_local.number = false
    vim.opt_local.conceallevel = 0
  end,
})
```
You can customize colors by setting highlight groups in your config:
```lua
-- In your colorscheme or init.lua
vim.api.nvim_set_hl(0, 'CopilotChatHeader', { fg = '#7C3AED', bold = true })
vim.api.nvim_set_hl(0, 'CopilotChatSeparator', { fg = '#374151' })
```
Types of copilot highlights:
- `CopilotChatHeader` - Header highlight in chat buffer
- `CopilotChatSeparator` - Separator highlight in chat buffer
- `CopilotChatSelection` - Selection highlight in source buffer
- `CopilotChatStatus` - Status and spinner in chat buffer
- `CopilotChatHelp` - Help text in chat buffer
- `CopilotChatResource` - Resource highlight in chat buffer (e.g. `#file`, `#gitdiff`)
- `CopilotChatTool` - Tool call highlight in chat buffer (e.g. `@copilot`)
- `CopilotChatPrompt` - Prompt highlight in chat buffer (e.g. `/Explain`, `/Review`)
- `CopilotChatModel` - Model highlight in chat buffer (e.g. `$gpt-4.1`)
- `CopilotChatUri` - URI highlight in chat buffer (e.g. `##https://...`)
- `CopilotChatAnnotation` - Annotation highlight in chat buffer (file headers, tool call headers, tool call body)
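If you prefer to inherit your colorscheme's palette instead of hard-coding hex values, the same groups can be linked to existing highlights; a minimal sketch (the link targets are only suggestions):

```lua
-- Link chat highlight groups to common colorscheme groups
vim.api.nvim_set_hl(0, 'CopilotChatResource', { link = 'Identifier' })
vim.api.nvim_set_hl(0, 'CopilotChatTool', { link = 'Function' })
vim.api.nvim_set_hl(0, 'CopilotChatModel', { link = 'Constant' })
```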
Define your own prompts in the configuration:
```lua
{
  prompts = {
    MyCustomPrompt = {
      prompt = 'Explain how it works.',
      system_prompt = 'You are very good at explaining stuff',
      mapping = '<leader>ccmc',
      description = 'My custom prompt description',
    },
    Yarrr = {
      system_prompt = 'You are fascinated by pirates, so please respond in pirate speak.',
    },
    NiceInstructions = {
      system_prompt = 'You are a nice coding tutor, so please respond in a friendly and helpful manner.',
    },
  },
}
```

Define your own functions in the configuration with input handling and schema:
```lua
{
  functions = {
    birthday = {
      description = "Retrieves birthday information for a person",
      uri = "birthday://{name}",
      schema = {
        type = 'object',
        required = { 'name' },
        properties = {
          name = {
            type = 'string',
            enum = { 'Alice', 'Bob', 'Charlie' },
            description = "Person's name",
          },
        },
      },
      resolve = function(input)
        return {
          {
            uri = 'birthday://' .. input.name,
            mimetype = 'text/plain',
            data = input.name .. ' birthday info',
          },
        }
      end,
    },
  },
}
```

Add custom AI providers:
```lua
{
  providers = {
    my_provider = {
      get_url = function(opts)
        return "https://api.example.com/chat"
      end,
      get_headers = function()
        return { ["Authorization"] = "Bearer " .. api_key }
      end,
      get_models = function()
        return { { id = "gpt-4.1", name = "GPT-4.1 model" } }
      end,
      prepare_input = require('CopilotChat.config.providers').copilot.prepare_input,
      prepare_output = require('CopilotChat.config.providers').copilot.prepare_output,
    },
  },
}
```

Provider Interface:
```lua
{
  -- Optional: Disable provider
  disabled?: boolean,
  -- Optional: Extra info about the provider displayed in info panel
  get_info?(): string[],
  -- Optional: Get extra request headers with optional expiration time
  get_headers?(): table<string, string>, number?,
  -- Optional: Get API endpoint URL
  get_url?(opts: CopilotChat.Provider.options): string,
  -- Optional: Prepare request input
  prepare_input?(inputs: table<CopilotChat.Provider.input>, opts: CopilotChat.Provider.options): table,
  -- Optional: Prepare response output
  prepare_output?(output: table, opts: CopilotChat.Provider.options): CopilotChat.Provider.output,
  -- Optional: Get available models
  get_models?(headers: table): table<CopilotChat.Provider.model>,
}
```

Built-in providers:
- `copilot` - GitHub Copilot (default)
- `github_models` - GitHub Marketplace models (disabled by default)
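To try the default-disabled provider, flip its `disabled` flag in your setup; a minimal sketch (model availability may still depend on your GitHub account):

```lua
-- Enable the GitHub Marketplace models provider (disabled by default)
require('CopilotChat').setup({
  providers = {
    github_models = { disabled = false },
  },
})
```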
```lua
local chat = require("CopilotChat")

-- Basic Chat Functions
chat.ask(prompt, config)      -- Ask a question with optional config
chat.response()               -- Get the last response text

-- Window Management
chat.open(config)             -- Open chat window with optional config
chat.close()                  -- Close chat window
chat.toggle(config)           -- Toggle chat window visibility with optional config
chat.reset()                  -- Reset the chat
chat.stop()                   -- Stop current output

-- Prompt & Model Management
chat.select_prompt(config)    -- Open prompt selector with optional config
chat.select_model()           -- Open model selector

-- History Management
chat.load(name, history_path) -- Load chat history
chat.save(name, history_path) -- Save chat history

-- Configuration
chat.setup(config)            -- Update configuration
chat.log_level(level)         -- Set log level (debug, info, etc.)
```
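As one concrete use of this API, a small automation sketch that wires a keymap to review staged changes (the binding and prompt text are arbitrary):

```lua
-- Ask Copilot to review staged changes, with the git diff as sticky context
vim.keymap.set('n', '<leader>cr', function()
  require('CopilotChat').ask('Review these changes for bugs and style issues.', {
    sticky = { '#gitdiff:staged' },
  })
end, { desc = 'Copilot: review staged changes' })
```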
You can also access the chat window UI methods through the `chat.chat` object:
```lua
local window = require("CopilotChat").chat

-- Chat UI State
window:visible()                    -- Check if chat window is visible
window:focused()                    -- Check if chat window is focused

-- Message Management
window:get_message(role, cursor)    -- Get chat message by role, either last or closest to cursor
window:add_message({ role, content }, replace) -- Add or replace a message in chat
window:remove_message(role, cursor) -- Remove chat message by role, either last or closest to cursor
window:get_block(role, cursor)      -- Get code block by role, either last or closest to cursor

-- Content Management
window:append(text)                 -- Append text to chat window
window:clear()                      -- Clear chat window content
window:start()                      -- Start writing to chat window
window:finish()                     -- Finish writing to chat window

-- Source Management
window.get_source()                 -- Get the current source buffer and window
window.set_source(winnr)            -- Set the source window

-- Navigation
window:follow()                     -- Move cursor to end of chat content
window:focus()                      -- Focus the chat window

-- Advanced Features
window:overlay(opts)                -- Show overlay with specified options
```
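A small sketch combining a few of these methods (assumes the plugin is set up; the note text is arbitrary):

```lua
-- Append a note to the chat window if it is currently visible
local window = require('CopilotChat').chat
if window:visible() then
  window:append('\n> Note inserted from Lua')
  window:follow() -- move the cursor to the end of the chat
end
```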
```lua
local parser = require("CopilotChat.prompts")

parser.resolve_prompt()  -- Resolve prompt references
parser.resolve_tools()   -- Resolve tools that are available for automatic use by LLM
parser.resolve_model()   -- Resolve model from prompt (WARN: async, requires plenary.async.run)
```
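Because `resolve_model` is async, it has to run inside a plenary coroutine; a hedged sketch (the argument-free calls mirror the listing above, and the notification is just for illustration):

```lua
-- Resolve the model asynchronously, then report it back on the main loop
require('plenary.async').run(function()
  local model = require('CopilotChat.prompts').resolve_model()
  vim.schedule(function()
    vim.notify('Resolved model: ' .. tostring(model))
  end)
end)
```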
```lua
-- Open chat, ask a question and handle response
require("CopilotChat").open()
require("CopilotChat").ask("#buffer Explain this code", {
  callback = function(response)
    vim.notify("Got response: " .. response:sub(1, 50) .. "...")
    return response
  end,
})

-- Save and load chat history
require("CopilotChat").save("my_debugging_session")
require("CopilotChat").load("my_debugging_session")

-- Use custom sticky and model
require("CopilotChat").ask("How can I optimize this?", {
  model = "gpt-4.1",
  sticky = { "#buffer", "#gitdiff:staged" },
})
```
For more examples, see the examples wiki page.
To set up the environment:
- Clone the repository:

  ```shell
  git clone https://github.com/CopilotC-Nvim/CopilotChat.nvim
  cd CopilotChat.nvim
  ```

- Install development dependencies:

  ```shell
  make install-pre-commit
  ```
To run tests:

```shell
make test
```

- Fork the repository
- Create your feature branch
- Make your changes
- Run tests and lint checks
- Submit a pull request
See CONTRIBUTING.md for detailed guidelines.
CopilotChat.nvim includes diff-match-patch (Lua port) for diffing and patching functionality.
Copyright 2018 The diff-match-patch Authors.
Licensed under the Apache License 2.0.
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind are welcome!