opencode-ai/opencode (Public archive)

This repository was archived by the owner on Sep 18, 2025. It is now read-only.

Commit b9bedba

feat: add github copilot provider (#230)
* feat: add github copilot
* fix: add support for claude4

1 parent 73729ef · commit b9bedba

File tree

13 files changed: +1276 / -48 lines


.gitignore

Lines changed: 1 addition & 0 deletions
@@ -44,3 +44,4 @@ Thumbs.db
 .opencode/
 
 opencode
+opencode.md

README.md

Lines changed: 56 additions & 16 deletions
@@ -96,22 +96,23 @@ You can enable or disable this feature in your configuration file:
 
 You can configure OpenCode using environment variables:
 
-| Environment Variable | Purpose |
-| -------------------- | ------- |
-| `ANTHROPIC_API_KEY` | For Claude models |
-| `OPENAI_API_KEY` | For OpenAI models |
-| `GEMINI_API_KEY` | For Google Gemini models |
-| `VERTEXAI_PROJECT` | For Google Cloud VertexAI (Gemini) |
-| `VERTEXAI_LOCATION` | For Google Cloud VertexAI (Gemini) |
-| `GROQ_API_KEY` | For Groq models |
-| `AWS_ACCESS_KEY_ID` | For AWS Bedrock (Claude) |
-| `AWS_SECRET_ACCESS_KEY` | For AWS Bedrock (Claude) |
-| `AWS_REGION` | For AWS Bedrock (Claude) |
-| `AZURE_OPENAI_ENDPOINT` | For Azure OpenAI models |
-| `AZURE_OPENAI_API_KEY` | For Azure OpenAI models (optional when using Entra ID) |
-| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models |
-| `LOCAL_ENDPOINT` | For self-hosted models |
-| `SHELL` | Default shell to use (if not specified in config) |
+| Environment Variable | Purpose |
+| -------------------- | ------- |
+| `ANTHROPIC_API_KEY` | For Claude models |
+| `OPENAI_API_KEY` | For OpenAI models |
+| `GEMINI_API_KEY` | For Google Gemini models |
+| `GITHUB_TOKEN` | For Github Copilot models (see [Using Github Copilot](#using-github-copilot)) |
+| `VERTEXAI_PROJECT` | For Google Cloud VertexAI (Gemini) |
+| `VERTEXAI_LOCATION` | For Google Cloud VertexAI (Gemini) |
+| `GROQ_API_KEY` | For Groq models |
+| `AWS_ACCESS_KEY_ID` | For AWS Bedrock (Claude) |
+| `AWS_SECRET_ACCESS_KEY` | For AWS Bedrock (Claude) |
+| `AWS_REGION` | For AWS Bedrock (Claude) |
+| `AZURE_OPENAI_ENDPOINT` | For Azure OpenAI models |
+| `AZURE_OPENAI_API_KEY` | For Azure OpenAI models (optional when using Entra ID) |
+| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models |
+| `LOCAL_ENDPOINT` | For self-hosted models |
+| `SHELL` | Default shell to use (if not specified in config) |
 
 ### Shell Configuration
 

@@ -146,6 +147,9 @@ This is useful if you want to use a different shell than your default system she
       "apiKey": "your-api-key",
       "disabled": false
     },
+    "copilot": {
+      "disabled": false
+    },
     "groq": {
       "apiKey": "your-api-key",
       "disabled": false
@@ -216,6 +220,23 @@ OpenCode supports a variety of AI models from different providers:
 - Claude 3 Haiku
 - Claude 3 Opus
 
+### GitHub Copilot
+
+- GPT-3.5 Turbo
+- GPT-4
+- GPT-4o
+- GPT-4o Mini
+- GPT-4.1
+- Claude 3.5 Sonnet
+- Claude 3.7 Sonnet
+- Claude 3.7 Sonnet Thinking
+- Claude Sonnet 4
+- O1
+- O3 Mini
+- O4 Mini
+- Gemini 2.0 Flash
+- Gemini 2.5 Pro
+
 ### Google
 
 - Gemini 2.5
@@ -579,6 +600,25 @@ The AI assistant can access LSP features through the `diagnostics` tool, allowin
 
 While the LSP client implementation supports the full LSP protocol (including completions, hover, definition, etc.), currently only diagnostics are exposed to the AI assistant.
 
+## Using Github Copilot
+
+_Copilot support is currently experimental._
+
+### Requirements
+- [Copilot chat in the IDE](https://github.com/settings/copilot) enabled in GitHub settings
+- One of:
+  - VSCode Github Copilot chat extension
+  - Github `gh` CLI
+  - Neovim Github Copilot plugin (`copilot.vim` or `copilot.lua`)
+  - Github token with copilot permissions
+
+If using one of the above plugins or CLI tools, make sure you authenticate
+the tool with your github account. This should create a github token at one of the following locations:
+- ~/.config/github-copilot/[hosts,apps].json
+- $XDG_CONFIG_HOME/github-copilot/[hosts,apps].json
+
+If using an explicit github token, you may either set the $GITHUB_TOKEN environment variable or add it to the opencode.json config file at `providers.copilot.apiKey`.
+
 ## Using a self-hosted model provider
 
 OpenCode can also load and use models from a self-hosted (OpenAI-like) provider.
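
Putting the README additions together, enabling Copilot via the config file amounts to a provider entry like the sketch below. This is a minimal illustration assuming the same top-level "providers" layout shown in the existing config example; the token value is a placeholder, and "apiKey" can be left out entirely when a token is discoverable via $GITHUB_TOKEN or the github-copilot config files, since setProviderDefaults fills it in:

{
  "providers": {
    "copilot": {
      "apiKey": "<your-github-token>",
      "disabled": false
    }
  }
}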

internal/config/config.go

Lines changed: 108 additions & 8 deletions
@@ -7,6 +7,7 @@ import (
     "log/slog"
     "os"
     "path/filepath"
+    "runtime"
     "strings"
 
     "github.com/opencode-ai/opencode/internal/llm/models"
@@ -161,6 +162,7 @@ func Load(workingDir string, debug bool) (*Config, error) {
     }
     if os.Getenv("OPENCODE_DEV_DEBUG") == "true" {
         loggingFile := fmt.Sprintf("%s/%s", cfg.Data.Directory, "debug.log")
+        messagesPath := fmt.Sprintf("%s/%s", cfg.Data.Directory, "messages")
 
         // if file does not exist create it
         if _, err := os.Stat(loggingFile); os.IsNotExist(err) {
@@ -172,6 +174,13 @@ func Load(workingDir string, debug bool) (*Config, error) {
             }
         }
 
+        if _, err := os.Stat(messagesPath); os.IsNotExist(err) {
+            if err := os.MkdirAll(messagesPath, 0o756); err != nil {
+                return cfg, fmt.Errorf("failed to create directory: %w", err)
+            }
+        }
+        logging.MessageDir = messagesPath
+
         sloggingFileWriter, err := os.OpenFile(loggingFile, os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0o666)
         if err != nil {
             return cfg, fmt.Errorf("failed to open log file: %w", err)
@@ -245,6 +254,7 @@ func setDefaults(debug bool) {
 // environment variables and configuration file.
 func setProviderDefaults() {
     // Set all API keys we can find in the environment
+    // Note: Viper does not default if the json apiKey is ""
     if apiKey := os.Getenv("ANTHROPIC_API_KEY"); apiKey != "" {
         viper.SetDefault("providers.anthropic.apiKey", apiKey)
     }
@@ -267,16 +277,32 @@ func setProviderDefaults() {
         // api-key may be empty when using Entra ID credentials – that's okay
         viper.SetDefault("providers.azure.apiKey", os.Getenv("AZURE_OPENAI_API_KEY"))
     }
+    if apiKey, err := LoadGitHubToken(); err == nil && apiKey != "" {
+        viper.SetDefault("providers.copilot.apiKey", apiKey)
+        if viper.GetString("providers.copilot.apiKey") == "" {
+            viper.Set("providers.copilot.apiKey", apiKey)
+        }
+    }
 
     // Use this order to set the default models
-    // 1. Anthropic
-    // 2. OpenAI
-    // 3. Google Gemini
-    // 4. Groq
-    // 5. OpenRouter
-    // 6. AWS Bedrock
-    // 7. Azure
-    // 8. Google Cloud VertexAI
+    // 1. Copilot
+    // 2. Anthropic
+    // 3. OpenAI
+    // 4. Google Gemini
+    // 5. Groq
+    // 6. OpenRouter
+    // 7. AWS Bedrock
+    // 8. Azure
+    // 9. Google Cloud VertexAI
+
+    // copilot configuration
+    if key := viper.GetString("providers.copilot.apiKey"); strings.TrimSpace(key) != "" {
+        viper.SetDefault("agents.coder.model", models.CopilotGPT4o)
+        viper.SetDefault("agents.summarizer.model", models.CopilotGPT4o)
+        viper.SetDefault("agents.task.model", models.CopilotGPT4o)
+        viper.SetDefault("agents.title.model", models.CopilotGPT4o)
+        return
+    }
 
     // Anthropic configuration
     if key := viper.GetString("providers.anthropic.apiKey"); strings.TrimSpace(key) != "" {
@@ -399,6 +425,14 @@ func hasVertexAICredentials() bool {
     return false
 }
 
+func hasCopilotCredentials() bool {
+    // Check for explicit Copilot parameters
+    if token, _ := LoadGitHubToken(); token != "" {
+        return true
+    }
+    return false
+}
+
 // readConfig handles the result of reading a configuration file.
 func readConfig(err error) error {
     if err == nil {
@@ -440,6 +474,9 @@ func applyDefaultValues() {
 // It validates model IDs and providers, ensuring they are supported.
 func validateAgent(cfg *Config, name AgentName, agent Agent) error {
     // Check if model exists
+    // TODO: If a copilot model is specified, but model is not found,
+    // it might be new model. The https://api.githubcopilot.com/models
+    // endpoint should be queried to validate if the model is supported.
     model, modelExists := models.SupportedModels[agent.Model]
     if !modelExists {
         logging.Warn("unsupported model configured, reverting to default",
@@ -584,6 +621,7 @@ func Validate() error {
     // Validate providers
     for provider, providerCfg := range cfg.Providers {
         if providerCfg.APIKey == "" && !providerCfg.Disabled {
+            fmt.Printf("provider has no API key, marking as disabled %s", provider)
             logging.Warn("provider has no API key, marking as disabled", "provider", provider)
             providerCfg.Disabled = true
             cfg.Providers[provider] = providerCfg
@@ -631,6 +669,18 @@ func getProviderAPIKey(provider models.ModelProvider) string {
 
 // setDefaultModelForAgent sets a default model for an agent based on available providers
 func setDefaultModelForAgent(agent AgentName) bool {
+    if hasCopilotCredentials() {
+        maxTokens := int64(5000)
+        if agent == AgentTitle {
+            maxTokens = 80
+        }
+
+        cfg.Agents[agent] = Agent{
+            Model:     models.CopilotGPT4o,
+            MaxTokens: maxTokens,
+        }
+        return true
+    }
     // Check providers in order of preference
     if apiKey := os.Getenv("ANTHROPIC_API_KEY"); apiKey != "" {
         maxTokens := int64(5000)
@@ -878,3 +928,53 @@ func UpdateTheme(themeName string) error {
         config.TUI.Theme = themeName
     })
 }
+
+// Tries to load Github token from all possible locations
+func LoadGitHubToken() (string, error) {
+    // First check environment variable
+    if token := os.Getenv("GITHUB_TOKEN"); token != "" {
+        return token, nil
+    }
+
+    // Get config directory
+    var configDir string
+    if xdgConfig := os.Getenv("XDG_CONFIG_HOME"); xdgConfig != "" {
+        configDir = xdgConfig
+    } else if runtime.GOOS == "windows" {
+        if localAppData := os.Getenv("LOCALAPPDATA"); localAppData != "" {
+            configDir = localAppData
+        } else {
+            configDir = filepath.Join(os.Getenv("HOME"), "AppData", "Local")
+        }
+    } else {
+        configDir = filepath.Join(os.Getenv("HOME"), ".config")
+    }
+
+    // Try both hosts.json and apps.json files
+    filePaths := []string{
+        filepath.Join(configDir, "github-copilot", "hosts.json"),
+        filepath.Join(configDir, "github-copilot", "apps.json"),
+    }
+
+    for _, filePath := range filePaths {
+        data, err := os.ReadFile(filePath)
+        if err != nil {
+            continue
+        }
+
+        var config map[string]map[string]interface{}
+        if err := json.Unmarshal(data, &config); err != nil {
+            continue
+        }
+
+        for key, value := range config {
+            if strings.Contains(key, "github.com") {
+                if oauthToken, ok := value["oauth_token"].(string); ok {
+                    return oauthToken, nil
+                }
+            }
+        }
+    }
+
+    return "", fmt.Errorf("GitHub token not found in standard locations")
+}
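
For reference, the only shape LoadGitHubToken relies on in hosts.json / apps.json is a top-level key containing "github.com" whose value carries an "oauth_token" string. A minimal sketch of such a file follows; the token is a placeholder, and files actually written by the Copilot plugins may contain additional fields this loader does not read:

{
  "github.com": {
    "oauth_token": "gho_XXXXXXXXXXXXXXXX"
  }
}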

internal/llm/agent/agent.go

Lines changed: 21 additions & 5 deletions
@@ -162,6 +162,7 @@ func (a *agent) generateTitle(ctx context.Context, sessionID string, content str
     if err != nil {
         return err
     }
+    ctx = context.WithValue(ctx, tools.SessionIDContextKey, sessionID)
     parts := []message.ContentPart{message.TextContent{Text: content}}
     response, err := a.titleProvider.SendMessages(
         ctx,
@@ -230,6 +231,7 @@ func (a *agent) Run(ctx context.Context, sessionID string, content string, attac
 }
 
 func (a *agent) processGeneration(ctx context.Context, sessionID, content string, attachmentParts []message.ContentPart) AgentEvent {
+    cfg := config.Get()
     // List existing messages; if none, start title generation asynchronously.
     msgs, err := a.messages.List(ctx, sessionID)
     if err != nil {
@@ -288,7 +290,13 @@ func (a *agent) processGeneration(ctx context.Context, sessionID, content string
         }
         return a.err(fmt.Errorf("failed to process events: %w", err))
     }
-    logging.Info("Result", "message", agentMessage.FinishReason(), "toolResults", toolResults)
+    if cfg.Debug {
+        seqId := (len(msgHistory) + 1) / 2
+        toolResultFilepath := logging.WriteToolResultsJson(sessionID, seqId, toolResults)
+        logging.Info("Result", "message", agentMessage.FinishReason(), "toolResults", "{}", "filepath", toolResultFilepath)
+    } else {
+        logging.Info("Result", "message", agentMessage.FinishReason(), "toolResults", toolResults)
+    }
     if (agentMessage.FinishReason() == message.FinishReasonToolUse) && toolResults != nil {
         // We are not done, we need to respond with the tool response
         msgHistory = append(msgHistory, agentMessage, *toolResults)
@@ -312,6 +320,7 @@ func (a *agent) createUserMessage(ctx context.Context, sessionID, content string
 }
 
 func (a *agent) streamAndHandleEvents(ctx context.Context, sessionID string, msgHistory []message.Message) (message.Message, *message.Message, error) {
+    ctx = context.WithValue(ctx, tools.SessionIDContextKey, sessionID)
     eventChan := a.provider.StreamResponse(ctx, msgHistory, a.tools)
 
     assistantMsg, err := a.messages.Create(ctx, sessionID, message.CreateMessageParams{
@@ -325,7 +334,6 @@ func (a *agent) streamAndHandleEvents(ctx context.Context, sessionID string, msg
 
     // Add the session and message ID into the context if needed by tools.
     ctx = context.WithValue(ctx, tools.MessageIDContextKey, assistantMsg.ID)
-    ctx = context.WithValue(ctx, tools.SessionIDContextKey, sessionID)
 
     // Process each event in the stream.
     for event := range eventChan {
@@ -357,10 +365,17 @@ func (a *agent) streamAndHandleEvents(ctx context.Context, sessionID string, msg
         default:
             // Continue processing
             var tool tools.BaseTool
-            for _, availableTools := range a.tools {
-                if availableTools.Info().Name == toolCall.Name {
-                    tool = availableTools
+            for _, availableTool := range a.tools {
+                if availableTool.Info().Name == toolCall.Name {
+                    tool = availableTool
+                    break
                 }
+                // Monkey patch for Copilot Sonnet-4 tool repetition obfuscation
+                // if strings.HasPrefix(toolCall.Name, availableTool.Info().Name) &&
+                //     strings.HasPrefix(toolCall.Name, availableTool.Info().Name+availableTool.Info().Name) {
+                //     tool = availableTool
+                //     break
+                // }
             }
 
             // Tool not found
@@ -553,6 +568,7 @@ func (a *agent) Summarize(ctx context.Context, sessionID string) error {
             a.Publish(pubsub.CreatedEvent, event)
             return
         }
+        summarizeCtx = context.WithValue(summarizeCtx, tools.SessionIDContextKey, sessionID)
 
         if len(msgs) == 0 {
             event = AgentEvent{
