A Go SDK for programmatic access to the GitHub Copilot CLI.
Note: This SDK is in technical preview and may change in breaking ways.
Install:

```sh
go get github.com/github/copilot-sdk/go
```

Quick start:

```go
package main

import (
	"fmt"
	"log"

	copilot "github.com/github/copilot-sdk/go"
)

func main() {
	// Create client
	client := copilot.NewClient(&copilot.ClientOptions{
		LogLevel: "error",
	})

	// Start the client
	if err := client.Start(); err != nil {
		log.Fatal(err)
	}
	defer client.Stop()

	// Create a session
	session, err := client.CreateSession(&copilot.SessionConfig{
		Model: "gpt-5",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer session.Destroy()

	// Set up event handler
	done := make(chan bool)
	session.On(func(event copilot.SessionEvent) {
		if event.Type == "assistant.message" {
			if event.Data.Content != nil {
				fmt.Println(*event.Data.Content)
			}
		}
		if event.Type == "session.idle" {
			close(done)
		}
	})

	// Send a message
	_, err = session.Send(copilot.MessageOptions{
		Prompt: "What is 2+2?",
	})
	if err != nil {
		log.Fatal(err)
	}

	// Wait for completion
	<-done
}
```

Client methods:

- `NewClient(options *ClientOptions) *Client` - Create a new client
- `Start() error` - Start the CLI server
- `Stop() []error` - Stop the CLI server (returns a slice of errors, empty if all succeeded)
- `ForceStop()` - Forcefully stop without graceful cleanup
- `CreateSession(config *SessionConfig) (*Session, error)` - Create a new session
- `ResumeSession(sessionID string) (*Session, error)` - Resume an existing session
- `ResumeSessionWithOptions(sessionID string, config *ResumeSessionConfig) (*Session, error)` - Resume with additional configuration
- `ListSessions() ([]SessionMetadata, error)` - List all sessions known to the server
- `DeleteSession(sessionID string) error` - Delete a session permanently
- `GetState() ConnectionState` - Get connection state
- `Ping(message string) (*PingResponse, error)` - Ping the server
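As a sketch of the utility methods above (this assumes the started `client` from the quick start; `SessionMetadata` fields are printed generically since they depend on the CLI version):

```go
// Check connectivity and enumerate sessions on the server.
if _, err := client.Ping("hello"); err != nil {
	log.Fatal(err)
}
fmt.Println(client.GetState()) // current ConnectionState

sessions, err := client.ListSessions()
if err != nil {
	log.Fatal(err)
}
for _, meta := range sessions {
	fmt.Printf("%+v\n", meta)
}
```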
ClientOptions:
- `CLIPath` (string): Path to the CLI executable (default: `"copilot"` or the `COPILOT_CLI_PATH` env var)
- `CLIUrl` (string): URL of an existing CLI server (e.g., `"localhost:8080"`, `"http://127.0.0.1:9000"`, or just `"8080"`). When provided, the client will not spawn a CLI process.
- `Cwd` (string): Working directory for the CLI process
- `Port` (int): Server port for TCP mode (default: 0 for a random port)
- `UseStdio` (bool): Use stdio transport instead of TCP (default: true)
- `LogLevel` (string): Log level (default: "info")
- `AutoStart` (*bool): Auto-start the server on first use (default: true). Use `Bool(false)` to disable.
- `AutoRestart` (*bool): Auto-restart on crash (default: true). Use `Bool(false)` to disable.
- `Env` ([]string): Environment variables for the CLI process (default: inherits from the current process)
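Combining a few of these options (the CLI path and working directory here are placeholders):

```go
client := copilot.NewClient(&copilot.ClientOptions{
	CLIPath:     "/usr/local/bin/copilot", // placeholder path
	Cwd:         "/path/to/project",       // placeholder working directory
	LogLevel:    "error",
	AutoRestart: copilot.Bool(false), // pointer option set via the Bool helper
})
```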
SessionConfig:
- `Model` (string): Model to use ("gpt-5", "claude-sonnet-4.5", etc.). Required when using a custom provider.
- `SessionID` (string): Custom session ID
- `Tools` ([]Tool): Custom tools exposed to the CLI
- `SystemMessage` (*SystemMessageConfig): System message configuration
- `Provider` (*ProviderConfig): Custom API provider configuration (BYOK). See the Custom Providers section.
- `Streaming` (bool): Enable streaming delta events
- `InfiniteSessions` (*InfiniteSessionConfig): Automatic context compaction configuration
ResumeSessionConfig:
- `Tools` ([]Tool): Tools to expose when resuming
- `Provider` (*ProviderConfig): Custom API provider configuration (BYOK). See the Custom Providers section.
- `Streaming` (bool): Enable streaming delta events
Session methods:

- `Send(options MessageOptions) (string, error)` - Send a message
- `On(handler SessionEventHandler) func()` - Subscribe to events (returns an unsubscribe function)
- `Abort() error` - Abort the currently processing message
- `GetMessages() ([]SessionEvent, error)` - Get message history
- `Destroy() error` - Destroy the session
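Since `On` returns an unsubscribe function, a handler can detach itself, e.g. once the session goes idle (a sketch using the signatures above):

```go
var unsubscribe func()
unsubscribe = session.On(func(event copilot.SessionEvent) {
	if event.Type == "session.idle" {
		unsubscribe() // stop receiving events for this handler
	}
})

// Later: inspect the recorded history.
history, err := session.GetMessages()
if err != nil {
	log.Fatal(err)
}
fmt.Printf("%d events in history\n", len(history))
```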
Helpers:

- `Bool(v bool) *bool` - Helper to create bool pointers for the `AutoStart`/`AutoRestart` options
The SDK supports image attachments via the `Attachments` field in `MessageOptions`. You can attach images by providing their file path:
```go
_, err = session.Send(copilot.MessageOptions{
	Prompt: "What's in this image?",
	Attachments: []copilot.Attachment{
		{
			Type: "file",
			Path: "/path/to/image.jpg",
		},
	},
})
```

Supported image formats include JPG, PNG, GIF, and other common image types. The agent's view tool can also read images directly from the filesystem, so you can also ask questions like:
```go
_, err = session.Send(copilot.MessageOptions{
	Prompt: "What does the most recent jpg in this directory portray?",
})
```

Expose your own functionality to Copilot by attaching tools to a session.
Use `DefineTool` for type-safe tools with automatic JSON schema generation:
```go
type LookupIssueParams struct {
	ID string `json:"id" jsonschema:"Issue identifier"`
}

lookupIssue := copilot.DefineTool("lookup_issue", "Fetch issue details from our tracker",
	func(params LookupIssueParams, inv copilot.ToolInvocation) (any, error) {
		// params is automatically unmarshaled from the LLM's arguments
		issue, err := fetchIssue(params.ID)
		if err != nil {
			return nil, err
		}
		return issue.Summary, nil
	})

session, _ := client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-5",
	Tools: []copilot.Tool{lookupIssue},
})
```

For more control over the JSON schema, use the `Tool` struct directly:
```go
lookupIssue := copilot.Tool{
	Name:        "lookup_issue",
	Description: "Fetch issue details from our tracker",
	Parameters: map[string]interface{}{
		"type": "object",
		"properties": map[string]interface{}{
			"id": map[string]interface{}{
				"type":        "string",
				"description": "Issue identifier",
			},
		},
		"required": []string{"id"},
	},
	Handler: func(invocation copilot.ToolInvocation) (copilot.ToolResult, error) {
		args := invocation.Arguments.(map[string]interface{})
		issue, err := fetchIssue(args["id"].(string))
		if err != nil {
			return copilot.ToolResult{}, err
		}
		return copilot.ToolResult{
			TextResultForLLM: issue.Summary,
			ResultType:       "success",
			SessionLog:       fmt.Sprintf("Fetched issue %s", issue.ID),
		}, nil
	},
}

session, _ := client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-5",
	Tools: []copilot.Tool{lookupIssue},
})
```

When the model selects a tool, the SDK automatically runs your handler (in parallel with other tool calls) and responds to the CLI's `tool.call` with the handler's result.
Enable streaming to receive assistant response chunks as they're generated:
```go
package main

import (
	"fmt"
	"log"

	copilot "github.com/github/copilot-sdk/go"
)

func main() {
	client := copilot.NewClient(nil)
	if err := client.Start(); err != nil {
		log.Fatal(err)
	}
	defer client.Stop()

	session, err := client.CreateSession(&copilot.SessionConfig{
		Model:     "gpt-5",
		Streaming: true,
	})
	if err != nil {
		log.Fatal(err)
	}
	defer session.Destroy()

	done := make(chan bool)
	session.On(func(event copilot.SessionEvent) {
		if event.Type == "assistant.message_delta" {
			// Streaming message chunk - print incrementally
			if event.Data.DeltaContent != nil {
				fmt.Print(*event.Data.DeltaContent)
			}
		} else if event.Type == "assistant.reasoning_delta" {
			// Streaming reasoning chunk (if the model supports reasoning)
			if event.Data.DeltaContent != nil {
				fmt.Print(*event.Data.DeltaContent)
			}
		} else if event.Type == "assistant.message" {
			// Final message - complete content
			fmt.Println("\n--- Final message ---")
			if event.Data.Content != nil {
				fmt.Println(*event.Data.Content)
			}
		} else if event.Type == "assistant.reasoning" {
			// Final reasoning content (if the model supports reasoning)
			fmt.Println("--- Reasoning ---")
			if event.Data.Content != nil {
				fmt.Println(*event.Data.Content)
			}
		}
		if event.Type == "session.idle" {
			close(done)
		}
	})

	_, err = session.Send(copilot.MessageOptions{
		Prompt: "Tell me a short story",
	})
	if err != nil {
		log.Fatal(err)
	}

	<-done
}
```

When `Streaming: true`:
- `assistant.message_delta` events are sent with `DeltaContent` containing incremental text
- `assistant.reasoning_delta` events are sent with `DeltaContent` for reasoning/chain-of-thought (model-dependent)
- Accumulate `DeltaContent` values to build the full response progressively
- The final `assistant.message` and `assistant.reasoning` events contain the complete content

Note: `assistant.message` and `assistant.reasoning` (final events) are always sent regardless of the streaming setting.
By default, sessions use infinite sessions, which automatically manage context window limits through background compaction and persist state to a workspace directory.
```go
// Default: infinite sessions enabled with default thresholds
session, _ := client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-5",
})

// Access the workspace path for checkpoints and files
fmt.Println(session.WorkspacePath())
// => ~/.copilot/session-state/{sessionId}/

// Custom thresholds
session, _ = client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-5",
	InfiniteSessions: &copilot.InfiniteSessionConfig{
		Enabled:                       copilot.Bool(true),
		BackgroundCompactionThreshold: copilot.Float64(0.80), // Start compacting at 80% context usage
		BufferExhaustionThreshold:     copilot.Float64(0.95), // Block at 95% until compaction completes
	},
})

// Disable infinite sessions
session, _ = client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-5",
	InfiniteSessions: &copilot.InfiniteSessionConfig{
		Enabled: copilot.Bool(false),
	},
})
```

When enabled, sessions emit compaction events:
- `session.compaction_start` - Background compaction started
- `session.compaction_complete` - Compaction finished (includes token counts)
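These arrive through the same `On` event handler as other session events (a sketch):

```go
session.On(func(event copilot.SessionEvent) {
	switch event.Type {
	case "session.compaction_start":
		log.Println("background compaction started")
	case "session.compaction_complete":
		log.Println("compaction finished")
	}
})
```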
The SDK supports custom OpenAI-compatible API providers (BYOK - Bring Your Own Key), including local providers like Ollama. When using a custom provider, you must specify the `Model` explicitly.
ProviderConfig:
- `Type` (string): Provider type - "openai", "azure", or "anthropic" (default: "openai")
- `BaseURL` (string): API endpoint URL (required)
- `APIKey` (string): API key (optional for local providers like Ollama)
- `BearerToken` (string): Bearer token for authentication (takes precedence over `APIKey`)
- `WireApi` (string): API format for OpenAI/Azure - "completions" or "responses" (default: "completions")
- `Azure.APIVersion` (string): Azure API version (default: "2024-10-21")
Example with Ollama:
```go
session, err := client.CreateSession(&copilot.SessionConfig{
	Model: "deepseek-coder-v2:16b", // Required when using a custom provider
	Provider: &copilot.ProviderConfig{
		Type:    "openai",
		BaseURL: "http://localhost:11434/v1", // Ollama endpoint
		// APIKey not required for Ollama
	},
})
```

Example with a custom OpenAI-compatible API:
```go
session, err := client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-4",
	Provider: &copilot.ProviderConfig{
		Type:    "openai",
		BaseURL: "https://my-api.example.com/v1",
		APIKey:  os.Getenv("MY_API_KEY"),
	},
})
```

Example with Azure OpenAI:
```go
session, err := client.CreateSession(&copilot.SessionConfig{
	Model: "gpt-4",
	Provider: &copilot.ProviderConfig{
		Type:    "azure",                                  // Must be "azure" for Azure endpoints, NOT "openai"
		BaseURL: "https://my-resource.openai.azure.com",   // Just the host, no path
		APIKey:  os.Getenv("AZURE_OPENAI_KEY"),
		Azure: &copilot.AzureProviderOptions{
			APIVersion: "2024-10-21",
		},
	},
})
```

Important notes:

- When using a custom provider, the `Model` parameter is required. The SDK will return an error if no model is specified.
- For Azure OpenAI endpoints (`*.openai.azure.com`), you must use `Type: "azure"`, not `Type: "openai"`.
- The `BaseURL` should be just the host (e.g., `https://my-resource.openai.azure.com`). Do not include `/openai/v1` in the URL - the SDK handles path construction automatically.
Stdio transport (default): communicates with the CLI via stdin/stdout pipes. Recommended for most use cases.

```go
client := copilot.NewClient(nil) // Uses stdio by default
```

TCP transport: communicates with the CLI via a TCP socket. Useful for distributed scenarios.
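For example, attaching to an already-running CLI server rather than spawning one (the address is illustrative):

```go
client := copilot.NewClient(&copilot.ClientOptions{
	CLIUrl: "localhost:8080", // existing CLI server; no process is spawned
})
```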
Environment variables:

- `COPILOT_CLI_PATH` - Path to the Copilot CLI executable
License: MIT