A wrapper around the GitHub Copilot API that exposes an OpenAI-compatible interface, making it usable with other tools.
Demo video: `copilot-api-demo.mp4`
- Bun (>= 1.2.x)
- GitHub account with Copilot Individual subscription
To install dependencies, run:
```sh
bun install
```

You can run the project directly using npx:
```sh
npx copilot-api@latest
```

With options:
```sh
npx copilot-api --port 8080 --emulate-streaming
```

The project can be run from source in several ways:
```sh
bun run dev
```

Starts the server with hot reloading enabled, automatically restarting when code changes are detected. Ideal for development.
```sh
bun run start
```

Runs the server in production mode with hot reloading disabled. Use this for deployment or production environments.
The server accepts several command line options:
| Option | Description | Default |
|---|---|---|
| --help, -h | Show help message | false |
| --port, -p | Port to listen on | 4141 |
| --verbose, -v | Enable verbose logging | false |
| --log-file | File path for logging | - |
Example with options:
```sh
bun run start --port 8080 --emulate-streaming
```

In all cases, the server will start and listen for API requests on the specified port.
| Tool | Status | Notes |
|---|---|---|
| Aider | Full | Fully compatible |
| bolt.diy | Full | Fully compatible; use any random API key in UI if models fail to load |
| Page Assist | Full | Fully compatible |
| Kobold AI Lite | Full | Fully compatible |
Note: In general, any application that uses the standard OpenAI-compatible /chat/completions and /models endpoints should work with this API.
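As a minimal sketch of what such a request looks like, the snippet below builds an OpenAI-style chat completion request against a locally running instance using only Python's standard library. The port follows the default from the options table; the model id and API key are placeholder assumptions (query `/models` for the actual list, and substitute any key your client requires):

```python
import json
import urllib.request

# Assumed default port from the options table; adjust to your --port setting.
BASE_URL = "http://localhost:4141"

payload = {
    "model": "gpt-4o",  # hypothetical model id; check GET /models for real ones
    "messages": [{"role": "user", "content": "Say hello"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer dummy",  # some clients insist on a key; any value may do
    },
    method="POST",
)

# Sending the request requires the server to be running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the wire format matches OpenAI's, most client libraries can be pointed at this server simply by overriding their base URL.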