## Problem

The proxy currently exposes `/chat/completions`, `/v1/chat/completions`, and related compatibility routes, but it does not expose the OpenAI Responses API routes. This blocks models that are available only through the Responses API, such as `gpt-5.3-codex`, even when the underlying GitHub Copilot account can use them.
## Proposed change

- Add `/responses`
- Add `/v1/responses`
- Forward requests to the upstream GitHub Copilot `/responses` endpoint
- Include verbose logging and token-count reporting, similar to the existing chat-completions handler
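The route mapping described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the upstream base URL and the `upstreamUrl` helper name are assumptions.

```typescript
// Hypothetical sketch: map both new compatibility paths to the single
// upstream Copilot endpoint. The base URL below is an assumption.
const COPILOT_BASE = "https://api.githubcopilot.com";

function upstreamUrl(path: string): string {
  // Both `/responses` and `/v1/responses` forward to the same upstream route,
  // mirroring how the existing `/chat/completions` aliases behave.
  if (path === "/responses" || path === "/v1/responses") {
    return `${COPILOT_BASE}/responses`;
  }
  throw new Error(`unsupported path: ${path}`);
}

console.log(upstreamUrl("/v1/responses"));
```

The actual handler would forward the request body and auth headers to this URL and, per the proposal, log the request verbosely and report token counts the same way the chat-completions handler does.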
## Related PR

Implemented here: #230
## Validation

- Source build passes with `bun run build`
- Locally verified `gpt-5.3-codex` and `gpt-5.4-mini` through `/v1/responses`
- Verified that `gpt-5.3-codex` is not usable via `/chat/completions`, so Responses support is required
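The local verification above could be reproduced with a request like the one below. The model name comes from this issue; the port and the exact payload shape are assumptions for illustration.

```shell
# Build a minimal Responses API request body (payload shape is an assumption):
cat > payload.json <<'EOF'
{
  "model": "gpt-5.3-codex",
  "input": "Say hello"
}
EOF

# With the proxy running locally (port is an assumption), the new route
# could then be exercised like:
#   curl -s http://localhost:4141/v1/responses \
#     -H "Content-Type: application/json" \
#     -d @payload.json
```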