
Add Responses API compatibility alongside chat/completions #232

@toreleon

Description

Problem

The proxy currently exposes /chat/completions, /v1/chat/completions, and related compatibility routes, but it does not expose the OpenAI Responses API routes. This blocks models that are available only through the Responses API, such as gpt-5.3-codex, even when the underlying GitHub Copilot account can use them.

Proposed change

  • add /responses
  • add /v1/responses
  • forward requests to the upstream GitHub Copilot /responses endpoint
  • include verbose logging and token-count reporting similar to the existing chat-completions handler
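The forwarding step above could look roughly like the following sketch. This is not the code from #230; names such as COPILOT_BASE, upstreamUrl, and forwardResponses are illustrative assumptions, and the actual handler would also carry the proxy's auth, logging, and token-count reporting.

```typescript
// Hypothetical sketch of the /responses route mapping and forwarding.
// COPILOT_BASE, upstreamUrl, and forwardResponses are assumed names,
// not the identifiers used in the actual proxy.
const COPILOT_BASE = "https://api.githubcopilot.com";

// Map an incoming proxy path to the upstream Copilot path, or null
// if the path is not a Responses API route.
export function upstreamUrl(path: string): string | null {
  if (path === "/responses" || path === "/v1/responses") {
    return `${COPILOT_BASE}/responses`;
  }
  return null;
}

// Forward the request body and authorization header upstream,
// returning the upstream response (including streamed bodies) as-is.
export async function forwardResponses(req: Request): Promise<Response> {
  const target = upstreamUrl(new URL(req.url).pathname);
  if (!target) return new Response("Not Found", { status: 404 });
  return fetch(target, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      authorization: req.headers.get("authorization") ?? "",
    },
    body: req.body,
  });
}
```

A Bun-style server would then dispatch both /responses and /v1/responses through forwardResponses while leaving the existing chat-completions routes untouched.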

Related PR

Implemented here: #230

Validation

  • source build passes with bun run build
  • locally verified gpt-5.3-codex and gpt-5.4-mini through /v1/responses
  • verified that gpt-5.3-codex is not usable via /chat/completions, so Responses support is required
