Part of #1. Depends on #4, #5.
Goal
Let Claude Code (and other Anthropic-shape clients) hit `/v1/messages` with an Anthropic body and have copilot-api dispatch all the way to upstream Copilot's `/responses` endpoint when the target model is Codex/o-pro/etc. Today `/v1/messages` only routes to `/chat/completions` (`src/routes/messages/anthropic-to-openai.ts` and friends).
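The dispatch decision described above can be sketched as a model-based endpoint selector. This is illustrative only: `RESPONSES_MODELS` and `selectUpstreamEndpoint` are assumed names, not copilot-api's actual API, and the model IDs in the set are placeholders.

```typescript
// Hypothetical set of models that Copilot serves via /responses rather
// than /chat/completions. The real list would come from the model-mode
// classifier in #5, not a hard-coded set.
const RESPONSES_MODELS = new Set(["gpt-5-codex", "o1-pro", "o3-pro"]);

type UpstreamEndpoint = "/chat/completions" | "/responses";

// Pick the upstream Copilot endpoint for a given target model.
function selectUpstreamEndpoint(model: string): UpstreamEndpoint {
  return RESPONSES_MODELS.has(model) ? "/responses" : "/chat/completions";
}
```

With a selector like this, `/v1/messages` keeps its current chat-completions path as the default and only branches into the new adapter for `responses`-mode models.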
Current state
The Anthropic adapter chain is exclusively chat-completions:
- `AnthropicMessagesPayload.thinking?` (`src/routes/messages/anthropic-types.ts:21–24`) is parsed but silently dropped.
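Instead of dropping `thinking`, the Responses-mode adapter could map it onto the Responses API's `reasoning.effort`. A minimal sketch, assuming a budget-to-effort mapping; the threshold values are invented for illustration and are not taken from copilot-api or litellm:

```typescript
// Simplified shape of Anthropic's extended-thinking config.
interface AnthropicThinking {
  type: "enabled" | "disabled";
  budget_tokens?: number;
}

type ReasoningEffort = "low" | "medium" | "high";

// Map an Anthropic thinking budget to a Responses reasoning effort.
// Thresholds below are assumptions, not upstream-defined values.
function thinkingToReasoningEffort(
  thinking?: AnthropicThinking,
): ReasoningEffort | undefined {
  if (!thinking || thinking.type !== "enabled") return undefined;
  const budget = thinking.budget_tokens ?? 0;
  if (budget >= 16_384) return "high";
  if (budget >= 4096) return "medium";
  return "low";
}
```

Whatever mapping is chosen, the point is that `thinking` should influence the outgoing `/responses` payload rather than being discarded.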
Tasks
Wire the model-mode classifier (Model-to-endpoint routing (chat vs responses), #5) into `/v1/messages` so that requests targeting `responses`-mode models go through the new Anthropic-to-Responses adapter instead of the chat-completions path.
Acceptance criteria
Reference impl
litellm: `litellm/llms/anthropic/experimental_pass_through/responses_adapters/transformation.py` — `LiteLLMAnthropicToResponsesAPIAdapter`
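The adapter referenced above translates an Anthropic Messages payload into a Responses API payload. A minimal sketch of that translation, covering only a system prompt plus plain-text turns (tool use, images, and thinking blocks omitted); the simplified interfaces here are assumptions, not litellm's or copilot-api's actual types:

```typescript
// Simplified Anthropic Messages payload (text-only content).
interface AnthropicMessage {
  role: "user" | "assistant";
  content: string; // real payloads also allow arrays of content blocks
}

interface AnthropicPayload {
  model: string;
  system?: string;
  max_tokens: number;
  messages: Array<AnthropicMessage>;
}

// Simplified OpenAI Responses payload.
interface ResponsesPayload {
  model: string;
  instructions?: string;
  max_output_tokens: number;
  input: Array<{ role: "user" | "assistant"; content: string }>;
}

// Translate the Anthropic body into a Responses body:
// `system` → `instructions`, `max_tokens` → `max_output_tokens`,
// `messages` → `input` items.
function anthropicToResponses(p: AnthropicPayload): ResponsesPayload {
  return {
    model: p.model,
    instructions: p.system,
    max_output_tokens: p.max_tokens,
    input: p.messages.map((m) => ({ role: m.role, content: m.content })),
  };
}
```

The real adapter additionally has to handle content blocks, tool calls, and streaming event translation, which is where most of the work in litellm's `LiteLLMAnthropicToResponsesAPIAdapter` lives.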