# Fix missing model field in first streaming response for OpenAI API compatibility #1
Closed
Copilot AI changed the title from *[WIP] Fix issue in fork of copilot API* to *Fix missing model field in first streaming response for OpenAI API compatibility* on Oct 17, 2025.
## Problem

The OpenAI Chat Completions API requires that each Server-Sent Event (SSE) response contain the fields `choices`, `id`, and `model`. However, the first streaming response from this proxy was missing the `model` field, causing compatibility issues with OpenAI-compatible clients.

Here's an example of what the first response looked like:

```json
{
  "choices": [],
  "created": 0,
  "id": "",
  "prompt_filter_results": [...]
}
```

Notice the missing `model` field, which should be present according to the OpenAI API specification.

## Root Cause
In `src/routes/chat-completions/handler.ts`, the streaming handler was incorrectly passing raw SSE event objects directly to `stream.writeSSE()` without extracting the `data` field. The variable `chunk` is actually a `ServerSentEventMessage` object from `fetch-event-stream`, which wraps the payload in a `data` property. By casting and passing this raw event object directly, the handler was sending malformed SSE events to clients. The actual chat completion data (which includes the `model` field) is in the `data` property as a JSON string.

## Solution
This PR fixes the issue by properly extracting the `data` field from raw SSE events before forwarding them to clients. This approach:

- Extracts the `data` field, which contains the properly formatted JSON from GitHub Copilot (including the `model` field)
- Preserves the `[DONE]` sentinel that marks the end of the stream
- Matches the pattern used in `src/routes/messages/handler.ts` for Anthropic API streaming
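The before/after can be sketched as follows. This is a sketch, not the PR's exact diff: the `ServerSentEventMessage` field list and the `extractSSEData` helper name are assumptions based on the description above, and the model name is made up for illustration.

```typescript
// Assumed shape of a raw event from fetch-event-stream.
interface ServerSentEventMessage {
  id?: string;
  event?: string;
  data?: string;
}

// Before (buggy): the whole event object was forwarded,
//   stream.writeSSE(rawEvent as SSEMessage)
// so clients received the wrapper instead of the completion JSON.

// After (fixed): forward only the `data` string, which holds the
// completion JSON (including `model`) or the "[DONE]" sentinel.
function extractSSEData(rawEvent: ServerSentEventMessage): string | undefined {
  return rawEvent.data;
}

// Illustrative first event; values are examples, not captured output.
const rawEvent: ServerSentEventMessage = {
  data: '{"choices":[],"created":0,"id":"","model":"gpt-4o","prompt_filter_results":[]}',
};

const payload = extractSSEData(rawEvent);
if (payload !== undefined && payload !== "[DONE]") {
  const chunk = JSON.parse(payload);
  console.log(chunk.model); // prints "gpt-4o" — the field now reaches clients
}
```

In the real handler, the extracted string would be passed to `stream.writeSSE()` as the event's `data`, leaving the `[DONE]` sentinel untouched so clients still detect the end of the stream.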
## Changes

- Updated `src/routes/chat-completions/handler.ts` to properly parse and forward SSE events
- Updated the `SSEMessage` type import
- Renamed `chunk` to `rawEvent` for clarity

## Testing
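The first-chunk requirement can be checked with a small sketch like the one below. The `isValidFirstChunk` helper is not from the PR; the field names follow the OpenAI Chat Completions streaming format, and the sample strings are illustrative rather than captured proxy output.

```typescript
// Returns true when a first streamed chunk carries the fields the
// OpenAI spec requires: choices, id, and model.
function isValidFirstChunk(sseData: string): boolean {
  const chunk = JSON.parse(sseData);
  return (
    Array.isArray(chunk.choices) &&
    typeof chunk.id === "string" &&
    typeof chunk.model === "string"
  );
}

// The broken response from the Problem section fails the check:
console.log(
  isValidFirstChunk('{"choices":[],"created":0,"id":"","prompt_filter_results":[]}'),
); // → false

// With the fix, the forwarded `data` includes `model` and passes:
console.log(
  isValidFirstChunk('{"choices":[],"created":0,"id":"chatcmpl-1","model":"gpt-4o"}'),
); // → true
```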
Fixes ericc-ch#133