Add Anthropic count_tokens support, fix token usage reporting, and update Claude Code settings #115
Merged
ericc-ch merged 5 commits into ericc-ch:master on Oct 4, 2025

Conversation
- Add new command line option `--claude-code-env` to generate environment variables for Claude Code
- Update `runServer` function to use new `claudeCodeEnv` option
- Add `claude-code-env` to `start` command options in README.md
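To illustrate the idea, a `--claude-code-env` style option can be sketched as a function that emits shell `export` lines pointing Claude Code at a local copilot-api server. This is a minimal sketch: the function name, parameters, and the exact set of variables emitted by the real `runServer` are assumptions here, though the variable names mirror the settings discussed in this PR.

```typescript
// Hypothetical helper: build shell `export` lines so Claude Code talks to a
// local copilot-api server. The exact variables the PR emits may differ.
function claudeCodeEnv(baseUrl: string, model: string): string {
  const vars: Record<string, string> = {
    ANTHROPIC_BASE_URL: baseUrl, // point the Anthropic client at the proxy
    ANTHROPIC_DEFAULT_SONNET_MODEL: model,
    ANTHROPIC_DEFAULT_HAIKU_MODEL: model,
  }
  return Object.entries(vars)
    .map(([key, value]) => `export ${key}="${value}"`)
    .join("\n")
}

console.log(claudeCodeEnv("http://localhost:4141", "gpt-4.1"))
```

Emitting `export` lines (rather than setting variables in-process) lets the user `eval` the output in their own shell before launching Claude Code.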
Force-pushed from ce6f058 to e0c83ee
Contributor (Author): When `message_start` is missing `usage.input_tokens`, WebFetch gets the error `API Error: Cannot read properties of undefined (reading 'input_tokens')`. So I force-pushed to revoke the latest commit.
Force-pushed from 9015b1c to 574d47a
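The crash above comes from reading `usage.input_tokens` off an event whose `usage` is absent. A minimal sketch of the defensive fix, assuming a simplified event shape (the real event types in this repo may differ):

```typescript
// Assumed, simplified shape of a message_start event; usage may be absent.
interface MessageStartEvent {
  message?: {
    usage?: { input_tokens?: number; output_tokens?: number }
  }
}

function readInputTokens(event: MessageStartEvent): number {
  // Optional chaining plus a default avoids
  // "Cannot read properties of undefined (reading 'input_tokens')"
  // and keeps downstream token math safe.
  return event.message?.usage?.input_tokens ?? 0
}

console.log(readInputTokens({ message: { usage: { input_tokens: 42 } } })) // 42
console.log(readInputTokens({ message: {} })) // 0
```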
dylbarne reviewed on Oct 1, 2025
- "ANTHROPIC_MODEL": "gpt-4.1",
- "ANTHROPIC_SMALL_FAST_MODEL": "gpt-4.1"
+ "ANTHROPIC_DEFAULT_SONNET_MODEL": "gpt-4.1",
+ "ANTHROPIC_SMALL_FAST_MODEL": "gpt-4.1",
`ANTHROPIC_SMALL_FAST_MODEL` can probably be removed here, plus the corresponding start.ts; Anthropic has deprecated it in favor of `ANTHROPIC_DEFAULT_HAIKU_MODEL`, which you have now also included.
I think this can also close #89 then, since the original question and follow-up comment are addressed by this PR.
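For context, these keys belong in the `env` block of a Claude Code settings file; a minimal sketch, assuming the surrounding structure (the real file may carry other keys):

```json
{
  "env": {
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "gpt-4.1",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "gpt-4.1"
  }
}
```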
Contributor (Author): `ANTHROPIC_SMALL_FAST_MODEL` is kept for compatibility with older versions.
Owner: Man, this is a godsend PR :)
hao1939 pushed a commit to hao1939/copilot-api that referenced this pull request on Nov 22, 2025: feature about anthropic count token and fix token usage problem and update claude code settings
This pull request introduces major improvements to token counting and model compatibility for Anthropic and OpenAI-style chat and tool calls. The most significant changes include a complete rewrite of the token counting logic to support multiple model tokenizers, improved handling of tool tokens, and new endpoints for Anthropic-compatible token counting. Several translation functions now properly account for cached tokens, ensuring more accurate usage reporting.
Token counting and model compatibility improvements:
- Rewrote `src/lib/tokenizer.ts` to support multiple GPT encoding schemes, more accurate token counting for messages and tools, and dynamic model-based constants. The new implementation allows flexible token calculation across Anthropic and OpenAI models, including tool call and parameter support.

Anthropic endpoint and translation enhancements:
- Added a new `/v1/messages/count_tokens` endpoint, with handler `handleCountTokens` in `src/routes/messages/count-tokens-handler.ts`, to accurately compute input tokens for Anthropic requests, including model-specific adjustments and tool bonuses. [1] [2] [3] [4]
- Updated `non-stream-translation.ts` and `stream-translation.ts` to subtract cached tokens from input token counts and include detailed cache token usage in the response, improving Anthropic usage reporting. [1] [2] [3]

Model and payload type improvements:
- Updated the `Model` interface in `src/services/copilot/get-models.ts` to support new capabilities and ensure compatibility with the updated tokenizer.
- Updated the `ChatCompletionResponse` type in `src/services/copilot/create-chat-completions.ts` to include `prompt_tokens_details`, allowing reporting of cached token usage.

Configuration and setup updates:
- Updated `README.md` and `src/start.ts` to include new environment variables for model selection and non-essential traffic disabling, supporting the expanded Anthropic model options. [1] [2]
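The cached-token adjustment described above can be sketched as a small translation step: the upstream usage reports total prompt tokens plus a cached subset, and the Anthropic-style response breaks cache reads out of `input_tokens`. The field names follow the public OpenAI and Anthropic usage shapes; the exact mapping in this PR's translation code is an assumption here.

```typescript
// Upstream usage shape: total prompt tokens, with cached tokens as a detail.
interface OpenAIUsage {
  prompt_tokens: number
  completion_tokens: number
  prompt_tokens_details?: { cached_tokens?: number }
}

// Anthropic-style usage: cache reads reported separately from input tokens.
interface AnthropicUsage {
  input_tokens: number
  output_tokens: number
  cache_read_input_tokens: number
}

function translateUsage(usage: OpenAIUsage): AnthropicUsage {
  const cached = usage.prompt_tokens_details?.cached_tokens ?? 0
  return {
    // Subtract cached tokens so input_tokens counts only uncached input.
    input_tokens: usage.prompt_tokens - cached,
    output_tokens: usage.completion_tokens,
    cache_read_input_tokens: cached,
  }
}

console.log(
  translateUsage({
    prompt_tokens: 120,
    completion_tokens: 30,
    prompt_tokens_details: { cached_tokens: 100 },
  }),
) // { input_tokens: 20, output_tokens: 30, cache_read_input_tokens: 100 }
```

Without the subtraction, cached tokens would be double-counted: once inside `input_tokens` and again in the cache field, which is the inaccurate usage reporting this PR fixes.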