Commit c75a166

feat: add activity dashboard and live telemetry
1 parent 641fb1e commit c75a166

29 files changed: +4283 −307 lines

bun.lock

Lines changed: 5 additions & 0 deletions

logbook.md

Lines changed: 125 additions & 0 deletions
@@ -0,0 +1,125 @@
# Logbook

This branch was backfilled against `master`, which is the default branch in this repo. There is no local `main` branch.

- Branch point: `0ea08fe` (`chore: release v0.7.0`)
- Backfill window: 2026-03-08 through 2026-03-16
- Current head at backfill time: `641fb1e` (`Fix Bun streaming request timeouts`)
- Net branch diff vs `master`: 29 files changed, 2659 insertions, 33 deletions
## 2026-03-08

### Responses API support and model-level routing

Commits:

- `42214b8` `feat: port PR #205 — Responses API support and model-level routing`
- `b188c93` `fix: address review issues in Responses API implementation`
- `e408560` `feat: add gpt-5.4 with full level support via Responses API`
- `26cbf7f` `chore: update bun.lock`

What changed:

- Added OpenAI Responses API endpoints at `/responses` and `/v1/responses`.
- Added upstream Responses client support in `src/services/copilot/create-responses.ts`.
- Added translation between the chat completions and Responses payload shapes in `src/routes/chat-completions/responses-translation.ts`.
- Introduced model-level routing in `src/lib/model-level.ts`, including support for level suffixes and `gpt-5.4`.
- Updated model listing and routing so Responses-capable models are exposed and handled correctly.
- Followed up with review fixes and a lockfile refresh.
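As a rough sketch of what level-suffix resolution can look like, the function below splits a requested model id into a base model and an optional level. The type and function names are illustrative assumptions, not the actual `src/lib/model-level.ts` exports.

```typescript
// Hypothetical sketch of level-suffix routing; names are illustrative.
type ModelLevel = "low" | "medium" | "high";

interface ResolvedModel {
  baseModel: string;
  level?: ModelLevel;
}

const LEVELS: ReadonlySet<string> = new Set(["low", "medium", "high"]);

// Split a requested id like "gpt-5.4-high" into a base model plus level.
// Ids without a recognized suffix pass through unchanged.
function resolveModelLevel(requested: string): ResolvedModel {
  const idx = requested.lastIndexOf("-");
  if (idx > 0) {
    const suffix = requested.slice(idx + 1);
    if (LEVELS.has(suffix)) {
      return { baseModel: requested.slice(0, idx), level: suffix as ModelLevel };
    }
  }
  return { baseModel: requested };
}
```

A router built this way can list the suffixed variants alongside the base models while sending only the base id upstream.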
### Request and session usage logging

Commits:

- `8f115dc` `feat: add per-request and session usage logging`
- `716170c` `feat: include thinking level in request log`
- `d77b3aa` `fix: pick up thinking level from reasoning_effort/reasoning.effort fields`
- `4cb8bca` `fix: pick up reasoning.effort level in responses route`

What changed:

- Added `src/lib/request-log.ts` to centralize request accounting.
- Extended `src/lib/state.ts` with per-model session usage counters.
- Logged token usage for chat completions, Anthropic messages, and Responses API flows.
- Added thinking/reasoning level extraction so usage logs reflect `reasoning_effort` and `reasoning.effort`.
- Tightened the Responses route so it records the reasoning level consistently.
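The two field shapes named above (flat `reasoning_effort` and nested `reasoning.effort`) can be handled by one small accessor. This is a minimal sketch, not the actual request-log code:

```typescript
// Illustrative sketch: request bodies may carry the effort level in either
// the flat OpenAI chat field or the nested Responses-style object.
interface ReasoningFields {
  reasoning_effort?: string;
  reasoning?: { effort?: string };
}

// Prefer the flat field, then fall back to the nested one.
function extractReasoningEffort(body: ReasoningFields): string | undefined {
  return body.reasoning_effort ?? body.reasoning?.effort;
}
```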
## 2026-03-14

### Opt-in proxy exchange capture

Commits:

- `eac1024` `feat: add opt-in proxy exchange capture`
- `5647f81` `Merge branch 'feat/proxy-exchange-capture' into dev`

What changed:

- Added `--capture` and `--capture-path` to the `start` command in `src/start.ts`.
- Added `src/lib/exchange-capture.ts` to persist request/response exchanges as JSONL.
- Captures include request metadata, upstream IDs, request bodies, response bodies, and token usage across chat, embeddings, messages, and responses routes.
- Added redaction of sensitive values before persistence.
- Added automatic gzip compression for older daily capture files under the default capture directory.
- Added supporting path/state plumbing in `src/lib/paths.ts`, `src/lib/state.ts`, and `src/lib/upstream-trace.ts`.
- Documented the feature in `README.md` and the RFC at `docs/rfcs/opt-in-proxy-exchange-capture.md`.
- Added focused test coverage in `tests/exchange-capture.test.ts`.
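The redact-then-serialize step can be sketched as below. The key list, function names, and `[redacted]` placeholder are assumptions for illustration, not the actual `src/lib/exchange-capture.ts` behavior:

```typescript
// Hypothetical sketch of JSONL exchange capture with redaction.
const SENSITIVE_KEYS = new Set(["authorization", "api-key", "x-api-key", "cookie"]);

// Recursively replace sensitive values before anything touches disk.
function redact(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(redact);
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) =>
        SENSITIVE_KEYS.has(k.toLowerCase()) ? [k, "[redacted]"] : [k, redact(v)],
      ),
    );
  }
  return value;
}

// One exchange becomes one JSON line, ready to append to the capture file.
function toJsonlLine(exchange: Record<string, unknown>): string {
  return JSON.stringify(redact(exchange)) + "\n";
}
```

Appending one line per exchange keeps the file crash-tolerant and easy to compress day by day.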
## 2026-03-16

### SSE forwarding hardening

Commit:

- `9e2a4e6` `Harden SSE stream forwarding`

What changed:

- Added `src/lib/sse.ts` to normalize and forward upstream SSE frames safely.
- Fixed handling of metadata-only SSE events so events without `data` no longer crash the forwarder or get dropped incorrectly.
- Hardened streaming behavior in the chat completions and responses handlers.
- Added test coverage in `tests/responses-route-stream.test.ts` and `tests/sse.test.ts`.
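A minimal frame parser showing the metadata-only case might look like this. It is a sketch of the behavior described above, assuming the real `src/lib/sse.ts` may differ in shape and naming:

```typescript
// Illustrative SSE frame parser: one frame is the lines between blank-line
// separators. Frames carrying only an `event:` field (no `data:`) are still
// returned instead of being treated as empty and dropped.
interface SseEvent {
  event?: string;
  data?: string;
}

function parseSseFrame(frame: string): SseEvent | undefined {
  const out: SseEvent = {};
  const dataLines: string[] = [];
  for (const line of frame.split("\n")) {
    if (line.startsWith("event:")) out.event = line.slice(6).trim();
    else if (line.startsWith("data:")) dataLines.push(line.slice(5).trimStart());
  }
  if (dataLines.length > 0) out.data = dataLines.join("\n");
  return out.event !== undefined || out.data !== undefined ? out : undefined;
}
```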
### Bun streaming request timeout handling

Commit:

- `641fb1e` `Fix Bun streaming request timeouts`

What changed:

- Added startup-side idle-timeout parsing and validation in `src/start.ts`.
- Corrected the configurable Bun idle-timeout range to `0-255` seconds.
- Added `src/lib/request-timeout.ts` to disable Bun request timeouts for streaming requests.
- Applied per-request timeout disabling for streaming chat completions, Anthropic messages, and Responses API routes.
- Documented the timeout behavior in `README.md`.
- Added tests in `tests/request-timeout.test.ts` and `tests/start-idle-timeout.test.ts`.
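The startup-side validation is straightforward to sketch; the function name and error text here are assumptions, not the actual `src/start.ts` code. Bun's `idleTimeout` is a per-connection value in seconds that must fit in `0-255`, with `0` disabling it:

```typescript
// Hypothetical sketch of idle-timeout parsing with the 0-255 bound.
function parseIdleTimeout(raw: string): number {
  const value = Number(raw);
  if (!Number.isInteger(value) || value < 0 || value > 255) {
    throw new Error(`Invalid idle timeout "${raw}": expected an integer in 0-255`);
  }
  return value;
}
```

Validating at startup means a bad flag fails fast instead of being silently truncated once the server is already serving traffic.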
## 2026-03-17

### Activity dashboard and telemetry hardening

What changed:

- Added a built-in activity dashboard at `/dashboard` with summary cards, quota snapshots, a full-width live log, and a locally served Chart.js history plot.
- Added a SQLite-backed dashboard telemetry store with backfill from persisted JSONL and gzipped capture files plus an SSE live event stream.
- Captured telemetry for chat completions, Anthropic messages, Responses API, and embeddings, including completed requests, upstream errors, proxy errors, and client aborts.
- Changed the history chart to stacked bars by base model so hourly, daily, weekly, and monthly buckets are visible in one plot with a legend.
- Moved dashboard controls into the history panel and removed the separate by-model pane in favor of the chart breakdown.
- Replaced the default Hono request logger with `consola` request start/finish lines to align terminal logging with the existing session token logs.
- Hardened dashboard telemetry persistence so SQLite I/O failures degrade the dashboard store to in-memory mode instead of breaking proxied requests.
- Added dashboard coverage in `tests/dashboard.test.ts`, including API aggregation, backfill, live stream, and local chart asset checks.
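The degrade-to-memory behavior can be sketched with a wrapper around a durable sink. The `TelemetrySink` interface, event shape, and class name are assumptions for illustration, not the actual dashboard store:

```typescript
// Illustrative sketch: on the first persistence failure, fall back to an
// in-memory buffer so proxied requests are never broken by telemetry I/O.
interface TelemetryEvent { route: string; model: string; tokens: number }

interface TelemetrySink {
  record(event: TelemetryEvent): void;
}

class DegradableStore implements TelemetrySink {
  private fallback: TelemetryEvent[] = [];
  private degraded = false;

  constructor(private durable: TelemetrySink) {}

  record(event: TelemetryEvent): void {
    if (!this.degraded) {
      try {
        this.durable.record(event);
        return;
      } catch {
        this.degraded = true; // durable (e.g. SQLite) write failed: go memory-only
      }
    }
    this.fallback.push(event);
  }

  get isDegraded(): boolean { return this.degraded; }
  get buffered(): number { return this.fallback.length; }
}
```

The key property is that `record` never throws: the worst case for the request path is a lost dashboard data point, not a failed proxy response.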
## Commit Trail

- `42214b8` 2026-03-08 `feat: port PR #205 — Responses API support and model-level routing`
- `b188c93` 2026-03-08 `fix: address review issues in Responses API implementation`
- `e408560` 2026-03-08 `feat: add gpt-5.4 with full level support via Responses API`
- `8f115dc` 2026-03-08 `feat: add per-request and session usage logging`
- `716170c` 2026-03-08 `feat: include thinking level in request log`
- `d77b3aa` 2026-03-08 `fix: pick up thinking level from reasoning_effort/reasoning.effort fields`
- `4cb8bca` 2026-03-08 `fix: pick up reasoning.effort level in responses route`
- `26cbf7f` 2026-03-08 `chore: update bun.lock`
- `eac1024` 2026-03-14 `feat: add opt-in proxy exchange capture`
- `5647f81` 2026-03-14 `Merge branch 'feat/proxy-exchange-capture' into dev`
- `9e2a4e6` 2026-03-16 `Harden SSE stream forwarding`
- `641fb1e` 2026-03-16 `Fix Bun streaming request timeouts`

package.json

Lines changed: 1 addition & 0 deletions
@@ -40,6 +40,7 @@
     "*": "bun run lint --fix"
   },
   "dependencies": {
+    "chart.js": "^4.5.1",
     "citty": "^0.1.6",
     "clipboardy": "^5.0.0",
     "consola": "^3.4.2",

src/lib/approval.ts

Lines changed: 1 addition & 0 deletions
@@ -11,5 +11,6 @@ export const awaitApproval = async () => {
   throw new HTTPError(
     "Request rejected",
     Response.json({ message: "Request rejected" }, { status: 403 }),
+    "proxy",
   )
 }
