
Commit 708ae33

feat: enhance streaming response handling with ping mechanism
1 parent 44551f9 commit 708ae33

2 files changed: +29 -12 lines changed


README.md

Lines changed: 6 additions & 5 deletions
```diff
@@ -184,11 +184,12 @@ The server exposes several endpoints to interact with the Copilot API. It provid
 
 These endpoints mimic the OpenAI API structure.
 
-| Endpoint                    | Method | Description                                                |
-| --------------------------- | ------ | ---------------------------------------------------------- |
-| `POST /v1/chat/completions` | `POST` | Creates a model response for the given chat conversation.  |
-| `GET /v1/models`            | `GET`  | Lists the currently available models.                      |
-| `POST /v1/embeddings`       | `POST` | Creates an embedding vector representing the input text.   |
+| Endpoint                    | Method | Description                                                       |
+| --------------------------- | ------ | ----------------------------------------------------------------- |
+| `POST /v1/responses`        | `POST` | Most advanced interface for generating model responses.           |
+| `POST /v1/chat/completions` | `POST` | Creates a model response for the given chat conversation.         |
+| `GET /v1/models`            | `GET`  | Lists the currently available models.                             |
+| `POST /v1/embeddings`       | `POST` | Creates an embedding vector representing the input text.          |
 
 ### Anthropic Compatible Endpoints
 
```
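The new `POST /v1/responses` row documents the endpoint whose streaming path this commit touches. As a rough illustration, here is a minimal TypeScript request sketch; the base URL `http://localhost:4141`, the model id, and the exact body fields (`model`, `input`, `stream`) are assumptions based on the OpenAI-style shape the README describes, not details confirmed by this diff.

```ts
// Hypothetical client sketch: request a streamed response from the new endpoint.
// The port, model id, and request body shape are assumptions, not taken from this commit.
const res = await fetch("http://localhost:4141/v1/responses", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "gpt-4.1", // hypothetical model id; GET /v1/models lists the real ones
    input: "Write a haiku about streaming.",
    stream: true, // ask for Server-Sent Events instead of a single JSON body
  }),
})

if (!res.ok || !res.body) {
  throw new Error(`Responses request failed: ${res.status}`)
}
```

With `stream: true`, the body arrives as Server-Sent Events, which is where the handler change below comes in.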

src/routes/responses/handler.ts

Lines changed: 23 additions & 7 deletions
```diff
@@ -52,13 +52,29 @@ export const handleResponses = async (c: Context) => {
   if (isStreamingRequested(payload) && isAsyncIterable(response)) {
     consola.debug("Forwarding native Responses stream")
     return streamSSE(c, async (stream) => {
-      for await (const chunk of response) {
-        consola.debug("Responses stream chunk:", JSON.stringify(chunk))
-        await stream.writeSSE({
-          id: (chunk as { id?: string }).id,
-          event: (chunk as { event?: string }).event,
-          data: (chunk as { data?: string }).data ?? "",
-        })
+      const pingInterval = setInterval(async () => {
+        try {
+          await stream.writeSSE({
+            event: "ping",
+            data: JSON.stringify({ timestamp: Date.now() }),
+          })
+        } catch (error) {
+          consola.warn("Failed to send ping:", error)
+          clearInterval(pingInterval)
+        }
+      }, 3000)
+
+      try {
+        for await (const chunk of response) {
+          consola.debug("Responses stream chunk:", JSON.stringify(chunk))
+          await stream.writeSSE({
+            id: (chunk as { id?: string }).id,
+            event: (chunk as { event?: string }).event,
+            data: (chunk as { data?: string }).data ?? "",
+          })
+        }
+      } finally {
+        clearInterval(pingInterval)
       }
     })
   }
```
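The change wraps the existing chunk-forwarding loop: a `setInterval` emits an `event: ping` frame with a JSON timestamp payload every 3 seconds, a failed ping write clears the timer, and the `finally` block clears it once the upstream stream ends, so the timer cannot outlive the response. Clients only need to skip these keep-alive frames. Below is a minimal consumer sketch, assuming `res` is the fetch response from the earlier example and that frames are separated by blank lines as in standard SSE.

```ts
// Hypothetical consumer sketch: read the SSE stream and drop "ping" frames.
// Assumes `res` is the fetch Response from the previous example.
const reader = res.body!.pipeThrough(new TextDecoderStream()).getReader()
let buffer = ""

while (true) {
  const { value, done } = await reader.read()
  if (done) break
  buffer += value

  // SSE frames are separated by a blank line.
  const frames = buffer.split("\n\n")
  buffer = frames.pop() ?? "" // keep any partial frame for the next read

  for (const frame of frames) {
    const lines = frame.split("\n")
    const event = lines.find((l) => l.startsWith("event:"))?.slice(6).trim()
    const data = lines.find((l) => l.startsWith("data:"))?.slice(5).trim()

    if (event === "ping") continue // keep-alive from the server; safe to ignore
    if (data) console.log(event ?? "message", data)
  }
}
```

The 3-second cadence presumably exists to keep reverse proxies and idle-timeout middleware from dropping the connection during long pauses between model chunks; clients that do not recognize the `ping` event lose nothing by ignoring it.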
