374 changes: 195 additions & 179 deletions .cursor/rules/convex_rules.mdc

Large diffs are not rendered by default.

23 changes: 0 additions & 23 deletions .github/workflows/node.js.yml

This file was deleted.

38 changes: 38 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,38 @@
name: Test and lint
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true

on:
push:
branches: [main]
pull_request:
branches: ["**"]

jobs:
check:
name: Test and lint
runs-on: ubuntu-latest
timeout-minutes: 30

steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5

- name: Node setup
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5
with:
cache-dependency-path: package.json
node-version: "20.x"
cache: "npm"

- name: Install and build
run: |
npm i
npm run build
- name: Publish package for testing branch
run: npx pkg-pr-new publish || echo "Have you set up pkg-pr-new for this repo?"
- name: Test
run: |
npm run test
npm run typecheck
npm run lint
6 changes: 2 additions & 4 deletions .gitignore
@@ -9,10 +9,8 @@ dist-ssr
explorations
node_modules
.eslintcache
# components are libraries!
package-lock.json

# this is a package-json-redirect stub dir, see https://github.com/andrewbranch/example-subpath-exports-ts-compat?tab=readme-ov-file
react/package.json

# npm pack output
*.tgz
*.tsbuildinfo
3 changes: 2 additions & 1 deletion .prettierrc.json
@@ -1,3 +1,4 @@
{
"trailingComma": "es5"
"trailingComma": "all",
"proseWrap": "always"
}
8 changes: 8 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,8 @@
# Changelog

## 0.3.0

- Adds /test and /\_generated/component.js entrypoints
- Drops commonjs support
- Improves source mapping for generated files
- Changes to a statically generated component API
28 changes: 9 additions & 19 deletions CONTRIBUTING.md
@@ -4,47 +4,37 @@

```sh
npm i
cd example
npm i
npx convex dev
npm run dev
```

## Testing

```sh
rm -rf dist/ && npm run build
npm run clean
npm run build
npm run typecheck
npm run test
cd example
npm run lint
cd ..
npm run test
```

## Deploying

### Building a one-off package

```sh
rm -rf dist/ && npm run build
npm run clean
npm ci
npm pack
```

### Deploying a new version

```sh
# this will change the version and commit it (if you run it in the root directory)
npm version patch
npm publish --dry-run
# sanity check files being included
npm publish
git push --tags
npm run release
```

#### Alpha release

The same as above, but it requires extra flags so the release is only installed with `@alpha`:
or for alpha release:

```sh
npm version prerelease --preid alpha
npm publish --tag alpha
npm run alpha
```
72 changes: 37 additions & 35 deletions README.md
@@ -4,29 +4,30 @@

<!-- START: Include on https://convex.dev/components -->

This Convex component enables persistent text streaming. It provides a React hook
for streaming text from HTTP actions while simultaneously storing the data in the
database. This persistence allows the text to be accessed after the stream ends
or by other users.
This Convex component enables persistent text streaming. It provides a React
hook for streaming text from HTTP actions while simultaneously storing the data
in the database. This persistence allows the text to be accessed after the
stream ends or by other users.

The most common use case is for AI chat applications. The example app (found in the
`example` directory) is just such a simple chat app that demonstrates use of the
component.
The most common use case is for AI chat applications. The example app (found in
the `example` directory) is just such a simple chat app that demonstrates use
of the component.

Here's what you'll end up with! The left browser window is streaming the chat body to the client,
and the right browser window is subscribed to the chat body via a database query. The
message is only updated in the database on sentence boundaries, whereas the HTTP
stream sends tokens as they come:
Here's what you'll end up with! The left browser window is streaming the chat
body to the client, and the right browser window is subscribed to the chat body
via a database query. The message is only updated in the database on sentence
boundaries, whereas the HTTP stream sends tokens as they come:

![example-animation](./anim.gif)

## Pre-requisite: Convex

You'll need an existing Convex project to use the component.
Convex is a hosted backend platform, including a database, serverless functions,
and a ton more you can learn about [here](https://docs.convex.dev/get-started).
You'll need an existing Convex project to use the component. Convex is a hosted
backend platform, including a database, serverless functions, and a ton more you
can learn about [here](https://docs.convex.dev/get-started).

Run `npm create convex` or follow any of the [quickstarts](https://docs.convex.dev/home) to set one up.
Run `npm create convex` or follow any of the
[quickstarts](https://docs.convex.dev/home) to set one up.

## Installation

@@ -44,7 +45,7 @@ npm install @convex-dev/persistent-text-streaming
```ts
// convex/convex.config.ts
import { defineApp } from "convex/server";
import persistentTextStreaming from "@convex-dev/persistent-text-streaming/convex.config";
import persistentTextStreaming from "@convex-dev/persistent-text-streaming/convex.config.js";

const app = defineApp();
app.use(persistentTextStreaming);
@@ -59,7 +60,7 @@ In `convex/chat.ts`:

```ts
const persistentTextStreaming = new PersistentTextStreaming(
components.persistentTextStreaming
components.persistentTextStreaming,
);

// Create a stream using the component and store the id in the database with
@@ -87,15 +88,15 @@ export const getChatBody = query({
handler: async (ctx, args) => {
return await persistentTextStreaming.getStreamBody(
ctx,
args.streamId as StreamId
args.streamId as StreamId,
);
},
});

// Create an HTTP action that generates chunks of the chat body
// and uses the component to stream them to the client and save them to the database.
export const streamChat = httpAction(async (ctx, request) => {
const body = (await request.json()) as {streamId: string};
const body = (await request.json()) as { streamId: string };
const generateChat = async (ctx, request, streamId, chunkAppender) => {
await chunkAppender("Hi there!");
await chunkAppender("How are you?");
@@ -106,7 +107,7 @@ export const streamChat = httpAction(async (ctx, request) => {
ctx,
request,
body.streamId as StreamId,
generateChat
generateChat,
);

// Set CORS headers appropriately.
@@ -126,8 +127,8 @@ http.route({
});
```
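The snippet above says to "set CORS headers appropriately" without showing how.
One way to do that is a small wrapper like the following — a hypothetical
helper, not part of the component's API; adjust the allowed origin for your
deployment:

```typescript
// Wrap a streaming Response with the CORS headers a browser needs to read it
// from your site's origin. This is a sketch; a production setup may also need
// an OPTIONS preflight route and Access-Control-Allow-Headers.
function withCors(response: Response, origin: string): Response {
  const headers = new Headers(response.headers);
  // Allow the streaming response to be read from the given origin.
  headers.set("Access-Control-Allow-Origin", origin);
  // Tell caches that the response varies by requesting origin.
  headers.set("Vary", "Origin");
  return new Response(response.body, {
    status: response.status,
    headers,
  });
}
```

In the `streamChat` handler, the `response` returned by the component would be
passed through `withCors` before being returned to the client.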

Finally, in your app, you can now create chats and then subscribe to them
via stream and/or database query as appropriate:
Finally, in your app, you can now create chats and then subscribe to them via
stream and/or database query as appropriate:

```ts
// chat-input.tsx, maybe?
@@ -149,7 +150,7 @@ const { text, status } = useStream(
api.chat.getChatBody, // The query to call for the full stream body
new URL(`${convexSiteUrl}/chat-stream`), // The HTTP endpoint for streaming
driven, // True if this browser session created this chat and should generate the stream
chat.streamId as StreamId // The streamId from the chat database record
chat.streamId as StreamId, // The streamId from the chat database record
);
```

Expand All @@ -162,28 +163,29 @@ let's examine each approach in isolation.
- **HTTP streaming only**: If your app _only_ uses HTTP streaming, then the
original browser that made the request will have a great, high-performance
streaming experience. But if that HTTP connection is lost, if the browser
window is reloaded, if other users want to view the same chat, or this
user wants to revisit the conversation later, it won't be possible. The
window is reloaded, if other users want to view the same chat, or this user
wants to revisit the conversation later, it won't be possible. The
conversation is only ephemeral because it was never stored on the server.

- **Database Persistence Only**: If your app _only_ uses database persistence,
it's true that the conversation will be available for as long as you want.
Additionally, Convex's subscriptions will ensure the chat message is updated
as new text chunks are generated. However, there are a few downsides: one,
the entire chat body needs to be resent every time it is changed, which is a
lot of redundant bandwidth to push into the database and over the websockets to
all connected clients. Two, you'll need to make a difficult tradeoff between
as new text chunks are generated. However, there are a few downsides: one, the
entire chat body needs to be resent every time it is changed, which is a lot
of redundant bandwidth to push into the database and over the websockets to all
connected clients. Two, you'll need to make a difficult tradeoff between
interactivity and efficiency. If you write every single small chunk to the
database, this will get quite slow and expensive. But if you batch up the chunks
into, say, paragraphs, then the user experience will feel laggy.
database, this will get quite slow and expensive. But if you batch up the
chunks into, say, paragraphs, then the user experience will feel laggy.
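The sentence-boundary batching described above can be sketched as follows.
This is a hypothetical helper, not the component's actual implementation; the
`writeToDb` and `writeToHttp` sinks stand in for the real persistence and HTTP
streaming paths:

```typescript
type Sink = (text: string) => Promise<void>;

// Forward every token over HTTP immediately (low latency), but flush the
// durable database copy only when the accumulated text ends a sentence
// (fewer, larger writes).
export function makeSentenceBatcher(writeToDb: Sink, writeToHttp: Sink) {
  let pending = "";
  return {
    // Called for each token as the model produces it.
    async append(token: string): Promise<void> {
      await writeToHttp(token);
      pending += token;
      if (/[.!?]\s*$/.test(pending)) {
        await writeToDb(pending);
        pending = "";
      }
    },
    // Flush any trailing partial sentence when the stream ends.
    async finish(): Promise<void> {
      if (pending) {
        await writeToDb(pending);
        pending = "";
      }
    },
  };
}
```

With this shape, the HTTP path stays token-granular while the database sees
one write per sentence, which is the tradeoff the component makes for you.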

This component combines the best of both worlds. The original browser that
makes the request will still have a great, high-performance streaming experience.
But the chat body is also stored in the database, so it can be accessed by the
This component combines the best of both worlds. The original browser that makes
the request will still have a great, high-performance streaming experience. But
the chat body is also stored in the database, so it can be accessed by the
client even after the stream has finished, or by other users, etc.

## Background

This component is largely based on the Stack post [AI Chat with HTTP Streaming](https://stack.convex.dev/ai-chat-with-http-streaming).
This component is largely based on the Stack post
[AI Chat with HTTP Streaming](https://stack.convex.dev/ai-chat-with-http-streaming).

<!-- END: Include on https://convex.dev/components -->
8 changes: 0 additions & 8 deletions commonjs.json

This file was deleted.

7 changes: 7 additions & 0 deletions convex.json
@@ -0,0 +1,7 @@
{
"$schema": "./node_modules/convex/schemas/convex.schema.json",
"functions": "example/convex",
"codegen": {
"legacyComponentApi": false
}
}