
Commit 2ff7196

feat: Updated to Watt 3. (#14)
* feat: Updated to Watt 3. Signed-off-by: Paolo Insogna <paolo@cowtech.it>
1 parent 2534572 commit 2ff7196

File tree

16 files changed: +2306 -866 lines changed

New file: GitHub Actions workflow "Publish release"

Lines changed: 54 additions & 0 deletions

```yaml
name: Publish release

on:
  workflow_dispatch:
    inputs:
      version:
        description: 'The version number to tag and release'
        required: true
        type: string
      prerelease:
        description: 'Release as pre-release'
        required: false
        type: boolean
        default: false

jobs:
  release-npm:
    runs-on: ubuntu-latest
    environment: main
    permissions:
      contents: write
      id-token: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Use supported Node.js Version
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}
      - name: Restore cached dependencies
        uses: actions/cache@v4
        with:
          path: ~/.pnpm-store
          key: node-modules-${{ hashFiles('package.json') }}
      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: latest
      - name: Install dependencies
        run: pnpm install --frozen-lockfile
      - name: Change version number
        uses: mikefarah/yq@master
        with:
          cmd: yq -i '.version = "${{ inputs.version }}"' package.json
      - name: GIT commit and push all changed files
        run: |
          git config --global user.name "${{ github.actor }}"
          git config --global user.email "${{ github.actor }}@users.noreply.github.com"
          git commit -a -m "Bumped v${{ inputs.version }}"
          git push origin HEAD:${{ github.ref }}
      - run: npm publish --access public --tag ${{ inputs.prerelease == true && 'next' || 'latest' }}
      - name: 'Create release notes'
        run: |
          npx @matteo.collina/release-notes -a ${{ secrets.GITHUB_TOKEN }} -t v${{ inputs.version }} -r ${{ github.repository }} -o ${{ github.repository_owner }} ${{ github.event.inputs.prerelease == 'true' && '-p' || '' }} -c ${{ github.ref }}
```
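The workflow is `workflow_dispatch`-only, so it has to be triggered by hand. A minimal sketch of dispatching it with the GitHub CLI, assuming the file lives under `.github/workflows/` with a name such as `release.yml` (the actual path is not part of this excerpt):

```bash
# Dispatch the "Publish release" workflow for version 1.2.3 as a stable release.
# "release.yml" is an assumed file name; substitute the real one.
gh workflow run release.yml -f version=1.2.3 -f prerelease=false
```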

README.md

Lines changed: 75 additions & 86 deletions
@@ -23,14 +23,6 @@ npx wattpm@latest create

Removes the `## Install Standalone` section, so "And select `@platformatic/kafka-hooks` from the list of available packages." now flows straight into `## Configuration` ("Configure your Kafka webhooks in the `platformatic.json` file:"). Removed content:

````markdown
## Install Standalone

```bash
npx --package=@platformatic/kafka-hooks create-platformatic-kafka-hooks
cd kafka-hooks-app
npm install
```
````
@@ -67,37 +59,37 @@ Configure your Kafka webhooks in the `platformatic.json` file:

Realigns the three option tables (column padding only); the headings, intro sentences, and documented options are unchanged:

### Core Options

| Option | Description | Default |
| ------ | ----------- | ------- |
| `brokers` | The list of Kafka brokers in the form `host:port`. | Required |
| `consumer` | Any option supported by a [@platformatic/kafka](https://github.com/platformatic/kafka) `Consumer`. | None |
| `concurrency` | How many messages to process in parallel. | `10` |

### Topics Configuration

Each item in the `topics` array supports the following options:

| Option | Description | Default |
| ------ | ----------- | ------- |
| `topic` | The topic to consume messages from. | Required |
| `url` | The URL to send messages to. | Required |
| `method` | The HTTP method to use when hitting the URL above. | `POST` |
| `headers` | Additional headers to send in the request. | None |
| `retries` | How many times to try the request before marking as failed. | `3` |
| `retryDelay` | How much to wait between retries, in milliseconds. | `1000` (1 second) |
| `dlq` | The DLQ (Dead-Letter-Queue) topic to forward failed messages to. Set to `false` to disable. | `plt-kafka-hooks-dlq` |
| `includeAttemptInRequests` | If to include the current attempt number in the requests in the `x-plt-kafka-hooks-attempt` header. | `true` |

### Request/Response Configuration

Each item in the `requestResponse` array supports these options:

| Option | Description | Default |
| ------ | ----------- | ------- |
| `path` | HTTP endpoint path to expose (supports path parameters) | Required |
| `requestTopic` | Kafka topic to publish requests to | Required |
| `responseTopic` | Kafka topic to consume responses from | Required |
| `timeout` | Request timeout in milliseconds | `30000` |

### Dead Letter Queue (DLQ)
@@ -115,12 +107,12 @@ By default, failed messages are sent to the `plt-kafka-hooks-dlq` topic. You can

Only the inline comment alignment changes in the DLQ configuration example; the excerpt shown in the hunk is:

```json
  {
    "topic": "events",
    "url": "https://service.example.com",
    "dlq": "custom-dlq-topic" // Custom DLQ topic name
  },
  {
    "topic": "notifications",
    "url": "https://service.example.com/notifications",
    "dlq": false // Disable DLQ for this topic
  }
]
}
```
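The diff shows the configuration example only in fragments, so here is a rough sketch of how the options documented above could fit together. The exact key nesting inside `platformatic.json`, the broker address, and the topic and endpoint names are assumptions for illustration, not taken from this commit:

```json
{
  "brokers": ["localhost:9092"],
  "concurrency": 10,
  "topics": [
    {
      "topic": "events",
      "url": "https://service.example.com",
      "retries": 3,          // default
      "retryDelay": 1000,    // default, in milliseconds
      "dlq": "custom-dlq-topic"
    }
  ],
  "requestResponse": [
    {
      "path": "/api/process",
      "requestTopic": "request-topic",
      "responseTopic": "response-topic",
      "timeout": 30000       // default, in milliseconds
    }
  ]
}
```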
@@ -154,6 +146,37 @@ Messages sent to the DLQ contain detailed information about the failure:

Adds a `## Requirements` section (between the DLQ notes and `## APIs` / `### HTTP to Kafka Publishing`) with a starter `docker-compose.yml` for running a local Kafka broker:

````markdown
## Requirements

You'll need a Kafka server running. If you don't have one, you can use this `docker-compose.yml` file as a starter:

```yaml
---
services:
  kafka:
    image: apache/kafka:3.9.0
    ports:
      - '9092:9092'
    environment:
      _JAVA_OPTIONS: '-XX:UseSVE=0'
      KAFKA_NODE_ID: 1
      KAFKA_LISTENERS: 'CONTROLLER://:29093,PLAINTEXT://:19092,MAIN://:9092'
      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://kafka:19092,MAIN://localhost:9092'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: 'CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT,MAIN:PLAINTEXT'
      KAFKA_PROCESS_ROLES: 'broker,controller'
      KAFKA_CONTROLLER_QUORUM_VOTERS: '1@kafka:29093'
      KAFKA_INTER_BROKER_LISTENER_NAME: 'PLAINTEXT'
      KAFKA_CONTROLLER_LISTENER_NAMES: 'CONTROLLER'
      CLUSTER_ID: '4L6g3nShT-eMCtK--X86sw'
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_SHARE_COORDINATOR_STATE_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_SHARE_COORDINATOR_STATE_TOPIC_MIN_ISR: 1
      KAFKA_LOG_DIRS: '/tmp/kraft-combined-logs'
```
````
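Assuming the compose file above is saved as `docker-compose.yml` in the project directory, a throwaway single-node broker can be brought up with Docker Compose; its `MAIN` listener advertises `localhost:9092`, which matches the `host:port` format expected by the `brokers` option:

```bash
# Start the single-node KRaft broker defined above, detached
docker compose up -d kafka

# Follow its logs until the listeners are reported as started
docker compose logs -f kafka
```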
@@ -187,34 +210,39 @@ The kafka-hooks library supports HTTP request/response patterns routed through Kafka

Only blank lines are inserted between the bold labels and the fenced examples that follow them; the path-parameter and query-string documentation itself is unchanged:

The request/response pattern supports both path parameters and query strings, which are automatically passed to Kafka consumers via headers.

**Path Parameters:**

```json
{
  "path": "/api/users/:userId/orders/:orderId"
}
```

**Request with path parameters:**

```bash
curl -X POST http://localhost:3042/api/users/123/orders/456 \
  -H "Content-Type: application/json" \
  -d '{"action": "cancel"}'
```

**Kafka message headers include:**

```json
{
  "x-plt-kafka-hooks-path-params": "{\"userId\":\"123\",\"orderId\":\"456\"}"
}
```

**Query String Parameters:**

```bash
curl -X POST http://localhost:3042/api/search?q=coffee&limit=10&sort=price \
  -H "Content-Type: application/json" \
  -d '{"filters": {...}}'
```

**Kafka message headers include:**

```json
{
  "x-plt-kafka-hooks-query-string": "{\"q\":\"coffee\",\"limit\":\"10\",\"sort\":\"price\"}"
}
```
@@ -224,13 +252,15 @@, @@ -242,6 +272,7 @@, @@ -252,6 +283,7 @@

These three hunks make the same formatting-only change inside "#### Usage Example": a blank line is inserted after each bold step label before its fenced example. The first step appears in full as context:

**Make a request:**

```bash
curl -X POST http://localhost:3042/api/process \
  -H "Content-Type: application/json" \
  -d '{"userId": 123, "action": "process"}'
```

The remaining steps, "**Request message published to Kafka:**", "**Service processes and responds:**" (the `curl -X POST http://localhost:3042/topics/response-topic` publish), and "**HTTP response returned to client:**" (the JSON beginning `"result": "success"`), appear only as truncated context and are otherwise unchanged.
@@ -263,27 +295,28 @@ curl -X POST http://localhost:3042/topics/response-topic \

The request/response header tables are realigned (column padding only) and a blank line is added before the timeout-response example; the documented headers are unchanged:

Request messages automatically include these headers when published to Kafka:

| Header | Description | When Added |
| ------ | ----------- | ---------- |
| `x-plt-kafka-hooks-correlation-id` | Unique correlation ID for request matching | Always |
| `x-plt-kafka-hooks-path-params` | JSON string of path parameters | When path parameters present |
| `x-plt-kafka-hooks-query-string` | JSON string of query string parameters | When query parameters present |
| `content-type` | Content type of the request | Always |

#### Response Headers

Response messages support these special headers:

| Header | Description | Default |
| ------ | ----------- | ------- |
| `x-plt-kafka-hooks-correlation-id` | Must match the original request correlation ID | Required |
| `x-status-code` | HTTP status code for the response | `200` |
| `content-type` | Content type of the response | Preserved |

#### Error Handling

**Timeout Response:**
If no response is received within the configured timeout: (the error example that follows, beginning `"code": "HTTP_ERROR_GATEWAY_TIMEOUT"`, is unchanged; only a blank line is inserted before it)
@@ -306,50 +339,6 @@ Responses for non-existent correlation IDs are logged as warnings and ignored.

- **Event-Driven APIs**: Build responsive APIs on event-driven architecture
- **Service Decoupling**: Maintain HTTP contracts while decoupling services via Kafka

After these unchanged bullets, the `## Standalone Install & Setup` section is removed, so the README now flows straight into `## License` (Apache-2.0). Removed content:

````markdown
## Standalone Install & Setup

You can generate a standalone application with:

```bash
npx --package @platformatic/kafka-hooks -c create-platformatic-kafka-hooks
cd kafka-hooks-app
npm i
npx platformatic start
```

You can then edit your `.env` file and configure the `PLT_KAFKA_BROKER` env variable to select your Kafka broker.

### Requirements

You'll need a Kafka server running. If you don't have one, you can use this `docker-compose.yml` file as a starter:
````

The removed `docker-compose.yml` block that followed is identical to the one added in the `## Requirements` hunk above.

cli/create.js

Lines changed: 0 additions & 42 deletions
This file was deleted.

cli/start.js

Lines changed: 0 additions & 7 deletions
This file was deleted.
