
Commit 883f2f5 — README adjustment
Signed-off-by: Matteo Collina <hello@matteocollina.com>
1 parent 9d163ae

File tree: 1 file changed, +136 −160 lines changed


README.md

## Features

- **Consume to HTTP**: Consume messages from Kafka topics and forward them to HTTP endpoints
- **HTTP to Produce**: Send messages to Kafka topics via an HTTP API
- **Request/Response pattern over Kafka topics**: Build HTTP-style request/response patterns routed through Kafka
- **Direct binary message passing**: Pass message content directly without custom serialization
- **Configurable retries and concurrency**: Handle failures with customizable retry logic and parallel processing
- **Dead Letter Queue (DLQ)**: Failed messages are sent to DLQ topics for later inspection
- **Path parameters and query strings**: Automatic handling of URL parameters via Kafka headers
- **Error handling**: Comprehensive timeout and error management

## Configuration

Configure your Kafka webhooks in the `platformatic.json` file:

```json
{
  "kafka": {
    "brokers": ["localhost:9092"],
    "topics": [
      {
        "topic": "events",
        "url": "https://service.example.com"
      }
    ],
    "requestResponse": [
      {
        // …
      }
    ],
    "consumer": {
      "groupId": "plt-kafka-hooks",
      "maxWaitTime": 500,
      "sessionTimeout": 10000,
      "rebalanceTimeout": 15000,
      "heartbeatInterval": 500
    },
    "concurrency": 10
  }
}
```

### Core Options

| Option        | Description                                                                                        | Default  |
| ------------- | -------------------------------------------------------------------------------------------------- | -------- |
| `brokers`     | The list of Kafka brokers in the form `host:port`.                                                 | Required |
| `consumer`    | Any option supported by a [@platformatic/kafka](https://github.com/platformatic/kafka) `Consumer`. | None     |
| `concurrency` | How many messages to process in parallel.                                                          | `10`     |

### Topics Configuration

Each item in the `topics` array supports the following options:

| Option                     | Description                                                                                            | Default               |
| -------------------------- | ------------------------------------------------------------------------------------------------------ | --------------------- |
| `topic`                    | The topic to consume messages from.                                                                    | Required              |
| `url`                      | The URL to send messages to.                                                                           | Required              |
| `method`                   | The HTTP method to use when hitting the URL above.                                                     | `POST`                |
| `headers`                  | Additional headers to send in the request.                                                             | None                  |
| `retries`                  | How many times to try the request before marking the message as failed.                                | `3`                   |
| `retryDelay`               | How long to wait between retries, in milliseconds.                                                     | `1000` (1 second)     |
| `dlq`                      | The DLQ (Dead Letter Queue) topic to forward failed messages to. Set to `false` to disable.            | `plt-kafka-hooks-dlq` |
| `includeAttemptInRequests` | Whether to include the current attempt number in requests, via the `x-plt-kafka-hooks-attempt` header. | `true`                |

### Request/Response Configuration

Each item in the `requestResponse` array supports these options:

| Option          | Description                                           | Default  |
| --------------- | ----------------------------------------------------- | -------- |
| `responseTopic` | Kafka topic to consume responses from                 | Required |
| `timeout`       | Request timeout in milliseconds                       | `30000`  |

### Dead Letter Queue (DLQ)

When a message fails to be delivered after the configured number of retries, it's sent to a Dead Letter Queue (DLQ) topic for later inspection or processing.

By default, failed messages are sent to the `plt-kafka-hooks-dlq` topic. You can:

- Change the DLQ topic name by setting the `dlq` option in the topic configuration
- Disable the DLQ entirely by setting `dlq: false` in the topic configuration

```json
{
  "kafka": {
    "topics": [
      {
        "topic": "events",
        "url": "https://service.example.com",
        "dlq": "custom-dlq-topic" // Custom DLQ topic name
      },
      {
        "topic": "notifications",
        "url": "https://service.example.com/notifications",
        "dlq": false // Disable DLQ for this topic
      }
    ]
  }
}
```
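
The retry-then-DLQ flow described above can be sketched as follows. This is an illustrative model, not the library's code: the `send` callback is a hypothetical stand-in for the HTTP delivery to the topic's configured `url`.

```python
import time

def deliver_with_retries(send, message, retries=3, retry_delay=1.0):
    # Try the delivery up to `retries` times, collecting one error per failure.
    errors = []
    for attempt in range(1, retries + 1):
        try:
            send(message, attempt)  # e.g. POST to the topic's configured URL
            return {"delivered": True, "attempts": attempt, "errors": errors}
        except Exception as exc:
            errors.append(str(exc))
            if attempt < retries:
                time.sleep(retry_delay)
    # All attempts failed: the message (plus the errors) would go to the DLQ topic.
    return {"delivered": False, "attempts": retries, "errors": errors}

def always_failing_endpoint(message, attempt):
    raise RuntimeError("HTTP 500")

result = deliver_with_retries(always_failing_endpoint, {"value": "payload"},
                              retries=3, retry_delay=0)
print(result)  # {'delivered': False, 'attempts': 3, 'errors': ['HTTP 500', 'HTTP 500', 'HTTP 500']}
```

With the defaults (`retries: 3`, `retryDelay: 1000`), a persistently failing endpoint therefore costs roughly two seconds of waiting before the message is forwarded to the DLQ.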

#### DLQ Message Format

Messages sent to the DLQ contain detailed information about the failure:

```json
{
  "key": "original-message-key",
  "value": "base64-encoded-original-message",
  "headers": {
    "original-header-key": "original-header-value"
  },
  "topic": "original-topic",
  "partition": 0,
  "offset": "1234",
  "errors": [
    {
      "statusCode": 500,
      "error": "Internal Server Error",
      "message": "Failed to process message"
    }
  ],
  "retries": 3
}
```

The original message value is preserved as a base64-encoded string to maintain its exact binary content.
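
To reprocess a DLQ record, decode the base64 `value` back to the original bytes and inspect the `errors` array. A minimal sketch, assuming a record shaped like the example above (the payload here is illustrative):

```python
import base64
import json

# A DLQ record shaped like the format documented above (illustrative values)
dlq_record = json.loads("""{
  "key": "original-message-key",
  "value": "aGVsbG8gd29ybGQ=",
  "topic": "original-topic",
  "errors": [{"statusCode": 500, "error": "Internal Server Error",
              "message": "Failed to process message"}],
  "retries": 3
}""")

# Recover the exact original bytes of the message value
original_value = base64.b64decode(dlq_record["value"])
print(original_value)  # b'hello world'

# Examine why delivery failed before re-submitting to the original topic
for err in dlq_record["errors"]:
    print(err["statusCode"], err["message"])
```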

## APIs

### HTTP to Kafka Publishing

Publish messages to Kafka topics via HTTP POST requests:

```bash
curl --request POST \
  --url http://127.0.0.1:3042/topics/topic \
  --header 'Content-Type: application/json' \
  --header 'x-plt-kafka-hooks-key: my-key' \
  --data '{ "name": "my test" }'
```

If `x-plt-kafka-hooks-key` is omitted, the message will have no key in Kafka.
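
The same publish call can be made from any HTTP client. A sketch with Python's standard library, building the request the curl example sends (the local URL and topic name are placeholders for your deployment):

```python
import json
import urllib.request

# Build the same POST the curl example sends (URL and topic are placeholders)
req = urllib.request.Request(
    "http://127.0.0.1:3042/topics/topic",
    data=json.dumps({"name": "my test"}).encode(),
    headers={
        "Content-Type": "application/json",
        "x-plt-kafka-hooks-key": "my-key",  # omit for a key-less message
    },
    method="POST",
)

# urllib normalizes stored header names with .capitalize()
print(req.get_method())                         # POST
print(req.get_header("X-plt-kafka-hooks-key"))  # my-key

# With kafka-hooks running locally, send it with:
#   with urllib.request.urlopen(req) as res:
#       print(res.status)
```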

### Request/Response Pattern

The kafka-hooks library supports HTTP request/response patterns routed through Kafka topics. This enables building responsive microservices that communicate asynchronously via Kafka while maintaining HTTP-style request/response semantics.

#### How It Works

1. **HTTP Request**: Client makes a POST request to a configured endpoint
2. **Kafka Request**: The request is published to a Kafka request topic with a unique correlation ID
3. **Service Processing**: External service consumes from the request topic and processes the message
4. **Kafka Response**: Service publishes a response to a response topic with the same correlation ID
5. **HTTP Response**: The original HTTP request completes with the response data
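
The correlation-ID plumbing in steps 2–5 can be sketched as an in-memory map of pending requests. This is an illustrative model of the pattern, not the library's implementation; the function names are hypothetical:

```python
import uuid

# Pending HTTP requests waiting on a Kafka response, keyed by correlation ID
pending = {}

def publish_request(body):
    """Step 2: publish to the request topic with a fresh correlation ID."""
    correlation_id = str(uuid.uuid4())
    pending[correlation_id] = {"body": body, "response": None}
    # ...produce {body, correlation_id} to the Kafka request topic here...
    return correlation_id

def on_response(correlation_id, response):
    """Steps 4-5: a consumed response completes the matching HTTP request."""
    request = pending.pop(correlation_id, None)
    if request is None:
        # No pending request for this ID: logged as a warning and ignored
        return None
    request["response"] = response
    return request

cid = publish_request({"query": "coffee"})
done = on_response(cid, {"status": 200, "results": []})
print(done["response"]["status"])        # 200
print(on_response("unknown-id", {}))     # None
```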

#### Path Parameters and Query Strings

The request/response pattern supports both path parameters and query strings, which are automatically passed to Kafka consumers via headers.
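
On the consumer side, the query string arrives as a JSON-encoded value in the `x-plt-kafka-hooks-query-string` Kafka header. A sketch of decoding it (the header name comes from this document; the surrounding message shape and values are illustrative):

```python
import json

# Kafka message headers as a consumer might receive them (illustrative values)
headers = {
    "x-plt-kafka-hooks-query-string": '{"q": "coffee", "limit": "10", "sort": "price"}',
    "content-type": "application/json",
}

# The query string is delivered as a JSON string; decode it into a dict
query = json.loads(headers["x-plt-kafka-hooks-query-string"])
print(query["q"], query["limit"], query["sort"])  # coffee 10 price
```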


#### Usage Example

#### Request Headers

Request messages automatically include these headers when published to Kafka:

| Header                           | Description                            | When Included                 |
| -------------------------------- | -------------------------------------- | ----------------------------- |
| `x-plt-kafka-hooks-query-string` | JSON string of query string parameters | When query parameters present |
| `content-type`                   | Content type of the request            | Always                        |

#### Response Headers

Response messages support these special headers:

| Header          | Description                       | Default   |
| --------------- | --------------------------------- | --------- |
| `x-status-code` | HTTP status code for the response | `200`     |
| `content-type`  | Content type of the response      | Preserved |

#### Error Handling

**Timeout Response:**
If no response is received within the configured timeout, the HTTP request fails with a timeout error.

**Missing Correlation ID:**
Responses without correlation IDs are logged as warnings and ignored.

**No Pending Request:**
Responses for non-existent correlation IDs are logged as warnings and ignored.

#### Use Cases

- **Microservice Communication**: Route requests through Kafka for reliable delivery
- **Async Processing**: Handle long-running tasks with an HTTP-like interface

You can then edit your `.env` file and configure the `PLT_KAFKA_BROKER` env variable to select your Kafka broker.

### Requirements

You'll need a Kafka server running. If you don't have one, you can use this `docker-compose.yml` file as a starter:

```yaml
---
services:
  kafka:
    # …
    KAFKA_LOG_DIRS: '/tmp/kraft-combined-logs'
```

## License

Apache-2.0
