The Batches endpoint allows you to create and manage large batches of API requests to run asynchronously. Currently, only the `/v1/chat/completions` endpoint is supported for batches.
To use the Batches endpoint, you need to first upload a JSONL file containing the batch requests using the Files endpoint. The file must be uploaded with the purpose set to `batch`. Each line in the JSONL file represents a single request and should have the following format:

```json
{
  "custom_id": "request-1",
  "method": "POST",
  "url": "/v1/chat/completions",
  "body": {
    "model": "gpt-3.5-turbo",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "What is 2+2?" }
    ]
  }
}
```
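
The JSONL payload can also be assembled programmatically. A minimal sketch, assuming a list of questions to batch (the file name, request IDs, and question contents here are illustrative; the commented-out upload call requires a configured client):

```ruby
require "json"

# Illustrative inputs: one chat completion request per question.
requests = [
  { id: "request-1", question: "What is 2+2?" },
  { id: "request-2", question: "What is the capital of France?" }
]

# Each JSONL line is one complete JSON object in the format shown above.
jsonl = requests.map do |req|
  JSON.generate(
    custom_id: req[:id],
    method: "POST",
    url: "/v1/chat/completions",
    body: {
      model: "gpt-3.5-turbo",
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: req[:question] }
      ]
    }
  )
end.join("\n")

File.write("batch_requests.jsonl", jsonl)

# Then upload it with purpose "batch" before creating the batch:
# file = client.files.upload(parameters: { file: "batch_requests.jsonl", purpose: "batch" })
# input_file_id = file["id"]
```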
Once you have uploaded the JSONL file, you can create a new batch by providing the file ID, endpoint, and completion window:

```ruby
response = client.batches.create(
  parameters: {
    input_file_id: "file-abc123",
    endpoint: "/v1/chat/completions",
    completion_window: "24h"
  }
)
batch_id = response["id"]
```
You can retrieve information about a specific batch using its ID:

```ruby
batch = client.batches.retrieve(id: batch_id)
```
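
Because batches run asynchronously, callers typically poll `retrieve` until the batch reaches a terminal status. A sketch with an injectable fetch step so the loop can be exercised without network calls (the helper name and the exact set of terminal statuses are assumptions, not part of the gem's API):

```ruby
# Statuses after which polling should stop (assumed terminal set).
TERMINAL_STATUSES = %w[completed failed expired cancelled].freeze

# Hypothetical helper: `fetch` is any callable returning the batch hash,
# e.g. -> { client.batches.retrieve(id: batch_id) } in a real run.
def wait_for_batch(fetch, interval: 0, max_attempts: 10)
  max_attempts.times do
    batch = fetch.call
    return batch if TERMINAL_STATUSES.include?(batch["status"])
    sleep interval
  end
  raise "batch did not finish within #{max_attempts} attempts"
end

# Stubbed fetch that completes on the third call, for illustration:
statuses = %w[validating in_progress completed]
stub = -> { { "status" => statuses.shift } }
result = wait_for_batch(stub)
```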
To cancel a batch that is in progress:

```ruby
client.batches.cancel(id: batch_id)
```
You can also list all batches:

```ruby
client.batches.list
```
Once `batch["completed_at"]` is present, you can fetch the output or error files:
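
The output file is itself JSONL, with one result object per request. A sketch of parsing it and matching results back to inputs by `custom_id`, assuming the content is downloaded via the Files endpoint (the sample line below and the indexing step are illustrative):

```ruby
require "json"

# In a real run, download the content with a configured client, e.g.:
# raw = client.files.content(id: batch["output_file_id"])
# (and similarly batch["error_file_id"] for failed requests).
# Illustrative sample of one output line:
raw = <<~JSONL
  {"custom_id":"request-1","response":{"status_code":200,"body":{"choices":[{"message":{"content":"2+2 equals 4."}}]}},"error":null}
JSONL

# Index each result by custom_id so it can be matched to the input request.
results = raw.each_line
             .map { |line| JSON.parse(line) }
             .to_h { |r| [r["custom_id"], r] }

answer = results.dig("request-1", "response", "body",
                     "choices", 0, "message", "content")
```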