Commit a4f4b9c ("Update README"), parent 9de6a16

1 file changed: README.md (25 additions, 11 deletions)
### Responses API

OpenAI's most advanced interface for generating model responses. Supports text and image inputs, and text outputs. Create stateful interactions with the model, using the output of previous responses as input. Extend the model's capabilities with built-in tools for file search, web search, computer use, and more. Allow the model access to external systems and data using function calling.

#### Create a Response

```ruby
response = client.responses.create(parameters: {
  model: "gpt-4o",
  input: "Hello!"
})

puts response.dig("output", 0, "content", 0, "text")
```
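
The `dig` call above walks the nested output array of the response. A minimal sketch of the shape it assumes, using an illustrative hash rather than a real API payload:

```ruby
# Hypothetical response hash shaped like a Responses API payload (illustrative values)
response = {
  "id" => "resp_abc123",
  "object" => "response",
  "output" => [
    {
      "type" => "message",
      "content" => [
        { "type" => "output_text", "text" => "Hello! How can I help you?" }
      ]
    }
  ]
}

# Same extraction as above: first output item, first content part, its text
puts response.dig("output", 0, "content", 0, "text")
```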

#### Follow-up Messages

```ruby
followup = client.responses.create(parameters: {
  model: "gpt-4o",
  input: "Remind me, what is my name?",
  previous_response_id: response["id"]
})

puts followup.dig("output", 0, "content", 0, "text")
```

#### Tool Calls

```ruby
response = client.responses.create(parameters: {
  model: "gpt-4o",
  input: "What's the weather in Paris?",
  tools: [
    # ... (tool definition omitted in this excerpt)
  ]
})

puts response.dig("output", 0, "name") # => "get_current_weather"
```
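
The diff hunk elides the tool definition inside `tools`. One plausible shape, following the flat function-tool schema the Responses API uses; the name, description, and parameter fields here are assumptions for illustration:

```ruby
# Hypothetical function-tool definition; field values are illustrative
weather_tool = {
  "type" => "function",
  "name" => "get_current_weather",
  "description" => "Get the current weather in a given location",
  "parameters" => {
    "type" => "object",
    "properties" => {
      "location" => { "type" => "string", "description" => "e.g. Paris, France" },
      "unit" => { "type" => "string", "enum" => ["celsius", "fahrenheit"] }
    },
    "required" => ["location"]
  }
}

puts weather_tool["name"] # => "get_current_weather"
```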

#### Streaming

```ruby
chunks = []
streamer = proc { |chunk, _| chunks << chunk }

client.responses.create(parameters: {
  model: "gpt-4o",
  input: "Hello!",
  stream: streamer
})

output = chunks
  .select { |c| c["type"] == "response.output_text.delta" }
  .map { |c| c["delta"] }
  .join

output puts output
```
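
The select-map-join step above can be exercised offline. The sample events below only mimic the `response.output_text.delta` chunk shape; they are fabricated data for illustration:

```ruby
# Fabricated stream events mimicking Responses API chunks (illustrative data)
chunks = [
  { "type" => "response.created" },
  { "type" => "response.output_text.delta", "delta" => "Bonjour" },
  { "type" => "response.output_text.delta", "delta" => " !" },
  { "type" => "response.completed" }
]

# Keep only the text deltas and stitch them back together in order
output = chunks
  .select { |c| c["type"] == "response.output_text.delta" }
  .map { |c| c["delta"] }
  .join

puts output # => "Bonjour !"
```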

#### Retrieve a Response

```ruby
retrieved_response = client.responses.retrieve(response_id: response["id"])

puts retrieved_response["object"] # => "response"
```

#### Delete a Response

```ruby
deletion = client.responses.delete(response_id: response["id"])

puts deletion["deleted"] # => true
```

#### List Input Items

```ruby
input_items = client.responses.input_items(response_id: response["id"])

puts input_items["object"] # => "list"
```

### Functions

You can describe and pass in functions and the model will intelligently choose to output a JSON object containing arguments to call them, e.g. to use your method `get_current_weather` to get the weather in a given location. Note that `tool_choice` is optional, but if you exclude it, the model will choose whether to use the function or not ([see here](https://platform.openai.com/docs/api-reference/chat/create#chat-create-tool_choice)).
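
Once the model returns a tool call, dispatching it to a local Ruby method is plain hash and JSON work. A minimal sketch with a fabricated tool-call hash; the exact payload shape depends on the endpoint you use:

```ruby
require "json"

# Local method the model may ask us to call
def get_current_weather(location:, unit: "celsius")
  "The weather in #{location} is sunny (#{unit})."
end

# Fabricated tool call shaped like a chat-style response (illustrative data)
tool_call = {
  "id" => "call_123",
  "type" => "function",
  "function" => {
    "name" => "get_current_weather",
    "arguments" => '{"location":"Paris","unit":"celsius"}'
  }
}

# Parse the JSON arguments and invoke the matching method by name
args = JSON.parse(tool_call.dig("function", "arguments"), symbolize_names: true)
result = send(tool_call.dig("function", "name"), **args)
puts result # => "The weather in Paris is sunny (celsius)."
```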
