Commit df6f663

Merge pull request #579 from alexrudall/feat/responses-endpoints

Add other Responses endpoints

2 parents: 9e236ff + ccb3e0e

16 files changed: +877 −151 lines

README.md

Lines changed: 25 additions & 11 deletions

````diff
@@ -468,28 +468,28 @@ You can stream it as well!
 ### Responses API
 OpenAI's most advanced interface for generating model responses. Supports text and image inputs, and text outputs. Create stateful interactions with the model, using the output of previous responses as input. Extend the model's capabilities with built-in tools for file search, web search, computer use, and more. Allow the model access to external systems and data using function calling.
 
+#### Create a Response
 ```ruby
-response = client.responses(parameters: {
+response = client.responses.create(parameters: {
   model: "gpt-4o",
   input: "Hello!"
 })
-
 puts response.dig("output", 0, "content", 0, "text")
 ```
-#### Follow-up Messages (former threads functionality available in the Assistant API)
+
+#### Follow-up Messages
 ```ruby
-followup = client.responses(parameters: {
+followup = client.responses.create(parameters: {
   model: "gpt-4o",
   input: "Remind me, what is my name?",
   previous_response_id: response["id"]
 })
-
 puts followup.dig("output", 0, "content", 0, "text")
 ```
 
 #### Tool Calls
 ```ruby
-response = client.responses(parameters: {
+response = client.responses.create(parameters: {
   model: "gpt-4o",
   input: "What's the weather in Paris?",
   tools: [
@@ -510,29 +510,43 @@ response = client.responses(parameters: {
     }
   ]
 })
-
 puts response.dig("output", 0, "name") # => "get_current_weather"
 ```
 
 #### Streaming
 ```ruby
 chunks = []
 streamer = proc { |chunk, _| chunks << chunk }
-
-client.responses(parameters: {
+client.responses.create(parameters: {
   model: "gpt-4o",
   input: "Hello!",
   stream: streamer
 })
-
 output = chunks
   .select { |c| c["type"] == "response.output_text.delta" }
   .map { |c| c["delta"] }
   .join
-
 puts output
 ```
 
+#### Retrieve a Response
+```ruby
+retrieved_response = client.responses.retrieve(response_id: response["id"])
+puts retrieved_response["object"] # => "response"
+```
+
+#### Delete a Response
+```ruby
+deletion = client.responses.delete(response_id: response["id"])
+puts deletion["deleted"] # => true
+```
+
+#### List Input Items
+```ruby
+input_items = client.responses.input_items(response_id: response["id"])
+puts input_items["object"] # => "list"
+```
+
 ### Functions
 
 You can describe and pass in functions and the model will intelligently choose to output a JSON object containing arguments to call them - eg., to use your method `get_current_weather` to get the weather in a given location. Note that tool_choice is optional, but if you exclude it, the model will choose whether to use the function or not ([see here](https://platform.openai.com/docs/api-reference/chat/create#chat-create-tool_choice)).
````
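The `tools` array that the Functions paragraph above refers to can be sketched as a plain Ruby hash following the OpenAI function-calling schema. The `get_current_weather` function and its single `location` parameter are illustrative stand-ins mirroring the README's own example, not part of this diff:

```ruby
# Sketch of the `tools` and `tool_choice` parameters described above.
# The function name and JSON-schema body are illustrative only.
tools = [
  {
    type: "function",
    function: {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: { type: "string", description: "City name, e.g. Paris" }
        },
        required: ["location"]
      }
    }
  }
]

# tool_choice is optional; omitting it lets the model decide whether
# to call the function. This form forces a specific function call.
tool_choice = { type: "function", function: { name: "get_current_weather" } }

puts tools.first.dig(:function, :name) # => get_current_weather
```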

lib/openai.rb

Lines changed: 1 addition & 0 deletions

```diff
@@ -7,6 +7,7 @@
 require_relative "openai/finetunes"
 require_relative "openai/images"
 require_relative "openai/models"
+require_relative "openai/responses"
 require_relative "openai/assistants"
 require_relative "openai/threads"
 require_relative "openai/messages"
```

lib/openai/client.rb

Lines changed: 4 additions & 4 deletions

```diff
@@ -32,10 +32,6 @@ def completions(parameters: {})
     json_post(path: "/completions", parameters: parameters)
   end
 
-  def responses(parameters: {})
-    json_post(path: "/responses", parameters: parameters)
-  end
-
   def audio
     @audio ||= OpenAI::Audio.new(client: self)
   end
@@ -56,6 +52,10 @@ def models
     @models ||= OpenAI::Models.new(client: self)
   end
 
+  def responses
+    @responses ||= OpenAI::Responses.new(client: self)
+  end
+
   def assistants
     @assistants ||= OpenAI::Assistants.new(client: self)
   end
```
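The new accessor replaces the old `responses(parameters:)` method with the same `||=` memoization pattern used by the surrounding `audio`, `models`, and `assistants` accessors: the wrapper object is built once per client and reused on every later call. A minimal stand-in (the `FakeClient` and `FakeResponses` names are hypothetical, not part of the gem) demonstrates the behavior:

```ruby
# Stand-in for the OpenAI::Responses wrapper; holds a back-reference to its client.
class FakeResponses
  attr_reader :client

  def initialize(client:)
    @client = client
  end
end

# Stand-in for OpenAI::Client showing the memoized accessor pattern.
class FakeClient
  def responses
    # ||= builds the wrapper on first call and returns the cached one afterwards.
    @responses ||= FakeResponses.new(client: self)
  end
end

client = FakeClient.new
first  = client.responses
second = client.responses
puts first.equal?(second) # => true (same object, constructed once)
```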

lib/openai/responses.rb

Lines changed: 23 additions & 0 deletions

```diff
@@ -0,0 +1,23 @@
+module OpenAI
+  class Responses
+    def initialize(client:)
+      @client = client
+    end
+
+    def create(parameters: {})
+      @client.json_post(path: "/responses", parameters: parameters)
+    end
+
+    def retrieve(response_id:)
+      @client.get(path: "/responses/#{response_id}")
+    end
+
+    def delete(response_id:)
+      @client.delete(path: "/responses/#{response_id}")
+    end
+
+    def input_items(response_id:, parameters: {})
+      @client.get(path: "/responses/#{response_id}/input_items", parameters: parameters)
+    end
+  end
+end
```
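Since each method in the new class simply delegates to the client with a fixed path, its routing can be exercised without any HTTP by pairing the class (reproduced verbatim from the diff) with a stub client that records calls. The `StubClient` class and the `resp_123` id are illustrative only:

```ruby
# The Responses class from this diff, reproduced verbatim.
module OpenAI
  class Responses
    def initialize(client:)
      @client = client
    end

    def create(parameters: {})
      @client.json_post(path: "/responses", parameters: parameters)
    end

    def retrieve(response_id:)
      @client.get(path: "/responses/#{response_id}")
    end

    def delete(response_id:)
      @client.delete(path: "/responses/#{response_id}")
    end

    def input_items(response_id:, parameters: {})
      @client.get(path: "/responses/#{response_id}/input_items", parameters: parameters)
    end
  end
end

# Hypothetical stand-in for OpenAI::Client; records each request path
# instead of making a real HTTP call.
class StubClient
  attr_reader :paths

  def initialize
    @paths = []
  end

  def json_post(path:, parameters: {})
    @paths << path
  end

  def get(path:, parameters: {})
    @paths << path
  end

  def delete(path:)
    @paths << path
  end
end

stub = StubClient.new
responses = OpenAI::Responses.new(client: stub)
responses.create(parameters: { model: "gpt-4o", input: "Hello!" })
responses.retrieve(response_id: "resp_123")
responses.input_items(response_id: "resp_123")
responses.delete(response_id: "resp_123")

puts stub.paths.inspect
# => ["/responses", "/responses/resp_123", "/responses/resp_123/input_items", "/responses/resp_123"]
```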

spec/fixtures/cassettes/gpt-4o_responsesapi_responses.yml renamed to spec/fixtures/cassettes/responses_create.yml

Lines changed: 12 additions & 12 deletions (generated cassette, not rendered)

Lines changed: 29 additions & 29 deletions (generated cassette, not rendered)
