Commit 7018370

Add Assistants README fixes
1 parent 89b72ba commit 7018370

File tree

1 file changed: +14 −15 lines changed


README.md

Lines changed: 14 additions & 15 deletions
@@ -342,12 +342,12 @@ puts response.dig("choices", 0, "message", "content")
 
 #### JSON Mode
 
-You can set the response_format to ask for responses in JSON (at least for `gpt-3.5-turbo-1106`):
+You can set the response_format to ask for responses in JSON:
 
 ```ruby
 response = client.chat(
     parameters: {
-        model: "gpt-3.5-turbo-1106",
+        model: "gpt-3.5-turbo",
         response_format: { type: "json_object" },
         messages: [{ role: "user", content: "Hello! Give me some JSON please."}],
         temperature: 0.7,
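With JSON mode enabled as above, the model's reply still arrives as a JSON *string* inside the message content, which the caller must parse. A minimal sketch of that last step — the `response` hash below is a hypothetical example shaped like a chat-completion response, not real API output:

```ruby
require "json"

# Hypothetical chat-completion response in JSON mode (made-up content).
response = {
  "choices" => [
    { "message" => { "content" => '{"greeting": "Hello!", "numbers": [1, 2, 3]}' } }
  ]
}

# The content field is a JSON string; parse it into a Ruby hash.
raw = response.dig("choices", 0, "message", "content")
data = JSON.parse(raw)
puts data["greeting"] # => Hello!
```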
@@ -367,7 +367,7 @@ You can stream it as well!
 ```ruby
 response = client.chat(
     parameters: {
-        model: "gpt-3.5-turbo-1106",
+        model: "gpt-3.5-turbo",
         messages: [{ role: "user", content: "Can I have some JSON please?"}],
         response_format: { type: "json_object" },
         stream: proc do |chunk, _bytesize|
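The `stream:` proc above receives one chunk per delta, so a common pattern is to accumulate the pieces into a buffer. A sketch under the assumption that chunks follow the usual streaming-delta shape — the `fake_chunks` below are invented stand-ins, no API call is made:

```ruby
# Accumulate streamed deltas into one string, as a stream proc would.
buffer = +""
collector = proc do |chunk, _bytesize|
  delta = chunk.dig("choices", 0, "delta", "content")
  buffer << delta if delta
end

# Made-up chunks mimicking the streaming-delta shape.
fake_chunks = [
  { "choices" => [{ "delta" => { "content" => '{"ok":' } }] },
  { "choices" => [{ "delta" => { "content" => " true}" } }] },
  { "choices" => [{ "delta" => {} }] } # final chunk carries no content
]
fake_chunks.each { |c| collector.call(c, 0) }
puts buffer # => {"ok": true}
```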
@@ -564,7 +564,7 @@ These files are in JSONL format, with each line representing the output or error
     "id": "chatcmpl-abc123",
     "object": "chat.completion",
     "created": 1677858242,
-    "model": "gpt-3.5-turbo-0301",
+    "model": "gpt-3.5-turbo",
     "choices": [
       {
         "index": 0,
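Since these batch files are JSONL (one JSON object per line), reading them back is a line-by-line parse. A minimal sketch — the two records below are fabricated examples, not real batch output:

```ruby
require "json"

# Hypothetical batch output: one JSON object per line (JSONL).
jsonl = <<~LINES
  {"id": "chatcmpl-abc123", "object": "chat.completion", "model": "gpt-3.5-turbo"}
  {"id": "chatcmpl-def456", "object": "chat.completion", "model": "gpt-3.5-turbo"}
LINES

# Parse each line independently; a bad line fails alone, not the whole file.
records = jsonl.each_line.map { |line| JSON.parse(line) }
puts records.map { |r| r["id"] }.inspect # => ["chatcmpl-abc123", "chatcmpl-def456"]
```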
@@ -660,16 +660,19 @@ To create a new assistant:
 ```ruby
 response = client.assistants.create(
     parameters: {
-        model: "gpt-3.5-turbo-1106", # Retrieve via client.models.list. Assistants need 'gpt-3.5-turbo-1106' or later.
+        model: "gpt-3.5-turbo",
         name: "OpenAI-Ruby test assistant",
         description: nil,
-        instructions: "You are a helpful assistant for coding a OpenAI API client using the OpenAI-Ruby gem.",
+        instructions: "You are a Ruby dev bot. When asked a question, write and run Ruby code to answer the question",
         tools: [
-            { type: 'retrieval' }, # Allow access to files attached using file_ids
-            { type: 'code_interpreter' }, # Allow access to Python code interpreter
+            { type: "code_interpreter" },
         ],
-        "file_ids": ["file-123"], # See Files section above for how to upload files
-        "metadata": { my_internal_version_id: '1.0.0' }
+        tool_resources: {
+            "code_interpreter": {
+                "file_ids": [] # See Files section above for how to upload files
+            }
+        },
+        "metadata": { my_internal_version_id: "1.0.0" }
     })
 assistant_id = response["id"]
 ```
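The key structural change in this hunk is that per-tool file IDs move from a top-level `file_ids` key into `tool_resources`. A sketch that just builds the new parameter hash locally and checks its shape — no client or API call involved, and the values are the illustrative ones from the diff:

```ruby
# The new assistants.create parameter shape, built as a plain hash.
params = {
  model: "gpt-3.5-turbo",
  name: "OpenAI-Ruby test assistant",
  tools: [{ type: "code_interpreter" }],
  tool_resources: {
    code_interpreter: {
      file_ids: [] # IDs from client.files.upload would go here
    }
  },
  metadata: { my_internal_version_id: "1.0.0" }
}

# file_ids now live under tool_resources per tool,
# not in a top-level file_ids parameter as before.
puts params.dig(:tool_resources, :code_interpreter, :file_ids).inspect # => []
```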
@@ -851,11 +854,7 @@ client.runs.list(thread_id: thread_id, parameters: { order: "asc", limit: 3 })
 You can also create a thread and run in one call like this:
 
 ```ruby
-response = client.threads.create_and_run(
-    parameters: {
-        model: 'gpt-3.5-turbo',
-        messages: [{ role: 'user', content: "What's deep learning?"}]
-    })
+response = client.runs.create_thread_and_run(parameters: { assistant_id: assistant_id })
 run_id = response['id']
 thread_id = response['thread_id']
 ```
