Commit fa5f3db

update readme to describe how to use functions
1 parent a956cd8 commit fa5f3db

File tree

1 file changed: +89 −1 lines changed


README.md

Lines changed: 89 additions & 1 deletion
@@ -82,6 +82,88 @@

```lua
local response = chat:send("What's the most boring color?", function(chunk)
end)
```

## Chat Session With Functions

OpenAI allows [sending a list of function
declarations](https://openai.com/blog/function-calling-and-other-api-updates)
that the LLM can decide to call based on the prompt. The function calling
interface must be used with chat completions and the `gpt-4-0613` or
`gpt-3.5-turbo-0613` models or later.

Here's a quick example of how to use functions in a chat exchange. First you
will need to create a chat session with the `functions` option containing an
array of available functions.

> The functions are stored on the `functions` field on the chat object. If the
> functions need to be adjusted for future messages, the field can be modified.

```lua
local chat = openai:new_chat_session({
  model = "gpt-3.5-turbo-0613",
  functions = {
    {
      name = "add",
      description = "Add two numbers together",
      parameters = {
        type = "object",
        properties = {
          a = { type = "number" },
          b = { type = "number" }
        }
      }
    }
  }
})
```

Any prompt you send will be aware of all available functions, and may request
any of them to be called. If the response contains a function call request,
then an object will be returned instead of the standard string return value.

```lua
local res = chat:send("Using the provided function, calculate the sum of 2923 + 20839")

if type(res) == "table" and res.function_call then
  -- The function_call object has the following fields:
  --   function_call.name --> name of function to be called
  --   function_call.arguments --> A string in JSON format that should match the parameter specification
  -- Note that res may also include a content field if the LLM produced a textual output as well

  local cjson = require "cjson"
  local arguments = cjson.decode(res.function_call.arguments)
  call_my_function(res.function_call.name, arguments)
end
```
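The `call_my_function` helper above is left undefined; here is a minimal sketch of one possible dispatcher. The handler table and its entries are illustrative, not part of this library:

```lua
-- Hypothetical dispatch table mapping function names to implementations;
-- the "add" entry mirrors the declaration from the session setup above
local handlers = {
  add = function(args)
    return args.a + args.b
  end
}

-- Look up the named handler and invoke it with the decoded arguments;
-- returns nil plus an error message when the LLM names an unknown function
local function call_my_function(name, arguments)
  local fn = handlers[name]
  if not fn then
    return nil, "unknown function: " .. tostring(name)
  end
  return fn(arguments)
end

print(call_my_function("add", { a = 2923, b = 20839 })) -- 23762
```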

Finally, you can evaluate the function and send the result back to the client
so it can resume operation:

> Since the LLM can hallucinate every part of the function call, you'll want to
> do robust type validation to ensure that function name and arguments match
> what you expect.

```lua
local name, arguments = ... -- the name and arguments extracted from above

if name == "add" then
  local value = arguments.a + arguments.b

  -- send the response back to the chat bot using a `role = "function"` message

  local cjson = require "cjson"

  local res = chat:send({
    role = "function",
    name = name,
    content = cjson.encode(value)
  })

  print(res) -- Print the final output
else
  error("Unknown function: " .. name)
end
```
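The warning above about hallucinated calls can be made concrete with a small check before evaluating the function. A minimal sketch for the `add` declaration used in these examples (the validator name is illustrative):

```lua
-- Hypothetical validator for the function call payload: confirms the
-- function name and that the decoded arguments match the declared schema
local function validate_add_call(name, arguments)
  if name ~= "add" then
    return false, "unexpected function name: " .. tostring(name)
  end
  if type(arguments) ~= "table" then
    return false, "arguments must be a table"
  end
  if type(arguments.a) ~= "number" or type(arguments.b) ~= "number" then
    return false, "a and b must both be numbers"
  end
  return true
end

print(validate_add_call("add", { a = 1, b = 2 })) -- true
```

Remember that `function_call.arguments` is model-generated JSON, so fields may be missing or have the wrong type even when decoding succeeds.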

## Streaming Response Example

Under normal circumstances the API will wait until the entire response is
@@ -202,6 +284,7 @@

Constructor for the ChatSession.

- `client`: An instance of the OpenAI client.
- `opts`: An optional table of options.
  - `messages`: An initial array of chat messages
  - `functions`: A list of function declarations
  - `temperature`: temperature setting
  - `model`: Which chat completion model to use, eg. `gpt-4`, `gpt-3.5-turbo`
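A hedged sketch of an `opts` table combining the fields listed above; every value here is an example, not a default:

```lua
-- Illustrative options table for new_chat_session; the model name,
-- temperature, and messages are placeholder values
local opts = {
  model = "gpt-3.5-turbo-0613",
  temperature = 0.8,
  messages = {
    { role = "system", content = "You are a concise assistant" }
  },
  functions = {
    { name = "add", description = "Add two numbers together" }
  }
}

-- would be passed as: local chat = openai:new_chat_session(opts)
print(opts.model) -- gpt-3.5-turbo-0613
```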

@@ -221,10 +304,15 @@

Appends a message to the chat history and triggers a completion with
`generate_response` and returns the response as a string. On failure, returns
`nil`, an error message, and the raw request response.

If the response includes a `function_call`, then the entire message object is
returned instead of a string of the content. You can return the result of the
function by passing a `role = "function"` object to the `send` method.

- `message`: A message object or a string.
- `stream_callback`: (optional) A function to enable streaming output.

By providing a `stream_callback`, the request will run in streaming mode. This
function receives chunks as they are parsed from the response.

These chunks have the following format:
