Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatFunctionsCall/Stream/ChatFunctionsCallStreamProvider.swift
Examples/SwiftOpenAIExample/SwiftOpenAIExample/ChatStreamFluidConversationDemo/ChatFluidConversationProvider.swift
Sources/OpenAI/Public/ResponseModels/Chat/ChatCompletionChunkObject.swift
+13 −13 (13 additions, 13 deletions)
@@ -13,29 +13,29 @@ public struct ChatCompletionChunkObject: Decodable {
   /// A unique identifier for the chat completion chunk.
   public let id: String?
   /// A list of chat completion choices. Can be more than one if n is greater than 1.
-  public let choices: [ChatChoice]
+  public let choices: [ChatChoice]?
   /// The Unix timestamp (in seconds) of when the chat completion chunk was created.
-  public let created: Int
+  public let created: Int?
   /// The model to generate the completion.
-  public let model: String
+  public let model: String?
   /// The service tier used for processing the request. This field is only included if the service_tier parameter is specified in the request.
   public let serviceTier: String?
   /// This fingerprint represents the backend configuration that the model runs with.
   /// Can be used in conjunction with the seed request parameter to understand when backend changes have been made that might impact determinism.
   public let systemFingerprint: String?
   /// The object type, which is always chat.completion.chunk.
-  public let object: String
+  public let object: String?
   /// An optional field that will only be present when you set stream_options: {"include_usage": true} in your request. When present, it contains a null value except for the last chunk which contains the token usage statistics for the entire request.
   public let usage: ChatUsage?

   public struct ChatChoice: Decodable {

      /// A chat completion delta generated by streamed model responses.
-     public let delta: Delta
+     public let delta: Delta?
      /// The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, length if the maximum number of tokens specified in the request was reached, content_filter if content was omitted due to a flag from our content filters, tool_calls if the model called a tool, or function_call (deprecated) if the model called a function.
      public let finishReason: IntOrStringValue?
      /// The index of the choice in the list of choices.
-     public let index: Int
+     public let index: Int?
      /// Provided by the Vision API.
      public let finishDetails: FinishDetails?
      /// Log probability information for the choice.
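
With choices, delta, and index now optional, stream consumers have to unwrap them before accumulating text. Below is a minimal consuming-side sketch, not the example app's actual code: it assumes Delta exposes an optional content string (a property not shown in this diff), and handle(_:) / accumulatedText are illustrative names.

   // Sketch: accumulate streamed text from chunks whose fields are now optional.
   // Assumes `Delta` has an optional `content: String?` property (not part of this diff).
   import SwiftOpenAI

   var accumulatedText = ""

   func handle(_ chunk: ChatCompletionChunkObject) {
      // A chunk may arrive without choices, e.g. a final usage-only chunk.
      guard let choices = chunk.choices else { return }
      for choice in choices {
         if let piece = choice.delta?.content {
            accumulatedText += piece
         }
      }
   }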
@@ -69,18 +69,18 @@ public struct ChatCompletionChunkObject: Decodable {

      public struct LogProb: Decodable {
         /// A list of message content tokens with log probability information.
-        let content: [TokenDetail]
+        let content: [TokenDetail]?
      }

      public struct TokenDetail: Decodable {
         /// The token.
-        let token: String
+        let token: String?
         /// The log probability of this token.
-        let logprob: Double
+        let logprob: Double?
         /// A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no bytes representation for the token.
         let bytes: [Int]?
         /// List of the most likely tokens and their log probability, at this token position. In rare cases, there may be fewer than the number of requested top_logprobs returned.
-        let topLogprobs: [TopLogProb]
+        let topLogprobs: [TopLogProb]?

         enum CodingKeys: String, CodingKey {
            case token, logprob, bytes
@@ -89,17 +89,17 @@ public struct ChatCompletionChunkObject: Decodable {

         struct TopLogProb: Decodable {
            /// The token.
-           let token: String
+           let token: String?
            /// The log probability of this token.
-           let logprob: Double
+           let logprob: Double?
            /// A list of integers representing the UTF-8 bytes representation of the token. Useful in instances where characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no bytes representation for the token.
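
Taken together, all of the chunk's stored properties shown in this diff are now optional, so a sparse streaming payload decodes instead of throwing. A rough illustration of that effect; the JSON string below is made up for the example and is not a real API response.

   // Sketch: a minimal payload decodes now that the remaining fields are optional.
   // The JSON is illustrative only.
   import Foundation
   import SwiftOpenAI

   let data = Data(#"{"id": "chunk_1", "object": "chat.completion.chunk"}"#.utf8)
   if let chunk = try? JSONDecoder().decode(ChatCompletionChunkObject.self, from: data) {
      print(chunk.model ?? "model not provided")   // absent fields no longer fail decoding
      print(chunk.choices?.count ?? 0)             // 0
   }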