

Add stream_options param; add usage in ChatCompletionChunk
brianz-openai committed Apr 23, 2024
1 parent afe0b7c commit 2b7d46e
Showing 1 changed file with 39 additions and 0 deletions.
39 changes: 39 additions & 0 deletions openapi.yaml
@@ -7145,6 +7145,17 @@ components:
         type: boolean
         nullable: true
         default: false
+      stream_options:
+        description: Options for streaming response.
+        type: object
+        nullable: true
+        default: null
+        properties:
+          include_usage:
+            type: boolean
+            description: |
+              If set, an extra chunk will be returned before the `data: [DONE]` message, with a `usage` field showing usage data for the entire streamed request.
+              All other chunks will also include a `usage` field, but with a null value. In the case of an error, the extra (usage) chunk may be missing.
       suffix:
         description: |
           The suffix that comes after a completion of inserted text.
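With the schema above, a request that opts into usage reporting would set `stream: true` together with `stream_options.include_usage`. A minimal sketch of such a request body (the model name and prompt are illustrative, not taken from the spec):

```python
import json

# Hypothetical request body for a streamed completion that also reports usage.
payload = {
    "model": "example-model",  # illustrative model name
    "prompt": "Say hello",
    "stream": True,  # stream_options applies to streaming responses
    "stream_options": {"include_usage": True},  # ask for the final usage chunk
}
body = json.dumps(payload)
```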
@@ -7822,6 +7833,17 @@ components:
         type: boolean
         nullable: true
         default: false
+      stream_options:
+        description: Options for streaming response.
+        type: object
+        nullable: true
+        default: null
+        properties:
+          include_usage:
+            type: boolean
+            description: |
+              If set, an extra chunk will be returned before the `data: [DONE]` message, with a `usage` field showing usage data for the entire streamed request.
+              All other chunks will also include a `usage` field, but with a null value. In the case of an error, the extra (usage) chunk may be missing.
       temperature:
         type: number
         minimum: 0
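Per the description above, every chunk carries a `usage` field that is null except on the extra chunk sent just before `data: [DONE]`. A client-side sketch of scanning a server-sent-event stream for that final usage chunk (the sample chunk lines below are illustrative, not captured API output):

```python
import json

def extract_usage(sse_lines):
    """Scan raw SSE lines and return the usage dict from the final usage chunk.

    All chunks except the last carry "usage": null, and the stream ends with
    the sentinel line "data: [DONE]".
    """
    usage = None
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        if chunk.get("usage") is not None:
            usage = chunk["usage"]
    return usage

# Illustrative stream: two content chunks with null usage, then the usage chunk.
stream = [
    'data: {"object": "chat.completion.chunk", "choices": [{"delta": {"content": "Hi"}}], "usage": null}',
    'data: {"object": "chat.completion.chunk", "choices": [{"delta": {}}], "usage": null}',
    'data: {"object": "chat.completion.chunk", "choices": [], "usage": {"prompt_tokens": 9, "completion_tokens": 2, "total_tokens": 11}}',
    "data: [DONE]",
]
```

Note that per the description, the usage chunk may be missing when an error occurs, so `extract_usage` can return `None`.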
@@ -8122,6 +8144,23 @@ components:
       object:
         type: string
         description: The object type, which is always `chat.completion.chunk`.
         enum: [chat.completion.chunk]
+      usage:
+        type: object
+        description: Usage statistics for the streamed completion request. Value is null except for the last chunk.
+        properties:
+          completion_tokens:
+            type: integer
+            description: Number of tokens in the generated completion.
+          prompt_tokens:
+            type: integer
+            description: Number of tokens in the prompt.
+          total_tokens:
+            type: integer
+            description: Total number of tokens used in the request (prompt + completion).
+        required:
+          - prompt_tokens
+          - completion_tokens
+          - total_tokens
     required:
       - choices
       - created
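The `usage` schema above requires all three token counts, and `total_tokens` is defined as the sum of the prompt and completion counts. A hypothetical sanity check a client might run on the usage object from the final chunk:

```python
def check_usage(usage):
    """Validate a usage dict against the schema's required fields and its
    stated invariant: total_tokens = prompt_tokens + completion_tokens."""
    required = {"prompt_tokens", "completion_tokens", "total_tokens"}
    missing = required - usage.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if usage["prompt_tokens"] + usage["completion_tokens"] != usage["total_tokens"]:
        raise ValueError("total_tokens does not equal prompt + completion")
    return True
```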
