diff --git a/docs/gen-ai/gen-ai-events.md b/docs/gen-ai/gen-ai-events.md
index 1c0d6e0de9..e3b1a40c0a 100644
--- a/docs/gen-ai/gen-ai-events.md
+++ b/docs/gen-ai/gen-ai-events.md
@@ -122,7 +122,7 @@ The event name MUST be `gen_ai.assistant.message`.
|--------------|--------------------------------|----------------------------------------|-------------------------------------------------|-------------------|
| `role` | string | The actual role of the message author as passed in the message. | `"assistant"`, `"bot"` | `Conditionally Required`: if available and if not equal to `assistant` |
| `content` | `AnyValue` | The contents of the assistant message. | `The weather in Paris is rainy and overcast, with temperatures around 57°F` | `Opt-In` |
-| `tool_calls` | [ToolCall](#toolcall-object)[] | The tool calls generated by the model, such as function calls. | `[{"id":"call_mszuSIzqtI65i1wAUOE8w5H4", "function":{"name":"get_weather", "arguments":{"location":"Paris"}}, "type":"function"}]` | `Conditionally Required`: if available |
+| `tool_calls` | [ToolCall](#toolcall-object)[] | The tool calls generated by the model, such as function calls. | `[{"id":"call_mszuSIzqtI65i1wAUOE8w5H4", "function":{"name":"get_weather", "arguments":"{\"location\":\"Paris\"}"}, "type":"function"}]` | `Conditionally Required`: if available |

### `ToolCall` object

|------------|-----------------------------|------------------------------------|-------------------------------------------------|-------------------|
| `id` | string | The id of the tool call | `call_mszuSIzqtI65i1wAUOE8w5H4` | `Required` |
| `type` | string | The type of the tool | `function` | `Required` |
-| `function` | [Function](#function-object)| The function that the model called | `{"name":"get_weather", "arguments":{"location":"Paris"}}` | `Required` |
+| `function` | [Function](#function-object)| The function that the model called | `{"name":"get_weather", "arguments":"{\"location\":\"Paris\"}"}` | `Required` |

### `Function` object

| Body Field | Type | Description | Examples | [Requirement Level](https://opentelemetry.io/docs/specs/semconv/general/attribute-requirement-level/) |
|-------------|------------|----------------------------------------|----------------------------|-------------------|
| `name` | string | The name of the function to call | `get_weather` | `Required` |
-| `arguments` | `AnyValue` | The arguments to pass the the function | `{"location": "Paris"}` | `Opt-In` |
+| `arguments` | `AnyValue` | The arguments to pass to the function | `{\"location\":\"Paris\"}` | `Opt-In` |

## Tool event

@@ -175,7 +175,7 @@ Choice event body has the following fields:
|----------------|--------------------------------|-----------------------------------------------|---------------------------------|-------------------|
| `role` | string | The actual role of the message author as passed in the message. | `"assistant"`, `"bot"` | `Conditionally Required`: if available and if not equal to `assistant` |
| `content` | `AnyValue` | The contents of the assistant message. | `The weather in Paris is rainy and overcast, with temperatures around 57°F` | `Opt-In` |
-| `tool_calls` | [ToolCall](#toolcall-object)[] | The tool calls generated by the model, such as function calls. | `[{"id":"call_mszuSIzqtI65i1wAUOE8w5H4", "function":{"name":"get_weather", "arguments":{"location":"Paris"}}, "type":"function"}]` | `Conditionally Required`: if available |
+| `tool_calls` | [ToolCall](#toolcall-object)[] | The tool calls generated by the model, such as function calls. | `[{"id":"call_mszuSIzqtI65i1wAUOE8w5H4", "function":{"name":"get_weather", "arguments":"{\"location\":\"Paris\"}"}, "type":"function"}]` | `Conditionally Required`: if available |

## Custom events

@@ -309,7 +309,7 @@ Here's the telemetry generated for each step in this scenario:
| Property | Value |
|---------------------|-------------------------------------------------------|
| `gen_ai.system` | `"openai"` |
- | Event body (with content) | `{"index":0,"finish_reason":"tool_calls","message":{"tool_calls":[{"id":"call_VSPygqKTWdrhaFErNvMV18Yl","function":{"name":"get_weather","arguments":{"location":"Paris"}},"type":"function"}]}` |
+ | Event body (with content) | `{"index":0,"finish_reason":"tool_calls","message":{"tool_calls":[{"id":"call_VSPygqKTWdrhaFErNvMV18Yl","function":{"name":"get_weather","arguments":"{\"location\":\"Paris\"}"},"type":"function"}]}` |
| Event body (without content) | `{"index":0,"finish_reason":"tool_calls","message":{"tool_calls":[{"id":"call_VSPygqKTWdrhaFErNvMV18Yl","function":{"name":"get_weather"},"type":"function"}]}` |

**GenAI Client span 2:**

@@ -344,7 +344,7 @@ Here's the telemetry generated for each step in this scenario:
| Property | Value |
|----------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------|
| `gen_ai.system` | `"openai"` |
- | Event body (content enabled) | `{"tool_calls":[{"id":"call_VSPygqKTWdrhaFErNvMV18Yl","function":{"name":"get_weather","arguments":{"location":"Paris"}},"type":"function"}]}` |
+ | Event body (content enabled) | `{"tool_calls":[{"id":"call_VSPygqKTWdrhaFErNvMV18Yl","function":{"name":"get_weather","arguments":"{\"location\":\"Paris\"}"},"type":"function"}]}` |
| Event body (content not enabled) | `{"tool_calls":[{"id":"call_VSPygqKTWdrhaFErNvMV18Yl","function":{"name":"get_weather"},"type":"function"}]}` |

3. `gen_ai.tool.message`
diff --git a/model/gen-ai/events.yaml b/model/gen-ai/events.yaml
index c819d5a409..669a2b16bb 100644
--- a/model/gen-ai/events.yaml
+++ b/model/gen-ai/events.yaml
@@ -134,8 +134,14 @@ groups:
            type: undefined
            stability: experimental
            brief: >
-              The arguments of the function.
-            examples: ['{"location": "Paris"}']
+              The arguments of the function as provided in the LLM response.
+            note: >
+              Models usually return arguments as a JSON string. In this case, it's
+              RECOMMENDED to provide arguments as is without attempting to deserialize them.
+
+              Semantic conventions for individual systems MAY specify a different type for
+              the arguments field.
+            examples: ['{\"location\": \"Paris\"}']
            requirement_level: opt_in

  - id: gen_ai.tool.message
@@ -286,5 +292,12 @@
            type: undefined
            stability: experimental
            brief: >
-              The arguments of the function.
+              The arguments of the function as provided in the LLM response.
+            note: >
+              Models usually return arguments as a JSON string. In this case, it's
+              RECOMMENDED to provide arguments as is without attempting to deserialize them.
+
+              Semantic conventions for individual systems MAY specify a different type for
+              the arguments field.
+            examples: ['{\"location\": \"Paris\"}']
            requirement_level: opt_in