OpenAI: instrument embeddings and chat completions #2398

Merged: 77 commits, Feb 14, 2024

Commits (77)
0c348d0
OpenAI instrumentation scaffold
kaylareopelle Jan 18, 2024
0d9e3fd
Rubocop
kaylareopelle Jan 18, 2024
9fb700c
Instrument json_post method
kaylareopelle Jan 23, 2024
11bf4d4
Draft json_post instrumentation to get response header attrs
kaylareopelle Jan 23, 2024
3a9b12a
Cleanup
kaylareopelle Jan 24, 2024
c372d90
OpenAI instrumentation scaffold
kaylareopelle Jan 18, 2024
543e92c
Rubocop
kaylareopelle Jan 18, 2024
90e5d97
Instrument json_post method
kaylareopelle Jan 23, 2024
cc5b2cc
Draft json_post instrumentation to get response header attrs
kaylareopelle Jan 23, 2024
8ec46d9
Cleanup
kaylareopelle Jan 24, 2024
9be9588
Sketch out some attribute assignments
kaylareopelle Jan 30, 2024
53864e5
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Jan 30, 2024
6854a8b
Add chat completion event params
hannahramadan Jan 30, 2024
adf18cc
Document embeddings attributes
hannahramadan Jan 30, 2024
2c352b8
Add embedding event details
hannahramadan Jan 31, 2024
76cb29d
Add attributes for messages
kaylareopelle Jan 31, 2024
42adee3
Test hash to assign correct string values to custom event attributes
kaylareopelle Jan 31, 2024
477053f
Update attribute names to period-delimited strings
kaylareopelle Jan 31, 2024
cadb30e
Default LlmEvent ID value to guid, allow passed-in arg to override
kaylareopelle Jan 31, 2024
118dbda
Separate instrumentation methods
hannahramadan Jan 31, 2024
da7692d
Add conversation_id from transaction custom attributes
kaylareopelle Jan 31, 2024
3146677
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Jan 31, 2024
c179385
Move post-response operations out of begin block
hannahramadan Jan 31, 2024
e1a70c9
Minor instrumentation refactors and comments
kaylareopelle Feb 1, 2024
4cbcc61
Add multiverse tests for openai requests
kaylareopelle Feb 1, 2024
bf0eecf
Add more tests for chat completions
kaylareopelle Feb 2, 2024
00e0453
Raise in Net::HTTP for the unwritten tests related to OpenAI
kaylareopelle Feb 2, 2024
b6bd502
Rubocop
kaylareopelle Feb 2, 2024
b84945c
Create AI monitoring suite
kaylareopelle Feb 2, 2024
b57e021
Update spelling for OpenAI constant
kaylareopelle Feb 2, 2024
d0386d6
Merge branch 'dev' into openai
kaylareopelle Feb 2, 2024
be65cc5
Create set_llm_agent_attribute_on_error_transaction
kaylareopelle Feb 2, 2024
6377fcf
Add llm agent attribute to transactions with OpenAI segments
kaylareopelle Feb 2, 2024
8ffcd47
Misc cleanup
kaylareopelle Feb 5, 2024
004dcd3
Move response organization and request id to ResponseHeaders
kaylareopelle Feb 5, 2024
36d9e83
Be prepared for String or Symbol
hannahramadan Feb 5, 2024
9a75463
Rubocop cleanup
hannahramadan Feb 5, 2024
2be5e44
Pluralize embedding-relate methods
hannahramadan Feb 5, 2024
d083888
Pass name keyword arg to embedding start segment
kaylareopelle Feb 5, 2024
208b971
Add embedding tests
hannahramadan Feb 6, 2024
6136eab
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Feb 6, 2024
b2b166d
Rubocop
hannahramadan Feb 6, 2024
d40461a
WIP commit refactor for multiple version support
kaylareopelle Feb 7, 2024
d627f15
WIP update tests for 3.4.0 error handling
kaylareopelle Feb 7, 2024
e23c734
WIP Fix tests for errors in 3.4.0
kaylareopelle Feb 7, 2024
5ffbf9e
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Feb 7, 2024
4c60bae
Support 3.4.0+
kaylareopelle Feb 7, 2024
98c2b29
Test fix for Ruby 2.4
hannahramadan Feb 7, 2024
2451bdf
Update comments in Envfile
kaylareopelle Feb 7, 2024
8a10359
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Feb 7, 2024
cadfc35
llm_event attribute on txn test
hannahramadan Feb 7, 2024
5fed88f
use symbols v strings for testing
hannahramadan Feb 7, 2024
8cb0072
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Feb 8, 2024
fd1401c
Add "Supportability" to the OpenAI metric
kaylareopelle Feb 8, 2024
a8687b7
Addtl supportability metric updates
kaylareopelle Feb 8, 2024
3dac7e0
Add setup for tests, llm_event test refactor
hannahramadan Feb 8, 2024
585327d
Update OpenAI segment names
kaylareopelle Feb 10, 2024
f15925d
Refactor supportability metric
hannahramadan Feb 12, 2024
8ac6b0f
Add missing params test & code change
hannahramadan Feb 12, 2024
c2c9959
appease rubocop
hannahramadan Feb 12, 2024
b486d23
Remove unused method
hannahramadan Feb 12, 2024
9985865
Merge branch 'openai' of github.com:newrelic/newrelic-ruby-agent into…
kaylareopelle Feb 12, 2024
7b1484d
Update Net::HTTP AI methods to be OpenAI-specific
kaylareopelle Feb 12, 2024
f79a2f6
Add chat completions segment name constant back
kaylareopelle Feb 12, 2024
55dd343
Add openai to jruby CI
kaylareopelle Feb 12, 2024
7273d1f
Code review
kaylareopelle Feb 13, 2024
c83a892
Only send llm attributes to DST_TRANSACTION_EVENTS
hannahramadan Feb 13, 2024
36d6cab
Conditonally merge depending on Ruby version
hannahramadan Feb 13, 2024
4e26f74
Remove tests that look for a txn in error collector
hannahramadan Feb 13, 2024
85f7d51
Don't run flaky test
hannahramadan Feb 13, 2024
22968ea
Remove unit test assert for txn found in error collector
hannahramadan Feb 13, 2024
90b7ce9
Update capitalization for OpenAI in segment names
kaylareopelle Feb 13, 2024
0359ae1
Remove placeholder Net::HTTP tests for response headers
kaylareopelle Feb 13, 2024
d69d404
Rubocop
kaylareopelle Feb 13, 2024
cdd42f6
is_response: true - code feedback
hannahramadan Feb 13, 2024
a8092e0
Update lib/new_relic/agent/instrumentation/ruby_openai/instrumentatio…
kaylareopelle Feb 14, 2024
d43b1a6
Freeze openai regex
hannahramadan Feb 14, 2024
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -189,7 +189,7 @@ jobs:
strategy:
fail-fast: false
matrix:
multiverse: [agent, background, background_2, database, frameworks, httpclients, httpclients_2, rails, rest]
multiverse: [agent, ai, background, background_2, database, frameworks, httpclients, httpclients_2, rails, rest]
ruby-version: [2.4.10, 3.3.0]

steps:
2 changes: 1 addition & 1 deletion .github/workflows/ci_cron.yml
@@ -199,7 +199,7 @@ jobs:
strategy:
fail-fast: false
matrix:
multiverse: [agent, background, background_2, database, frameworks, httpclients, httpclients_2, rails, rest]
multiverse: [agent, ai, background, background_2, database, frameworks, httpclients, httpclients_2, rails, rest]
ruby-version: [2.4.10, 2.5.9, 2.6.10, 2.7.8, 3.0.6, 3.1.4, 3.2.2, 3.3.0]
steps:
- name: Configure git
2 changes: 1 addition & 1 deletion .github/workflows/ci_jruby.yml
@@ -39,7 +39,7 @@ jobs:
strategy:
fail-fast: false
matrix:
suite: [active_support_broadcast_logger, active_support_logger, activemerchant, agent_only, async_http, bare, deferred_instrumentation, grape, high_security, httpclient, httprb, httpx, json, logger, marshalling, rack, resque, roda, roda_agent_disabled, sequel, sinatra, sinatra_agent_disabled, stripe, thread, tilt, typhoeus]
suite: [active_support_broadcast_logger, active_support_logger, activemerchant, agent_only, async_http, bare, deferred_instrumentation, grape, high_security, httpclient, httprb, httpx, json, logger, marshalling, rack, resque, roda, roda_agent_disabled, ruby_openai, sequel, sinatra, sinatra_agent_disabled, stripe, thread, tilt, typhoeus]

steps:
- name: Configure git
9 changes: 9 additions & 0 deletions lib/new_relic/agent/configuration/default_source.rb
@@ -1570,6 +1570,15 @@ def self.enforce_fallback(allowed_values: nil, fallback: nil)
:allowed_from_server => false,
:description => 'Controls auto-instrumentation of `Net::HTTP` at start-up. May be one of: `auto`, `prepend`, `chain`, `disabled`.'
},
:'instrumentation.ruby_openai' => {
:default => 'auto',
:documentation_default => 'auto',
:public => true,
:type => String,
:dynamic_name => true,
:allowed_from_server => false,
:description => 'Controls auto-instrumentation of the ruby-openai gem at start-up. May be one of: `auto`, `prepend`, `chain`, `disabled`.'
},
:'instrumentation.puma_rack' => {
:default => value_of(:'instrumentation.rack'),
:documentation_default => 'auto',
17 changes: 17 additions & 0 deletions lib/new_relic/agent/instrumentation/net_http/instrumentation.rb
@@ -7,6 +7,7 @@ module Agent
module Instrumentation
module NetHTTP
INSTRUMENTATION_NAME = NewRelic::Agent.base_name(name)
OPENAI_SEGMENT_PATTERN = %r{Llm/.*/OpenAI/.*}.freeze

def request_with_tracing(request)
NewRelic::Agent.record_instrumentation_invocation(INSTRUMENTATION_NAME)
@@ -32,12 +33,28 @@ def request_with_tracing(request)
end

wrapped_response = NewRelic::Agent::HTTPClients::NetHTTPResponse.new(response)

if openai_parent?(segment)
populate_openai_response_headers(wrapped_response, segment.parent)
end

segment.process_response_headers(wrapped_response)

response
ensure
segment&.finish
end
end

def openai_parent?(segment)
segment&.parent&.name&.match?(OPENAI_SEGMENT_PATTERN)
end

def populate_openai_response_headers(response, parent)
return unless parent.instance_variable_defined?(:@llm_event)

parent.llm_event.populate_openai_response_headers(response.to_hash)
end
end
end
end
37 changes: 37 additions & 0 deletions lib/new_relic/agent/instrumentation/ruby_openai.rb
@@ -0,0 +1,37 @@
# This file is distributed under New Relic's license terms.
# See https://github.com/newrelic/newrelic-ruby-agent/blob/main/LICENSE for complete details.
# frozen_string_literal: true

require_relative 'ruby_openai/instrumentation'
require_relative 'ruby_openai/chain'
require_relative 'ruby_openai/prepend'

DependencyDetection.defer do
named :'ruby_openai'

OPENAI_VERSION = Gem::Version.new(OpenAI::VERSION) if defined?(OpenAI)

depends_on do
# add a config check for ai_monitoring.enabled
# maybe add DT check here eventually?
defined?(OpenAI) && defined?(OpenAI::Client) &&
OPENAI_VERSION >= Gem::Version.new('3.4.0')
end

executes do
if use_prepend?
if OPENAI_VERSION >= Gem::Version.new('5.0.0')
Contributor comment: Cool to see the support for older versions.

prepend_instrument OpenAI::Client,
NewRelic::Agent::Instrumentation::OpenAI::Prepend,
NewRelic::Agent::Instrumentation::OpenAI::VENDOR
else
prepend_instrument OpenAI::Client.singleton_class,
NewRelic::Agent::Instrumentation::OpenAI::Prepend,
NewRelic::Agent::Instrumentation::OpenAI::VENDOR
end
else
chain_instrument NewRelic::Agent::Instrumentation::OpenAI::Chain,
NewRelic::Agent::Instrumentation::OpenAI::VENDOR
end
end
end
36 changes: 36 additions & 0 deletions lib/new_relic/agent/instrumentation/ruby_openai/chain.rb
@@ -0,0 +1,36 @@
# This file is distributed under New Relic's license terms.
# See https://github.com/newrelic/newrelic-ruby-agent/blob/main/LICENSE for complete details.
# frozen_string_literal: true

module NewRelic::Agent::Instrumentation
module OpenAI::Chain
def self.instrument!
::OpenAI::Client.class_eval do
include NewRelic::Agent::Instrumentation::OpenAI

alias_method(:json_post_without_new_relic, :json_post)

# In versions 4.0.0+ json_post is an instance method
# defined in the OpenAI::HTTP module, included by the
# OpenAI::Client class
def json_post(**kwargs)
json_post_with_new_relic(**kwargs) do
json_post_without_new_relic(**kwargs)
end
end

# In versions below 4.0.0 json_post is a class method
# on OpenAI::Client
class << self
alias_method(:json_post_without_new_relic, :json_post)

def json_post(**kwargs)
json_post_with_new_relic(**kwargs) do
json_post_without_new_relic(**kwargs)
end
end
end
end
end
end
end
181 changes: 181 additions & 0 deletions lib/new_relic/agent/instrumentation/ruby_openai/instrumentation.rb
@@ -0,0 +1,181 @@
# This file is distributed under New Relic's license terms.
# See https://github.com/newrelic/newrelic-ruby-agent/blob/main/LICENSE for complete details.
# frozen_string_literal: true

module NewRelic::Agent::Instrumentation
module OpenAI
VENDOR = 'openAI' # AIM expects this capitalization style for the UI
INSTRUMENTATION_NAME = NewRelic::Agent.base_name(name)
EMBEDDINGS_PATH = '/embeddings'
CHAT_COMPLETIONS_PATH = '/chat/completions'
EMBEDDINGS_SEGMENT_NAME = 'Llm/embedding/OpenAI/embeddings'
CHAT_COMPLETIONS_SEGMENT_NAME = 'Llm/completion/OpenAI/chat'

def json_post_with_new_relic(path:, parameters:)
return yield unless path == EMBEDDINGS_PATH || path == CHAT_COMPLETIONS_PATH

NewRelic::Agent.record_instrumentation_invocation(INSTRUMENTATION_NAME)
NewRelic::Agent::Llm::LlmEvent.set_llm_agent_attribute_on_transaction

if path == EMBEDDINGS_PATH
embeddings_instrumentation(parameters) { yield }
elsif path == CHAT_COMPLETIONS_PATH
chat_completions_instrumentation(parameters) { yield }
end
end

private

def embeddings_instrumentation(parameters)
segment = NewRelic::Agent::Tracer.start_segment(name: EMBEDDINGS_SEGMENT_NAME)
record_openai_metric
event = create_embeddings_event(parameters)
segment.llm_event = event
begin
response = NewRelic::Agent::Tracer.capture_segment_error(segment) { yield }
# TODO: Remove !response.include?('error') when we drop support for versions below 4.0.0
add_embeddings_response_params(response, event) if response && !response.include?('error')

response
ensure
finish(segment, event)
end
end

def chat_completions_instrumentation(parameters)
segment = NewRelic::Agent::Tracer.start_segment(name: CHAT_COMPLETIONS_SEGMENT_NAME)
record_openai_metric
event = create_chat_completion_summary(parameters)
segment.llm_event = event
messages = create_chat_completion_messages(parameters, event.id)

begin
response = NewRelic::Agent::Tracer.capture_segment_error(segment) { yield }
# TODO: Remove !response.include?('error') when we drop support for versions below 4.0.0
if response && !response.include?('error')
add_chat_completion_response_params(parameters, response, event)
messages = update_chat_completion_messages(messages, response, event)
end

response
ensure
finish(segment, event)
messages&.each { |m| m.record }
end
end

def create_chat_completion_summary(parameters)
NewRelic::Agent::Llm::ChatCompletionSummary.new(
# TODO: POST-GA: Add metadata from add_custom_attributes if prefixed with 'llm.', except conversation_id
vendor: VENDOR,
conversation_id: conversation_id,
api_key_last_four_digits: parse_api_key,
request_max_tokens: parameters[:max_tokens] || parameters['max_tokens'],
Contributor comment: What leads the hash to have symbols or strings for keys, and are there ever both types of key present in the same hash? It might be handy to have a helper method like this:

request_max_tokens: parameter_value(parameters, :max_tokens)

...

def parameter_value(parameters, value)
  parameters[value] || parameters[value.to_s]
end

and if we can rely on the hash keys being all symbols or all strings, we could further enhance the helper to memoize which key type is involved.

Contributor reply: Both Strings and Symbols are accepted as keys, so it's up to the user which they use. And yes, both types can be mixed and matched in the same request.

Side note: We did performance testing of || vs kind_of? vs is_a? and found || to be the most performant by about 1.2x, which is why we went with an "or" check rather than creating a helper method that would decide which to use.
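As a rough illustration of that comparison, here is a minimal Benchmark sketch (illustrative only; the hash contents, iteration count, and labels are assumptions, and this is not the harness used for the testing mentioned above) contrasting the inline || lookup with the proposed helper:

require 'benchmark'

# Mixed String/Symbol keys, as described above (example values are assumptions)
parameters = { :model => 'gpt-4', 'max_tokens' => 100 }

def parameter_value(parameters, key)
  parameters[key] || parameters[key.to_s]
end

n = 1_000_000
Benchmark.bm(12) do |x|
  x.report('inline ||') { n.times { parameters[:max_tokens] || parameters['max_tokens'] } }
  x.report('helper')    { n.times { parameter_value(parameters, :max_tokens) } }
end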

request_model: parameters[:model] || parameters['model'],
temperature: parameters[:temperature] || parameters['temperature']
)
end

def create_embeddings_event(parameters)
NewRelic::Agent::Llm::Embedding.new(
# TODO: POST-GA: Add metadata from add_custom_attributes if prefixed with 'llm.', except conversation_id
vendor: VENDOR,
input: parameters[:input] || parameters['input'],
api_key_last_four_digits: parse_api_key,
request_model: parameters[:model] || parameters['model']
)
end

def add_chat_completion_response_params(parameters, response, event)
event.response_number_of_messages = (parameters[:messages] || parameters['messages']).size + response['choices'].size
# The response hash always returns keys as strings, so we don't need to run an || check here
event.response_model = response['model']
event.response_usage_total_tokens = response['usage']['total_tokens']
event.response_usage_prompt_tokens = response['usage']['prompt_tokens']
event.response_usage_completion_tokens = response['usage']['completion_tokens']
event.response_choices_finish_reason = response['choices'][0]['finish_reason']
end

def add_embeddings_response_params(response, event)
event.response_model = response['model']
event.response_usage_total_tokens = response['usage']['total_tokens']
event.response_usage_prompt_tokens = response['usage']['prompt_tokens']
end

def parse_api_key
'sk-' + headers['Authorization'][-4..-1]
Contributor comment: Hmm... it looks like the ability to use [-4..] instead of [-4..-1] wasn't introduced until Ruby v2.6.

end

# The customer must call add_custom_attributes with llm.conversation_id
# before the transaction starts. Otherwise, the conversation_id will be nil.
def conversation_id
return @nr_conversation_id if @nr_conversation_id

@nr_conversation_id ||= NewRelic::Agent::Tracer.current_transaction.attributes.custom_attributes[NewRelic::Agent::Llm::LlmEvent::CUSTOM_ATTRIBUTE_CONVERSATION_ID]
end

def create_chat_completion_messages(parameters, summary_id)
(parameters[:messages] || parameters['messages']).map.with_index do |message, index|
NewRelic::Agent::Llm::ChatCompletionMessage.new(
content: message[:content] || message['content'],
role: message[:role] || message['role'],
sequence: index,
completion_id: summary_id,
vendor: VENDOR,
is_response: true
)
end
end

def create_chat_completion_response_messages(response, sequence_origin, summary_id)
response['choices'].map.with_index(sequence_origin) do |choice, index|
NewRelic::Agent::Llm::ChatCompletionMessage.new(
content: choice['message']['content'],
role: choice['message']['role'],
sequence: index,
completion_id: summary_id,
vendor: VENDOR,
is_response: true
)
end
end

def update_chat_completion_messages(messages, response, summary)
messages += create_chat_completion_response_messages(response, messages.size, summary.id)
response_id = response['id'] || NewRelic::Agent::GuidGenerator.generate_guid

messages.each do |message|
# TODO: POST-GA: Add metadata from add_custom_attributes if prefixed with 'llm.', except conversation_id
message.id = "#{response_id}-#{message.sequence}"
message.conversation_id = conversation_id
message.request_id = summary.request_id
message.response_model = response['model']
end
end

def record_openai_metric
NewRelic::Agent.record_metric(nr_supportability_metric, 0.0)
end

def segment_noticed_error?(segment)
segment&.instance_variable_get(:@noticed_error)
end

def nr_supportability_metric
@nr_supportability_metric ||= "Supportability/Ruby/ML/OpenAI/#{::OpenAI::VERSION}"
end

def finish(segment, event)
segment&.finish

return unless event

if segment
event.error = true if segment_noticed_error?(segment)
event.duration = segment.duration
end

event.record
end
end
end
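For context on the conversation_id helper above, a minimal usage sketch (the client setup, attribute value, and chat parameters are assumptions; only the add_custom_attributes call with llm.conversation_id comes from the code above) showing how an application could supply the conversation id the instrumentation reads:

require 'newrelic_rpm'
require 'openai'

client = OpenAI::Client.new(access_token: ENV['OPENAI_ACCESS_TOKEN'])

# Set the custom attribute the conversation_id helper looks up
# (NewRelic::Agent::Llm::LlmEvent::CUSTOM_ATTRIBUTE_CONVERSATION_ID)
# before the instrumented request is made.
NewRelic::Agent.add_custom_attributes('llm.conversation_id' => 'conversation-1234')

# ruby-openai routes this call through json_post, so it is wrapped by
# chat_completions_instrumentation and chat completion summary and
# message events are recorded.
client.chat(
  parameters: {
    model: 'gpt-3.5-turbo',
    messages: [{ 'role' => 'user', 'content' => 'Hello!' }]
  }
)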
20 changes: 20 additions & 0 deletions lib/new_relic/agent/instrumentation/ruby_openai/prepend.rb
@@ -0,0 +1,20 @@
# This file is distributed under New Relic's license terms.
# See https://github.com/newrelic/newrelic-ruby-agent/blob/main/LICENSE for complete details.
# frozen_string_literal: true

module NewRelic::Agent::Instrumentation
module OpenAI::Prepend
include NewRelic::Agent::Instrumentation::OpenAI

# In versions 4.0.0+ json_post is an instance method defined in the
# OpenAI::HTTP module, included by the OpenAI::Client class.
#
# In versions below 4.0.0 json_post is a class method on OpenAI::Client.
#
# Dependency detection will apply the instrumentation to the correct scope,
# so we don't need to change the code here.
def json_post(**kwargs)
json_post_with_new_relic(**kwargs) { super }
end
end
end
25 changes: 22 additions & 3 deletions lib/new_relic/agent/llm/chat_completion_summary.rb
@@ -12,10 +12,18 @@ class ChatCompletionSummary < LlmEvent
include ResponseHeaders

ATTRIBUTES = %i[api_key_last_four_digits request_max_tokens
response_number_of_messages request_model response_organization
response_usage_total_tokens response_usage_prompt_tokens
response_usage_completion_tokens response_choices_finish_reason
response_number_of_messages request_model response_usage_total_tokens response_usage_prompt_tokens response_usage_completion_tokens response_choices_finish_reason
request_temperature duration error]
ATTRIBUTE_NAME_EXCEPTIONS = {
response_number_of_messages: 'response.number_of_messages',
request_model: 'request.model',
response_usage_total_tokens: 'response.usage.total_tokens',
response_usage_prompt_tokens: 'response.usage.prompt_tokens',
response_usage_completion_tokens: 'response.usage.completion_tokens',
response_choices_finish_reason: 'response.choices.finish_reason',
temperature: 'request.temperature'
}

EVENT_NAME = 'LlmChatCompletionSummary'

attr_accessor(*ATTRIBUTES)
@@ -24,6 +32,17 @@ def attributes
LlmEvent::ATTRIBUTES + ChatCompletion::ATTRIBUTES + ResponseHeaders::ATTRIBUTES + ATTRIBUTES
end

def attribute_name_exceptions
# TODO: OLD RUBIES < 2.6
# Hash#merge accepts multiple arguments in 2.6
# Remove condition once support for Ruby <2.6 is dropped
if RUBY_VERSION >= '2.6.0'
LlmEvent::ATTRIBUTE_NAME_EXCEPTIONS.merge(ResponseHeaders::ATTRIBUTE_NAME_EXCEPTIONS, ATTRIBUTE_NAME_EXCEPTIONS)
else
LlmEvent::ATTRIBUTE_NAME_EXCEPTIONS.merge(ResponseHeaders::ATTRIBUTE_NAME_EXCEPTIONS).merge(ATTRIBUTE_NAME_EXCEPTIONS)
end
end

def event_name
EVENT_NAME
end