
OpenAI: Update response headers #2507

Merged
5 commits merged into openai_instrumentation on Mar 14, 2024
Conversation

kaylareopelle (Contributor) commented on Mar 13, 2024
LlmEmbedding and LlmChatCompletionSummary events generated by the OpenAI instrumentation have attributes pulled from response headers.

The following response headers should be captured if they're available:

  • response.headers.ratelimitLimitTokensUsageBased
  • response.headers.ratelimitResetTokensUsageBased
  • response.headers.ratelimitRemainingTokensUsageBased

We haven't been able to craft a request that returns these headers.

In addition, this PR updates the data types of the response headers to match the spec (internal).

Closes #2461

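As a sketch of what conditional capture could look like (the class, method, and header names below are illustrative assumptions, not the agent's actual implementation), attributes stay nil when a header is absent rather than being recorded with a placeholder value:

```ruby
# Illustrative sketch: capture the "usage based" rate-limit headers
# from a Net::HTTP-style header hash, where each name maps to an
# array of values. Names and structure are assumptions.
class UsageBasedHeadersSketch
  attr_reader :ratelimit_limit_tokens_usage_based,
    :ratelimit_remaining_tokens_usage_based,
    :ratelimit_reset_tokens_usage_based

  def populate(headers)
    # Safe navigation leaves the attribute nil when the header is absent
    @ratelimit_limit_tokens_usage_based =
      headers['x-ratelimit-limit-tokens_usage_based']&.first&.to_i
    @ratelimit_remaining_tokens_usage_based =
      headers['x-ratelimit-remaining-tokens_usage_based']&.first&.to_i
    # reset values are durations (e.g. "2ms"), kept as Strings
    @ratelimit_reset_tokens_usage_based =
      headers['x-ratelimit-reset-tokens_usage_based']&.first
  end
end
```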
Comment on lines +55 to +57
remaining_headers(headers)
reset_headers(headers)
tokens_usage_based_headers(headers)
kaylareopelle (Contributor, author) commented:
The branch count for this method was getting too high, so I grouped headers with similar names into their own methods.
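For illustration only (method bodies and header names are assumptions, not the PR's actual diff), the grouping pattern keeps each helper focused on one family of header names, so no single method accumulates too many branches:

```ruby
# Sketch: helpers grouped by header-name family. Names are
# illustrative assumptions, not the agent's real code.
def remaining_headers(headers)
  @ratelimit_remaining_requests = headers['x-ratelimit-remaining-requests']&.first&.to_i
  @ratelimit_remaining_tokens = headers['x-ratelimit-remaining-tokens']&.first&.to_i
end

def reset_headers(headers)
  # reset values are durations like "2ms", so they stay Strings
  @ratelimit_reset_requests = headers['x-ratelimit-reset-requests']&.first
  @ratelimit_reset_tokens = headers['x-ratelimit-reset-tokens']&.first
end
```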

Some response headers are expected to be Integers; others are expected to be Strings. Now, they match the spec.
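Since the spec is internal, the split below is an assumed example of the distinction: count-like headers coerce to Integer, while duration-like reset headers remain Strings (coercing "1m30s" with #to_i would lose information):

```ruby
# Illustrative only: which header maps to which type is defined by an
# internal spec; these two are assumed examples.
headers = {
  'x-ratelimit-limit-requests' => ['10000'], # a count: Integer attribute
  'x-ratelimit-reset-requests' => ['1m30s']  # a duration: String attribute
}

ratelimit_limit_requests = headers['x-ratelimit-limit-requests']&.first&.to_i
ratelimit_reset_requests = headers['x-ratelimit-reset-requests']&.first
```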
kaylareopelle changed the title from "Add "tokens usage based" response headers" to "Update response headers" on Mar 13, 2024
kaylareopelle marked this pull request as ready for review on March 13, 2024 22:49
kaylareopelle linked an issue on Mar 13, 2024 that may be closed by this pull request
kaylareopelle changed the title from "Update response headers" to "OpenAI: Update response headers" on Mar 13, 2024
lib/new_relic/agent/llm/response_headers.rb (3 outdated review threads, resolved)
This brings the instance variable names into closer alignment
with the constant and the final attribute name.

Also, update rate_limit_requests to ratelimit_limit_requests and rate_limit_tokens to ratelimit_limit_tokens to be closer to the header names.
Some of the headers had fake values that were confusing in the context.
Now, it's easier to tell why some attributes are strings
and others are integers.
#first cannot be called on nil, so it's safe to assume that when a value reaches #to_i, it will not be nil.

If it were nil, we could send false data: nil.to_i returns zero, which would be recorded by the custom event API.
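The hazard the commit describes can be shown directly; the safe-navigation pattern and header name below are illustrative assumptions, not necessarily the diff's exact code:

```ruby
# nil.to_i silently returns 0, so a nil that reached #to_i would be
# recorded as a real zero on the custom event.
raise 'unexpected' unless nil.to_i == 0

# With safe navigation, a missing header short-circuits to nil before
# #first or #to_i ever run, so no false zero is produced.
headers = {}
remaining = headers['x-ratelimit-remaining-tokens']&.first&.to_i
# remaining is nil here, not 0
```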
SimpleCov Report

           Coverage   Threshold
  Line     93.72%     93%
  Branch   71.19%     71%

kaylareopelle merged commit 8af2b3e into openai_instrumentation on Mar 14, 2024
28 checks passed
kaylareopelle deleted the response_headers branch on March 14, 2024 22:44
kaylareopelle restored the response_headers branch on March 15, 2024 00:04
kaylareopelle deleted the response_headers branch on October 9, 2024 19:01
Successfully merging this pull request may close these issues.

OpenAI: Verify response headers
4 participants