chore(llmobs): dac strip io from anthropic #13767
base: main
Conversation
Bootstrap import analysis: comparison of import times between this PR and base.

Summary: the average import time from this PR is 277 ± 2 ms; the average import time from base is 278 ± 2 ms. The import time difference between this PR and base is -1.53 ± 0.1 ms.

Import time breakdown: the following import paths have shrunk.
Benchmarks: benchmark execution time 2025-06-27 17:35:48, comparing candidate commit a6f1f47 in the PR branch. Found 0 performance improvements and 1 performance regression. Performance is the same for 568 metrics; 3 unstable metrics. Scenario: iast aspects-lstrip_aspect.
Just a few comments, but glad the integration code was trimmed down so much! Looks great!
@@ -155,7 +154,7 @@ def _process_finished_stream(integration, span, args, kwargs, streamed_chunks):
     try:
         resp_message = _construct_message(streamed_chunks)
         if integration.is_pc_sampled_span(span):
-            _tag_streamed_chat_completion_response(integration, span, resp_message)
+            _tag_streamed_chat_completion_usage(integration, span, resp_message)
Should this depend on integration.is_pc_sampled_span?
@@ -10,30 +10,19 @@
     "error": 0,
     "meta": {
       "_dd.p.dm": "-0",
-      "_dd.p.tid": "665f5f5200000000",
+      "_dd.p.tid": "685c01fa00000000",
       "anthropic.request.api_key": "sk-...key>",
Do we still need API key on the APM span?
        metrics[OUTPUT_TOKENS_METRIC_KEY] = output_tokens
    total_tokens = total_tokens + output_tokens if total_tokens else output_tokens
    if total_tokens is not None:
        metrics[TOTAL_TOKENS_METRIC_KEY] = total_tokens
    span._set_ctx_item(METRICS, metrics)
I think we should stay consistent with the old way of calculating total tokens (i.e., only set total tokens if both input and output tokens are set). With the new implementation, total_tokens could end up being just input_tokens or just output_tokens.
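A minimal sketch of the behavior the comment above suggests (the function name and metric keys are hypothetical, not the actual dd-trace-py code): only emit a total when both input and output counts are present, rather than falling through to whichever one happens to be set.

```python
# Hypothetical illustration of the suggested total-token rule; names
# are assumptions, not the real integration code.
def token_metrics(input_tokens, output_tokens):
    metrics = {}
    if input_tokens is not None:
        metrics["input_tokens"] = input_tokens
    if output_tokens is not None:
        metrics["output_tokens"] = output_tokens
    # Total is only meaningful when both sides were reported.
    if input_tokens is not None and output_tokens is not None:
        metrics["total_tokens"] = input_tokens + output_tokens
    return metrics
```

Under this rule, a response that reports only output tokens yields no total, instead of a "total" that silently equals the output count.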
Remove potentially sensitive I/O data from APM spans. This way, prompt and completion data will only appear on the LLM Obs spans, which are (or will be) subject to data access controls.
Mostly, this just removes I/O tag sets. A few things (mostly metrics) have LLMObs tags that depend on span tags, so there is a bit more refactoring there.
Let me know if I removed anything that should really stay, or if I missed something that should be restricted.
Checklist
Reviewer Checklist