summary: AWS Bedrock instrumentation #2314
Conversation
* First pass at Bedrock integration tests
* Response payloads do not have a model ID
* Fixed issue with errors getting dropped; fixed Titan Embedded response model
* Adding custom attributes test
* Test fixes and cleanup
* Added AwsBedrockConfiguration
* Fix issue with embedding event
* Fixing intermittent test failures
* Add new integration tests to workflows
* Reduce test timeout
Files with review threads:
* src/Agent/NewRelic/Agent/Extensions/NewRelic.Agent.Extensions/Api/IAgentApi.cs
* src/Agent/NewRelic/Agent/Extensions/NewRelic.Agent.Extensions/Llm/EventHelper.cs
* src/Agent/NewRelic/Agent/Extensions/Providers/Wrapper/Bedrock/Bedrock.csproj
src/Agent/NewRelic/Agent/Extensions/Providers/Wrapper/Bedrock/InvokeModelAsyncWrapper.cs
Just had this one question which I think Chris V. already asked as well.
* Move Json deserialization from wrapper to core
* Also needed to move the models to core as well
* Add and tweak comments/readme
Codecov Report
Attention: Patch coverage is
Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##             main    #2314      +/-   ##
==========================================
- Coverage   80.53%   80.47%    -0.06%
==========================================
  Files         441      450        +9
  Lines       27631    27942      +311
  Branches     2989     3023       +34
==========================================
+ Hits        22252    22487      +235
- Misses       4611     4682       +71
- Partials      768      773        +5
```

Flags with carried forward coverage won't be shown.
feat: Add auto-instrumentation for AWS Bedrock
feat: New configuration options are available specific to AI monitoring.
feat: A new AI-monitoring public API method has been added: SetLlmTokenCountingCallback.
notice: New Relic AI monitoring is the industry’s first APM solution providing end-to-end visibility into the key components of AI Large Language Model (LLM) applications. With AI monitoring, users can monitor, alert, and debug AI-powered applications for reliability, latency, performance, security, and cost. AI monitoring also surfaces AI/LLM-specific insights (metrics, events, logs, and traces) that integrate easily to build advanced guardrails for enterprise security, privacy, and compliance.
notice: AI monitoring offers custom-built insights and tracing for the complete lifecycle of an LLM’s prompts and responses, from raw user input to repaired/polished responses. AI monitoring provides built-in integrations with popular LLMs and components of the AI development stack. This release provides instrumentation for AWS Bedrock.
notice: When AI monitoring is enabled, the agent will now capture AI LLM related data. This data will be visible under a new APM tab called AI Responses. See our AI Monitoring documentation for more details.
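As a sketch of how the new AI monitoring configuration might be enabled in newrelic.config (the `aiMonitoring` element and `enabled` attribute are assumptions based on the agent's XML configuration style; verify against the released configuration schema):

```xml
<!-- newrelic.config (fragment) -->
<!-- Assumption: AI monitoring is toggled via an aiMonitoring element. -->
<configuration xmlns="urn:newrelic-config">
  <aiMonitoring enabled="true" />
</configuration>
```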
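A minimal sketch of registering the new token-counting callback. The delegate shape shown here (model name and content in, token count out) is an assumption inferred from the API name, not confirmed by this PR, and the whitespace split is a crude stand-in for a real tokenizer:

```csharp
using System;

public static class LlmTelemetrySetup
{
    public static void Register()
    {
        // Assumed signature: the callback receives the model name and the
        // prompt/completion text, and returns the token count the agent
        // should report for that piece of LLM content.
        NewRelic.Api.Agent.NewRelic.SetLlmTokenCountingCallback(
            (modelName, content) =>
                string.IsNullOrEmpty(content)
                    ? 0
                    : content.Split(' ', StringSplitOptions.RemoveEmptyEntries).Length);
    }
}
```

In practice the callback would delegate to the tokenizer matching the model in use, since whitespace counts diverge significantly from real token counts.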