
feat: implement LLM monitoring with langchainrb integration #2411

Open · wants to merge 3 commits into master
Conversation

monotykamary
@monotykamary monotykamary commented Sep 23, 2024

Description

This PR introduces a crude implementation of LLM Monitoring with LangChainrb integration to the Sentry Ruby SDK. The changes include:

  1. Addition of a new monitoring.rb file in the sentry-ruby/lib/sentry/ai/ directory, which implements AI monitoring functionality.
  2. Creation of a langchain.rb file in both sentry-ruby/lib/sentry/ and sentry-ruby/lib/sentry/ai/ directories, providing LangChain integration for the Sentry Ruby SDK.
  3. Potential updates to span.rb and transaction.rb to support these new features.

These changes enhance Sentry's capabilities in monitoring and integrating with AI-related technologies, particularly focusing on LangChain integration.
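As a rough illustration of what an `ai_track`-style helper in `monitoring.rb` could do, here is a self-contained sketch that wraps a block in a named, timed pseudo-span. All names here (the `AI::Monitoring` module, the `Span` struct, the `spans` recorder) are illustrative stand-ins, not the PR's actual API:

```ruby
# Illustrative sketch of an ai_track-style helper: runs a block inside a
# named pseudo-span and records its duration. The in-memory recorder
# below is a stand-in for Sentry's real span machinery.
module AI
  module Monitoring
    Span = Struct.new(:description, :op, :duration)

    class << self
      # In-memory list of recorded spans (stand-in for a span recorder).
      def spans
        @spans ||= []
      end

      # Wraps the block in a pseudo-span labelled with `description`;
      # the block's return value passes through unchanged.
      def ai_track(description, op: "ai.run")
        start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
        result = yield
        duration = Process.clock_gettime(Process::CLOCK_MONOTONIC) - start
        spans << Span.new(description, op, duration)
        result
      end
    end
  end
end

# Usage: track an AI operation and keep its result.
answer = AI::Monitoring.ai_track("Testing") { 21 * 2 }
```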

Current problems

  • Currently the data doesn't show up on the LLM Monitoring page, although most, if not all, of the span data is captured in the implementation.
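One likely reason spans don't appear on the LLM Monitoring page is that the page only aggregates spans following Sentry's AI span conventions (a pipeline root span with AI-specific ops and data keys on the children). The sketch below shows the rough shape of data involved; the op strings and key names are borrowed from the Python SDK's AI conventions and are assumptions here, not verified against this PR:

```ruby
# Hypothetical shape of the span data Sentry's LLM Monitoring page looks
# for. Op strings and data keys follow the Python SDK's AI conventions
# and are assumptions, not confirmed by this PR.
llm_span = {
  op: "ai.chat_completions.create.langchain",
  description: "Langchain::LLM::OpenAI chat",
  data: {
    "ai.model_id" => "gpt-4o-mini",                                   # assumed key name
    "ai.input_messages" => [{ role: "user", content: "testing input" }],
    "ai.total_tokens.used" => 42                                      # assumed key name
  }
}

# The LLM span would nest under a pipeline-level root span.
pipeline_span = {
  op: "ai.pipeline",
  description: "AI Query Execution",
  children: [llm_span]
}
```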

Related Issues/PRs

Refactoring

  • No major refactoring was performed in this PR. All changes are related to new feature additions.

Changelog Entry

Added

  • Introduced AI monitoring capabilities (sentry-ruby/lib/sentry/ai/monitoring.rb)
  • Added LangChain integration (sentry-ruby/lib/sentry/langchain.rb and sentry-ruby/lib/sentry/ai/langchain.rb)
  • Enhanced span and transaction handling to support AI monitoring
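A LangChain integration along these lines would presumably patch the LLM client so every chat call is wrapped in monitoring code. One idiomatic Ruby strategy for that is `Module#prepend`; a self-contained sketch follows, where `FakeLLM` and `ChatInstrumentation` are stand-ins for `Langchain::LLM::OpenAI` and the PR's actual patching code:

```ruby
# Stand-in for Langchain::LLM::OpenAI: echoes the last user message.
class FakeLLM
  def chat(messages:)
    "echo: #{messages.last[:content]}"
  end
end

# Instrumentation module: prepended so its #chat runs before the
# original, recording a pseudo-span for each call, then delegating.
module ChatInstrumentation
  CALLS = []

  def chat(messages:)
    CALLS << { op: "ai.chat_completions.create.langchain", messages: messages }
    super
  end
end

FakeLLM.prepend(ChatInstrumentation)

# Usage: the call is transparently recorded and still returns its result.
reply = FakeLLM.new.chat(messages: [{ role: "user", content: "hi" }])
```

Prepending (rather than monkey-patching the method directly) keeps the original implementation reachable via `super` and lets the patch be applied once per class.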

Basic Testing:

require 'sentry-ruby'
require 'langchain'
require 'sentry/langchain'

puts "Initializing Sentry..."
Sentry.init do |config|
  config.dsn = ENV['SENTRY_DSN']
  config.traces_sample_rate = 1.0
  config.debug = true # Enable debug mode for more verbose logging
end

Sentry.with_scope do
  Sentry.set_tags(ai_operation: "Testing")

  transaction = Sentry.start_transaction(
    op: "ai.query",
    name: "AI Query Execution"
  )

  # Attach the transaction to the current scope so child spans nest under it.
  Sentry.configure_scope do |scope|
    scope.set_span(transaction)
  end

  begin
    Sentry::AI::Monitoring.ai_track("Testing")
    llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
    result = llm.chat(messages: [{role: "user", content: "testing input"}]).completion
    puts(result)
  rescue => e
    Sentry.capture_exception(e)
    raise e
  ensure
    transaction.finish
  end
end

@monotykamary
Author

@sl0thentr0py I'm not too familiar with Ruby or with how to get this completely up and running with LLM Monitoring, but I think I have a good start. Can you have a look?
[screenshot attached]

@sl0thentr0py
Member

ty @monotykamary I will review this in a few days and see how best to package the new integration.
