Tracing and Session support on Integrations like LiteLLM Proxy and others and Is LangFuse logging a background process? #1536
-
Hey people, there isn't any clear documentation on how to add tracing and tags when using integrations. Also, is the call to Langfuse handled asynchronously, so the main return won't be blocked? I'm using LiteLLM -> Langfuse -> Guardrails, and this chain is causing delay: generation is fast but the next two steps are slow. I know Guardrails will take time, especially when language models are involved, but I just wanted to confirm whether Langfuse logs as a background process.
Replies: 1 comment 1 reply
-
Langfuse works fully async on a separate background thread to send events to the API.
The LiteLLM docs are quite extensive on this: https://litellm.vercel.app/docs/observability/langfuse_integration
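
To illustrate the pattern described above (not Langfuse's actual implementation): events are pushed onto a queue and a daemon thread drains it, so the caller returns immediately and the slow network I/O never blocks the generation path. The class and method names here are hypothetical.

```python
import queue
import threading


class BackgroundLogger:
    """Sketch of queue-plus-daemon-thread logging: log() is non-blocking;
    a background worker ships events (here, appends to a list instead of
    making real HTTP calls to an observability API)."""

    def __init__(self):
        self.events = queue.Queue()
        self.sent = []  # stands in for batched POSTs to the tracing backend
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def log(self, event):
        # Enqueue and return immediately -- the caller is never blocked.
        self.events.put(event)

    def _drain(self):
        while True:
            event = self.events.get()
            self.sent.append(event)  # a real implementation would POST here
            self.events.task_done()

    def flush(self):
        # Block until every queued event has been processed
        # (useful before process exit so no events are dropped).
        self.events.join()


logger = BackgroundLogger()
logger.log({"name": "generation", "tags": ["litellm"]})
logger.flush()
print(len(logger.sent))
```

Because sending happens off the main thread, remember to flush (or give the worker time) before a short-lived process exits, or the last events may be lost.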