feat: adds langfuse callback handler #11324

Merged: logan-markewich merged 8 commits into run-llama:main from hassiebp:feat-adds-langfuse-callback-handler on Mar 1, 2024.
Commits (8, all by hassiebp):

- 0de601a feat: adds langfuse callback handler
- 4646822 feat: adds langfuse demo gif
- 8290887 feat: passes sdk_integration to llama-index callback handler
- 7fb5d93 bumps langfuse version
- 2e4e202 adds readme and link to example notebook
- 74565c5 lint changes
- 8d5988c Merge branch 'main' into feat-adds-langfuse-callback-handler
- ba73637 updates mappings
docs/examples/callbacks/LangfuseCallbackHandler.ipynb (new file, 280 additions; rendered below)
# Langfuse Callback Handler

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/callbacks/LangfuseCallbackHandler.ipynb)

[Langfuse](https://langfuse.com/docs) is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications.

The `LangfuseCallbackHandler` integrates with Langfuse and lets you track and monitor the performance, traces, and metrics of your LlamaIndex application. Detailed traces of LlamaIndex's context augmentation and LLM querying are captured and can be inspected directly in the Langfuse UI.
## Setup

### Install packages

```python
# %pip install llama-index llama-index-callbacks-langfuse
%pip install llama-index langfuse
```
### Configure environment

If you haven't done so yet, [sign up on Langfuse](https://cloud.langfuse.com/auth/sign-up) and obtain your API keys from the project settings.

```python
import os

# Langfuse
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region; 🇺🇸 US region: "https://us.cloud.langfuse.com"

# OpenAI
os.environ["OPENAI_API_KEY"] = "sk-..."
```
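A misconfigured key usually surfaces only later, as a silent tracing failure. The sketch below is an optional sanity check and is not part of the Langfuse SDK; the variable names match the cell above, and the placeholder values stand in for your real keys.

```python
import os

# Illustrative sanity check (not part of the Langfuse SDK). The placeholder
# values mirror the configuration cell above; replace them with real keys.
os.environ.setdefault("LANGFUSE_SECRET_KEY", "sk-lf-...")
os.environ.setdefault("LANGFUSE_PUBLIC_KEY", "pk-lf-...")
os.environ.setdefault("LANGFUSE_HOST", "https://cloud.langfuse.com")

REQUIRED_VARS = ["LANGFUSE_SECRET_KEY", "LANGFUSE_PUBLIC_KEY", "LANGFUSE_HOST"]


def missing_langfuse_vars() -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not os.environ.get(name)]


# Fail fast before any LlamaIndex calls run untraced.
if missing_langfuse_vars():
    raise RuntimeError(f"Missing Langfuse env vars: {missing_langfuse_vars()}")
```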
### Register the Langfuse callback handler

#### Option 1: Set the global LlamaIndex handler

```python
import llama_index.core
from llama_index.core import set_global_handler

set_global_handler("langfuse")
langfuse_callback_handler = llama_index.core.global_handler
```
#### Option 2: Use the Langfuse callback directly

```python
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler

langfuse_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([langfuse_callback_handler])
```
### Flush events to Langfuse

The Langfuse SDKs queue and batch events in the background to reduce the number of network requests and improve overall performance. Before exiting your application, make sure all queued events have been flushed to the Langfuse servers.

```python
# ... your LlamaIndex calls here ...

langfuse_callback_handler.flush()
```

Done! ✨ Traces and metrics from your LlamaIndex application are now automatically tracked in Langfuse. If you construct a new index or query an LLM with your documents in context, your traces and metrics are immediately visible in the Langfuse UI. Next, let's take a look at how traces appear in Langfuse.
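If your application can exit from several code paths, one common pattern is to register the flush as an exit hook so it runs no matter how the process terminates. The sketch below uses a stand-in handler class so it runs without the langfuse package installed; with the real handler you would register `langfuse_callback_handler.flush` instead.

```python
import atexit


# Stand-in for the real LlamaIndexCallbackHandler, used here only so this
# sketch runs without the langfuse package installed.
class StubHandler:
    def __init__(self) -> None:
        self.flushed = False

    def flush(self) -> None:
        # The real handler sends all queued events to the Langfuse servers.
        self.flushed = True


handler = StubHandler()

# Ensure queued events are flushed even if the program exits early.
# With Langfuse installed: atexit.register(langfuse_callback_handler.flush)
atexit.register(handler.flush)
```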
## Example

Fetch and save the example data.

```python
!mkdir -p 'data/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham_essay.txt'
```
Run an example index construction, query, and chat.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Create the index
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Execute a query
query_engine = index.as_query_engine()
query_response = query_engine.query("What did the author do growing up?")
print(query_response)

# Execute a chat query
chat_engine = index.as_chat_engine()
chat_response = chat_engine.chat("What did the author do growing up?")
print(chat_response)

# Flush the callback handler so the results show up in Langfuse immediately
langfuse_callback_handler.flush()
```

Done! ✨ You will now see traces of your index and query in your Langfuse project.

Example traces (public links):

1. [Index construction](https://cloud.langfuse.com/project/clsuh9o2y0000mbztvdptt1mh/traces/1294ed01-8193-40a5-bb4e-2f0723d2c827)
2. [Query Engine](https://cloud.langfuse.com/project/clsuh9o2y0000mbztvdptt1mh/traces/eaa4ea74-78e0-42ef-ace0-7aa02c6fbbc6)
3. [Chat Engine](https://cloud.langfuse.com/project/clsuh9o2y0000mbztvdptt1mh/traces/d95914f5-66eb-4520-b996-49e84fd7f323)
### 📚 More details

Check out the full [Langfuse documentation](https://langfuse.com/docs) for more details on Langfuse's tracing and analytics capabilities and how to make the most of this integration.
llama-index-integrations/callbacks/llama-index-callbacks-langfuse/BUILD (new file, 3 additions)

```
poetry_requirements(
    name="poetry",
)
```
llama-index-integrations/callbacks/llama-index-callbacks-langfuse/Makefile (new file, 17 additions)

```make
GIT_ROOT ?= $(shell git rev-parse --show-toplevel)

help:	## Show all Makefile targets.
	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[33m%-30s\033[0m %s\n", $$1, $$2}'

format:	## Run code autoformatters (black).
	pre-commit install
	git ls-files | xargs pre-commit run black --files

lint:	## Run linters: pre-commit (black, ruff, codespell) and mypy
	pre-commit install && git ls-files | xargs pre-commit run --show-diff-on-failure --files

test:	## Run tests via pytest.
	pytest tests

watch-docs:	## Build and watch documentation.
	sphinx-autobuild docs/ docs/_build/html --open-browser --watch $(GIT_ROOT)/llama_index/
```
llama-index-integrations/callbacks/llama-index-callbacks-langfuse/README.md (new file, 1 addition)

```markdown
# LlamaIndex Callbacks Integration: Langfuse
```
...ntegrations/callbacks/llama-index-callbacks-langfuse/llama_index/callbacks/langfuse/BUILD (new file, 1 addition)

```
python_sources()
```
...tions/callbacks/llama-index-callbacks-langfuse/llama_index/callbacks/langfuse/__init__.py (new file, 3 additions)

```python
from llama_index.callbacks.langfuse.base import langfuse_callback_handler

__all__ = ["langfuse_callback_handler"]
```
...egrations/callbacks/llama-index-callbacks-langfuse/llama_index/callbacks/langfuse/base.py (new file, 9 additions)

```python
from typing import Any

from llama_index.core.callbacks.base_handler import BaseCallbackHandler

from langfuse.llama_index import LlamaIndexCallbackHandler


def langfuse_callback_handler(**eval_params: Any) -> BaseCallbackHandler:
    return LlamaIndexCallbackHandler(**eval_params)
```
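Note that this module exposes a small factory function rather than a handler subclass. Conceptually, a string key like `"langfuse"` can be resolved to such a factory and called with keyword arguments. The sketch below illustrates that dispatch pattern with stand-in names; it is not LlamaIndex's actual resolution code, and the `sdk_integration` value is hypothetical (one of the PR's commits passes such a parameter through to the handler).

```python
from typing import Any, Callable, Dict


# Stand-in factory mirroring langfuse_callback_handler above; the real one
# returns LlamaIndexCallbackHandler(**eval_params).
def stub_langfuse_factory(**eval_params: Any) -> Dict[str, Any]:
    return {"handler": "langfuse", **eval_params}


# A registry mapping handler names to factories, as a global-handler
# mechanism might use (illustrative, not LlamaIndex internals).
FACTORIES: Dict[str, Callable[..., Any]] = {"langfuse": stub_langfuse_factory}


def resolve_handler(name: str, **params: Any) -> Any:
    """Look up a factory by name and build a handler from keyword args."""
    return FACTORIES[name](**params)


# Hypothetical parameter value, for illustration only.
handler = resolve_handler("langfuse", sdk_integration="llama-index")
```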
Review comment: Since we are here, can we fill out the README a bit? Tbh it can just be copy-pasted from the docs (this will show up on LlamaHub).