Langfuse Callback Handler¶
⚠️ This integration is deprecated. We recommend using the new instrumentation-based integration with Langfuse as described here.
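For orientation, a minimal sketch of that instrumentation-based setup is shown below; it assumes the LlamaIndexInstrumentor shipped with recent versions of the langfuse SDK, so check the linked Langfuse documentation for the exact, current API.

from langfuse.llama_index import LlamaIndexInstrumentor

# Assumed API: instrument LlamaIndex globally and send traces to Langfuse
instrumentor = LlamaIndexInstrumentor()
instrumentor.start()

# ... your LlamaIndex calls here ...

# Make sure all buffered traces are sent before the process exits
instrumentor.flush()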
This tutorial shows how to monitor your LlamaIndex application with the Langfuse callback handler.
What is Langfuse?¶
Langfuse is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. Langfuse offers a simple integration for automatically capturing the traces and metrics generated in LlamaIndex applications.
How does it work?¶
The LangfuseCallbackHandler integrates with Langfuse and lets you seamlessly track and monitor the performance, traces, and metrics of your LlamaIndex application. Detailed traces of the LlamaIndex context-augmentation and LLM querying processes are captured and can be inspected directly in the Langfuse UI.

Setup¶
Install packages¶
In [ ]:
%pip install llama-index llama-index-callbacks-langfuse
Configure environment¶
If you haven't done so yet, sign up on Langfuse and obtain your API keys from the project settings.
In [ ]:
import os
# Get keys for your project from the project settings page https://cloud.langfuse.com
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com" # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com" # 🇺🇸 US region
# OpenAI
os.environ["OPENAI_API_KEY"] = "sk-..."
Register the Langfuse callback handler¶
Option 1: Set a global LlamaIndex handler¶
In [ ]:
from llama_index.core import global_handler, set_global_handler
set_global_handler("langfuse")
langfuse_callback_handler = global_handler
Option 2: Use the Langfuse callback directly¶
In [ ]:
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler
langfuse_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([langfuse_callback_handler])
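Because Settings.callback_manager takes a list of handlers, the Langfuse callback can run alongside other callbacks. The sketch below combines it with LlamaIndex's built-in LlamaDebugHandler for local trace printing; adapt the handler list to your needs.

from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager, LlamaDebugHandler
from langfuse.llama_index import LlamaIndexCallbackHandler

# Send traces to Langfuse and also print them locally for quick debugging
langfuse_callback_handler = LlamaIndexCallbackHandler()
debug_handler = LlamaDebugHandler(print_trace_on_end=True)
Settings.callback_manager = CallbackManager([debug_handler, langfuse_callback_handler])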
Flush events to Langfuse¶
The Langfuse SDK queues and batches events in the background to reduce the number of network requests and improve overall performance. Before exiting your application, make sure all queued events have been flushed to the Langfuse servers.
In [ ]:
# ... your LlamaIndex calls here ...
langfuse_callback_handler.flush()
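If your application can exit from several code paths, one simple way to guarantee the flush always runs is to register it as an exit hook with Python's standard atexit module, as sketched below.

import atexit

# Flush any queued Langfuse events when the interpreter shuts down
atexit.register(langfuse_callback_handler.flush)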
Done! ✨ Traces and metrics from your LlamaIndex application are now automatically logged to Langfuse. Whenever you construct a new index or query an LLM with your documents in context, the corresponding traces and metrics appear immediately in the Langfuse UI. Next, let's look at how these traces appear in Langfuse.
Example¶
Fetch and save example data.
In [ ]:
!mkdir -p 'data/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham_essay.txt'
Run an example index construction, query, and chat.
In [ ]:
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
# Create index
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
# Execute query
query_engine = index.as_query_engine()
query_response = query_engine.query("What did the author do growing up?")
print(query_response)
# Execute chat query
chat_engine = index.as_chat_engine()
chat_response = chat_engine.chat("What did the author do growing up?")
print(chat_response)
# As we want to immediately see result in Langfuse, we need to flush the callback handler
langfuse_callback_handler.flush()
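To make traces easier to filter in the Langfuse UI, you can attach metadata such as a user or session ID before issuing queries. The sketch below assumes the handler's set_trace_params method described in the Langfuse documentation; verify the exact parameter names against your SDK version.

# Assumed API: tag subsequent traces with user/session metadata
langfuse_callback_handler.set_trace_params(
    user_id="user-123",
    session_id="session-abc",
    tags=["demo"],
)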