How to add thread-level persistence to your graph¶
Many AI applications need memory to share context across multiple interactions. In LangGraph, this kind of memory can be added to any StateGraph using thread-level persistence.
When creating any LangGraph graph, you can set it up to persist its state by adding a checkpointer when compiling the graph:
from langgraph.checkpoint.memory import MemorySaver
checkpointer = MemorySaver()
graph.compile(checkpointer=checkpointer)
This guide shows how to add thread-level persistence to your graph.
Note
If you need memory that is shared across multiple conversations or users (cross-thread persistence), check out this how-to guide.
Setup¶
First, we need to install the required packages:
pip install -U langgraph langchain_anthropic
Next, we need to set the API key for Anthropic (the LLM we will use).
import getpass
import os


def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")


_set_env("ANTHROPIC_API_KEY")
Set up LangSmith for LangGraph development
Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph. Read more about how to get started here.
Define graph¶
We will be using a single-node graph that calls a chat model.
Let's first define the model we'll be using:
from langchain_anthropic import ChatAnthropic
model = ChatAnthropic(model="claude-3-5-sonnet-20240620")
API Reference:
ChatAnthropic
We can now define our StateGraph and add our model-calling node:
from typing import Annotated

from typing_extensions import TypedDict

from langgraph.graph import StateGraph, MessagesState, START


def call_model(state: MessagesState):
    response = model.invoke(state["messages"])
    return {"messages": response}


builder = StateGraph(MessagesState)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")
graph = builder.compile()
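Note that `call_model` returns `{"messages": response}` with a single message, not the full history. This works because the `messages` key in MessagesState is merged with an append-style reducer. A rough pure-Python sketch of that merge behavior (an assumption for illustration, not LangGraph's actual source):

```python
# Sketch of an append-style reducer: a node's returned message(s) are
# added to the running history instead of replacing it.
def append_messages(existing: list, update) -> list:
    # accept either a single message or a list of messages
    new = update if isinstance(update, list) else [update]
    return existing + new


history = [{"type": "user", "content": "hi! I'm bob"}]
node_output = {"type": "ai", "content": "Hello Bob!"}
history = append_messages(history, node_output)
print([m["type"] for m in history])  # ['user', 'ai']
```

This is why each node only needs to return the new messages it produced.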
API Reference:
StateGraph | START
If we try to use this graph, the context of the conversation will not be persisted across interactions:
input_message = {"type": "user", "content": "hi! I'm bob"}
for chunk in graph.stream({"messages": [input_message]}, stream_mode="values"):
    chunk["messages"][-1].pretty_print()

input_message = {"type": "user", "content": "what's my name?"}
for chunk in graph.stream({"messages": [input_message]}, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

hi! I'm bob
================================== Ai Message ==================================

Hello Bob! It's nice to meet you. How are you doing today? Is there anything I can help you with or would you like to chat about something in particular?
================================ Human Message =================================

what's my name?
================================== Ai Message ==================================

I apologize, but I don't have access to your personal information, including your name. I'm an AI language model designed to provide general information and answer questions to the best of my ability based on my training data. I don't have any information about individual users or their personal details. If you'd like to share your name, you're welcome to do so, but I won't be able to recall it in future conversations.
Add persistence¶
To add persistence, we need to pass in a Checkpointer when compiling the graph.
from langgraph.checkpoint.memory import MemorySaver
memory = MemorySaver()
graph = builder.compile(checkpointer=memory)
API Reference:
MemorySaver
Note
If you're using LangGraph Cloud or LangGraph Studio, you don't need to pass a checkpointer when compiling the graph, since it's done automatically.
We can now interact with the agent and see that it remembers previous messages!
config = {"configurable": {"thread_id": "1"}}
input_message = {"type": "user", "content": "hi! I'm bob"}
for chunk in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

hi! I'm bob
================================== Ai Message ==================================

Hello Bob! It's nice to meet you. How are you doing today? Is there anything in particular you'd like to chat about or any questions you have that I can help you with?
input_message = {"type": "user", "content": "what's my name?"}
for chunk in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's my name?
================================== Ai Message ==================================

Your name is Bob, as you introduced yourself at the beginning of our conversation.
If we want to start a new conversation, we can pass in a different thread_id. Poof! All the memories are gone!
input_message = {"type": "user", "content": "what's my name?"}
for chunk in graph.stream(
    {"messages": [input_message]},
    {"configurable": {"thread_id": "2"}},
    stream_mode="values",
):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's my name?
================================== Ai Message ==================================

I apologize, but I don't have access to your personal information, including your name. As an AI language model, I don't have any information about individual users unless it's provided within the conversation. If you'd like to share your name, you're welcome to do so, but otherwise, I won't be able to know or guess it.
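The behavior above comes down to one idea: the checkpointer keeps a separate saved history per thread_id. A standalone plain-Python sketch of that idea (the `run_turn` helper and its echo "model" are hypothetical, for illustration only, and assume nothing about LangGraph internals):

```python
# One saved conversation history per thread_id: reusing a thread_id
# continues that thread, while a new thread_id starts from scratch.
store: dict = {}


def run_turn(thread_id: str, user_text: str) -> list:
    history = store.setdefault(thread_id, [])
    history.append({"type": "user", "content": user_text})
    # stand-in for the chat model: it just echoes the user
    history.append({"type": "ai", "content": f"you said: {user_text}"})
    return history


run_turn("1", "hi! I'm bob")      # thread "1" now has 2 messages
run_turn("2", "what's my name?")  # thread "2" starts from an empty history
run_turn("1", "what's my name?")  # thread "1" continues where it left off
print(len(store["1"]), len(store["2"]))  # 4 2
```

Switching back to thread "1" would again pick up the full earlier conversation, which is exactly what reusing `{"configurable": {"thread_id": "1"}}` does with a real checkpointer.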