How to stream state updates of your graph

LangGraph supports multiple streaming modes. The main ones are:

  • values: This mode streams back values of the graph, i.e. the full state of the graph after each node is called
  • updates: This mode streams back updates to the graph, i.e. the state update produced by each node after it is called

This guide covers stream_mode="updates".
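
To make the difference concrete, here is a minimal sketch; it assumes the graph and inputs that are defined later in this guide:

# stream_mode="values" yields the full graph state after each node runs
async for state in graph.astream(inputs, stream_mode="values"):
    print(state["messages"])  # the complete, accumulated message list

# stream_mode="updates" yields only what each node returned
async for chunk in graph.astream(inputs, stream_mode="updates"):
    print(chunk)  # e.g. {"agent": {"messages": [...]}}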

Setup

First, let's install the required packages and set our API keys.

%%capture --no-stderr
%pip install -U langgraph langchain-openai langchain-community
import getpass
import os


def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")


_set_env("OPENAI_API_KEY")

Set up LangSmith for LangGraph development

Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM applications built with LangGraph. Read more about how to get started here.

Define the graph

We will use a simple ReAct agent for this how-to guide.

from typing import Literal
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.runnables import ConfigurableField
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


@tool
def get_weather(city: Literal["nyc", "sf"]):
    """使用此来获取天气信息。"""
    if city == "nyc":
        return "It might be cloudy in nyc"
    elif city == "sf":
        return "It's always sunny in sf"
    else:
        raise AssertionError("Unknown city")


tools = [get_weather]

model = ChatOpenAI(model_name="gpt-4o", temperature=0)
graph = create_react_agent(model, tools)
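
If you want to sanity-check the agent before streaming, a single non-streaming call also works (a quick sketch, not required for the rest of the guide):

result = graph.invoke({"messages": [("human", "what's the weather in nyc")]})
print(result["messages"][-1].content)  # the agent's final answer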

Stream updates

inputs = {"messages": [("human", "what's the weather in sf")]}
async for chunk in graph.astream(inputs, stream_mode="updates"):
    for node, values in chunk.items():
        print(f"Receiving update from node: '{node}'")
        print(values)
        print("\n\n")
Receiving update from node: 'agent'
{'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_kc6cvcEkTAUGRlSHrP4PK9fn', 'function': {'arguments': '{"city":"sf"}', 'name': 'get_weather'}, 'type': 'function'}]}, response_metadata={'token_usage': {'completion_tokens': 14, 'prompt_tokens': 57, 'total_tokens': 71}, 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_3e7d703517', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-cd68b3a0-86c3-4afa-9649-1b962a0dd062-0', tool_calls=[{'name': 'get_weather', 'args': {'city': 'sf'}, 'id': 'call_kc6cvcEkTAUGRlSHrP4PK9fn'}], usage_metadata={'input_tokens': 57, 'output_tokens': 14, 'total_tokens': 71})]}



Receiving update from node: 'tools'
{'messages': [ToolMessage(content="It's always sunny in sf", name='get_weather', tool_call_id='call_kc6cvcEkTAUGRlSHrP4PK9fn')]}



Receiving update from node: 'agent'
{'messages': [AIMessage(content='The weather in San Francisco is currently sunny.', response_metadata={'token_usage': {'completion_tokens': 10, 'prompt_tokens': 84, 'total_tokens': 94}, 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_3e7d703517', 'finish_reason': 'stop', 'logprobs': None}, id='run-009d83c4-b874-4acc-9494-20aba43132b9-0', usage_metadata={'input_tokens': 84, 'output_tokens': 10, 'total_tokens': 94})]}
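
Outside of an async context, the synchronous graph.stream method can be used the same way. The sketch below also uses pretty_print(), available on LangChain message objects, to render each returned message:

for chunk in graph.stream(inputs, stream_mode="updates"):
    for node, values in chunk.items():
        print(f"Receiving update from node: '{node}'")
        for message in values["messages"]:
            message.pretty_print()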
