How to stream the full state of your graph¶
LangGraph supports multiple streaming modes. The main ones are:

- `values`: This streaming mode streams back values of the graph. This is the full state of the graph after each node is called.
- `updates`: This streaming mode streams back updates to the graph. This is the update to the state of the graph after each node is called.

This guide covers `streamMode="values"`.
Define the state¶

The state is the interface for all of the nodes in our graph.
```typescript
import { Annotation } from "@langchain/langgraph";
import { BaseMessage } from "@langchain/core/messages";

const StateAnnotation = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (x, y) => x.concat(y),
  }),
});
```
Set up the tools¶

We will first define the tools we want to use. For this simple example, we will create a placeholder search engine. However, it is really easy to create your own tools - see the documentation here on how to do that.
```typescript
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const searchTool = tool(async ({ query: _query }: { query: string }) => {
  // This is a placeholder for the actual implementation
  return "Cold, with a low of 3℃";
}, {
  name: "search",
  description:
    "Use to surf the web, fetch current information, check the weather, and retrieve other information.",
  schema: z.object({
    query: z.string().describe("The query to use in your search."),
  }),
});

await searchTool.invoke({ query: "What's the weather like?" });

const tools = [searchTool];
```
We can now wrap these tools in a simple ToolNode. This object will actually run the tools (functions) whenever they are invoked by our LLM.
Set up the model¶

Now we will load the chat model.

- It should work with messages. We will represent all agent state in the form of messages, so it needs to be able to work well with them.
- It should work with tool calling, meaning it can return function arguments in its response.
Note

These model requirements are not general requirements for using LangGraph - they are just the requirements for this example.
After we've done this, we should make sure the model knows that it has these tools available to call. We can do this by calling bindTools.
Define the graph¶

We can now put it all together.
```typescript
import { END, START, StateGraph } from "@langchain/langgraph";
import { AIMessage } from "@langchain/core/messages";

const routeMessage = (state: typeof StateAnnotation.State) => {
  const { messages } = state;
  const lastMessage = messages[messages.length - 1] as AIMessage;
  // If no tools are called, we can finish (respond to the user)
  if (!lastMessage?.tool_calls?.length) {
    return END;
  }
  // Otherwise if there are, we continue and call the tools
  return "tools";
};

const callModel = async (
  state: typeof StateAnnotation.State,
) => {
  // For versions of @langchain/core < 0.2.3, you must call `.stream()`
  // and aggregate the message from chunks instead of calling `.invoke()`.
  const { messages } = state;
  const responseMessage = await boundModel.invoke(messages);
  return { messages: [responseMessage] };
};

const workflow = new StateGraph(StateAnnotation)
  .addNode("agent", callModel)
  .addNode("tools", toolNode)
  .addEdge(START, "agent")
  .addConditionalEdges("agent", routeMessage)
  .addEdge("tools", "agent");

const graph = workflow.compile();
```
Stream values¶

We can now interact with the agent. Between interactions, you can get and update state.
```typescript
let inputs = { messages: [{ role: "user", content: "what's the weather in sf" }] };

for await (
  const chunk of await graph.stream(inputs, {
    streamMode: "values",
  })
) {
  console.log(chunk["messages"]);
  console.log("\n====\n");
}
```
```
[ [ 'user', "what's the weather in sf" ] ]

====

[
  [ 'user', "what's the weather in sf" ],
  AIMessage {
    "id": "chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU",
    "content": "",
    "additional_kwargs": {
      "tool_calls": [
        {
          "id": "call_iD5Wk4vPsTckffDKJpEQaMkg",
          "type": "function",
          "function": "[Object]"
        }
      ]
    },
    "response_metadata": {
      "tokenUsage": {
        "completionTokens": 17,
        "promptTokens": 70,
        "totalTokens": 87
      },
      "finish_reason": "tool_calls",
      "system_fingerprint": "fp_3aa7262c27"
    },
    "tool_calls": [
      {
        "name": "search",
        "args": {
          "query": "current weather in San Francisco"
        },
        "type": "tool_call",
        "id": "call_iD5Wk4vPsTckffDKJpEQaMkg"
      }
    ],
    "invalid_tool_calls": [],
    "usage_metadata": {
      "input_tokens": 70,
      "output_tokens": 17,
      "total_tokens": 87
    }
  }
]

====

[
  [ 'user', "what's the weather in sf" ],
  AIMessage {
    "id": "chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU",
    "content": "",
    "additional_kwargs": {
      "tool_calls": [
        {
          "id": "call_iD5Wk4vPsTckffDKJpEQaMkg",
          "type": "function",
          "function": "[Object]"
        }
      ]
    },
    "response_metadata": {
      "tokenUsage": {
        "completionTokens": 17,
        "promptTokens": 70,
        "totalTokens": 87
      },
      "finish_reason": "tool_calls",
      "system_fingerprint": "fp_3aa7262c27"
    },
    "tool_calls": [
      {
        "name": "search",
        "args": {
          "query": "current weather in San Francisco"
        },
        "type": "tool_call",
        "id": "call_iD5Wk4vPsTckffDKJpEQaMkg"
      }
    ],
    "invalid_tool_calls": [],
    "usage_metadata": {
      "input_tokens": 70,
      "output_tokens": 17,
      "total_tokens": 87
    }
  },
  ToolMessage {
    "content": "Cold, with a low of 3℃",
    "name": "search",
    "additional_kwargs": {},
    "response_metadata": {},
    "tool_call_id": "call_iD5Wk4vPsTckffDKJpEQaMkg"
  }
]

====

[
  [ 'user', "what's the weather in sf" ],
  AIMessage {
    "id": "chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU",
    "content": "",
    "additional_kwargs": {
      "tool_calls": [
        {
          "id": "call_iD5Wk4vPsTckffDKJpEQaMkg",
          "type": "function",
          "function": "[Object]"
        }
      ]
    },
    "response_metadata": {
      "tokenUsage": {
        "completionTokens": 17,
        "promptTokens": 70,
        "totalTokens": 87
      },
      "finish_reason": "tool_calls",
      "system_fingerprint": "fp_3aa7262c27"
    },
    "tool_calls": [
      {
        "name": "search",
        "args": {
          "query": "current weather in San Francisco"
        },
        "type": "tool_call",
        "id": "call_iD5Wk4vPsTckffDKJpEQaMkg"
      }
    ],
    "invalid_tool_calls": [],
    "usage_metadata": {
      "input_tokens": 70,
      "output_tokens": 17,
      "total_tokens": 87
    }
  },
  ToolMessage {
    "content": "Cold, with a low of 3℃",
    "name": "search",
    "additional_kwargs": {},
    "response_metadata": {},
    "tool_call_id": "call_iD5Wk4vPsTckffDKJpEQaMkg"
  },
  AIMessage {
    "id": "chatcmpl-9y660ZKNXvziVJze0X5aTlZ5IoN35",
    "content": "Currently, in San Francisco, it's cold with a temperature of around 3℃ (37.4°F).",
    "additional_kwargs": {},
    "response_metadata": {
      "tokenUsage": {
        "completionTokens": 23,
        "promptTokens": 103,
        "totalTokens": 126
      },
      "finish_reason": "stop",
      "system_fingerprint": "fp_3aa7262c27"
    },
    "tool_calls": [],
    "invalid_tool_calls": [],
    "usage_metadata": {
      "input_tokens": 103,
      "output_tokens": 23,
      "total_tokens": 126
    }
  }
]

====
```