How to stream the full state of your graph¶
LangGraph supports multiple streaming modes. The main ones are:

- values: This streaming mode streams back values of the graph. This is the **full state of the graph** after each node is called.
- updates: This streaming mode streams back updates to the graph. This is the **update to the state of the graph** after each node is called.

This guide covers streamMode="values".
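The difference between the two modes can be sketched with a toy pipeline in plain TypeScript (no LangGraph involved; the node functions and message strings here are made up for illustration):

```typescript
// A minimal model of how the two stream modes differ.
type State = { messages: string[] };
type Update = { messages: string[] };

// Each "node" returns only the messages it wants to append.
const nodes: Array<(s: State) => Update> = [
  () => ({ messages: ["agent: calling the search tool"] }),
  () => ({ messages: ["tool: Cold, with a low of 3℃"] }),
];

// Merge an update into the state with a concat, mirroring a messages reducer.
const applyUpdate = (s: State, u: Update): State => ({
  messages: s.messages.concat(u.messages),
});

// streamMode: "values" — yields the FULL accumulated state after each node.
function* streamValues(initial: State) {
  let state = initial;
  for (const node of nodes) {
    state = applyUpdate(state, node(state));
    yield state;
  }
}

// streamMode: "updates" — yields only what each node changed.
function* streamUpdates(initial: State) {
  let state = initial;
  for (const node of nodes) {
    const update = node(state);
    state = applyUpdate(state, update);
    yield update;
  }
}
```

With "values", each chunk repeats everything accumulated so far; with "updates", each chunk carries only the latest node's contribution.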
In [1]:
// process.env.OPENAI_API_KEY = "sk-...";
Define the state¶
The state is the interface for all of the nodes in our graph.
In [2]:
import { Annotation } from "@langchain/langgraph";
import { BaseMessage } from "@langchain/core/messages";
const StateAnnotation = Annotation.Root({
messages: Annotation<BaseMessage[]>({
reducer: (x, y) => x.concat(y),
}),
});
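The reducer is what turns each node's return value into accumulated state: every `{ messages: [...] }` a node returns is concatenated onto the existing channel. A minimal sketch, with plain strings standing in for BaseMessage objects:

```typescript
// Same reducer shape as in the annotation above, typed over strings.
const reducer = (x: string[], y: string[]) => x.concat(y);

// Each node return is folded into the channel in arrival order.
let channel: string[] = [];
channel = reducer(channel, ["user: what's the weather in sf"]);
channel = reducer(channel, [
  "ai: (tool call: search)",
  "tool: Cold, with a low of 3℃",
]);
```

Because the reducer appends rather than replaces, each streamed "values" chunk contains the whole conversation so far.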
In [3]:
import { tool } from "@langchain/core/tools";
import { z } from "zod";
const searchTool = tool(async ({ query: _query }: { query: string }) => {
// This is a placeholder for the actual implementation
return "Cold, with a low of 3℃";
}, {
name: "search",
description:
"Use to surf the web, fetch current information, check the weather, and retrieve other information.",
schema: z.object({
query: z.string().describe("The query to use in your search."),
}),
});
await searchTool.invoke({ query: "What's the weather like?" });
const tools = [searchTool];
We can now wrap these tools in a simple ToolNode. This object will actually run the tools (functions) whenever they are invoked by our LLM.
In [4]:
import { ToolNode } from "@langchain/langgraph/prebuilt";
const toolNode = new ToolNode(tools);
In [5]:
import { ChatOpenAI } from "@langchain/openai";
const model = new ChatOpenAI({ model: "gpt-4o" });
After that, we should make sure the model knows that it has these tools available to call. We can do this by calling bindTools.
In [6]:
const boundModel = model.bindTools(tools);
Define the graph¶
We can now put it all together.
In [7]:
import { END, START, StateGraph } from "@langchain/langgraph";
import { AIMessage } from "@langchain/core/messages";
const routeMessage = (state: typeof StateAnnotation.State) => {
const { messages } = state;
const lastMessage = messages[messages.length - 1] as AIMessage;
// If no tools are called, we can finish (respond to the user)
if (!lastMessage?.tool_calls?.length) {
return END;
}
// Otherwise if there is, we continue and call the tools
return "tools";
};
const callModel = async (
state: typeof StateAnnotation.State,
) => {
// For versions of @langchain/core < 0.2.3, you must call `.stream()`
// and aggregate the message from chunks instead of calling `.invoke()`.
const { messages } = state;
const responseMessage = await boundModel.invoke(messages);
return { messages: [responseMessage] };
};
const workflow = new StateGraph(StateAnnotation)
.addNode("agent", callModel)
.addNode("tools", toolNode)
.addEdge(START, "agent")
.addConditionalEdges("agent", routeMessage)
.addEdge("tools", "agent");
const graph = workflow.compile();
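The conditional edge's branching can be exercised in isolation. This sketch reimplements the same check with simplified stand-in message types (the `ToolCall`/`Message` shapes are invented for illustration, and `END` is modeled as a plain sentinel string):

```typescript
type ToolCall = { name: string };
type Message = { tool_calls?: ToolCall[] };

const END = "__end__"; // stand-in sentinel for LangGraph's END

// Same logic as routeMessage: finish when the last message
// carries no tool calls, otherwise hand off to the "tools" node.
const route = (messages: Message[]): string => {
  const last = messages[messages.length - 1];
  return last?.tool_calls?.length ? "tools" : END;
};
```

The optional chaining means an empty message list, or a last message without tool calls, both route to END rather than throwing.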
Stream values¶
We can now interact with the agent. Between interactions, you can get and update state.
In [8]:
let inputs = { messages: [{ role: "user", content: "what's the weather in sf" }] };
for await (
const chunk of await graph.stream(inputs, {
streamMode: "values",
})
) {
console.log(chunk["messages"]);
console.log("\n====\n");
}
[ [ 'user', "what's the weather in sf" ] ]

====

[
  [ 'user', "what's the weather in sf" ],
  AIMessage { "id": "chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_iD5Wk4vPsTckffDKJpEQaMkg", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 70, "totalTokens": 87 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_3aa7262c27" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_iD5Wk4vPsTckffDKJpEQaMkg" } ], "invalid_tool_calls": [], "usage_metadata": { "input_tokens": 70, "output_tokens": 17, "total_tokens": 87 } }
]

====

[
  [ 'user', "what's the weather in sf" ],
  AIMessage { "id": "chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_iD5Wk4vPsTckffDKJpEQaMkg", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 70, "totalTokens": 87 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_3aa7262c27" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_iD5Wk4vPsTckffDKJpEQaMkg" } ], "invalid_tool_calls": [], "usage_metadata": { "input_tokens": 70, "output_tokens": 17, "total_tokens": 87 } },
  ToolMessage { "content": "Cold, with a low of 3℃", "name": "search", "additional_kwargs": {}, "response_metadata": {}, "tool_call_id": "call_iD5Wk4vPsTckffDKJpEQaMkg" }
]

====

[
  [ 'user', "what's the weather in sf" ],
  AIMessage { "id": "chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_iD5Wk4vPsTckffDKJpEQaMkg", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 70, "totalTokens": 87 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_3aa7262c27" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_iD5Wk4vPsTckffDKJpEQaMkg" } ], "invalid_tool_calls": [], "usage_metadata": { "input_tokens": 70, "output_tokens": 17, "total_tokens": 87 } },
  ToolMessage { "content": "Cold, with a low of 3℃", "name": "search", "additional_kwargs": {}, "response_metadata": {}, "tool_call_id": "call_iD5Wk4vPsTckffDKJpEQaMkg" },
  AIMessage { "id": "chatcmpl-9y660ZKNXvziVJze0X5aTlZ5IoN35", "content": "Currently, in San Francisco, it's cold with a temperature of around 3℃ (37.4°F).", "additional_kwargs": {}, "response_metadata": { "tokenUsage": { "completionTokens": 23, "promptTokens": 103, "totalTokens": 126 }, "finish_reason": "stop", "system_fingerprint": "fp_3aa7262c27" }, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": { "input_tokens": 103, "output_tokens": 23, "total_tokens": 126 } }
]

====
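The `for await` consumption pattern used above works against any async iterable, not just a compiled graph. A self-contained sketch (the snapshots here are fabricated; a real `graph.stream(...)` yields them live as nodes finish):

```typescript
type State = { messages: string[] };

// Fake stream that yields full-state snapshots, like streamMode: "values".
async function* fakeStream(): AsyncGenerator<State> {
  yield { messages: ["user: what's the weather in sf"] };
  yield { messages: ["user: what's the weather in sf", "ai: (tool call)"] };
}

const lengths: number[] = [];
for await (const chunk of fakeStream()) {
  lengths.push(chunk.messages.length); // each chunk is a complete snapshot
}
```

Each successive chunk is strictly larger than the last, which is why the output above repeats the earlier messages in every snapshot.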