How to view and update past graph state¶
Once you start checkpointing your graphs, you can easily get or update the state of the agent at any point in time. This permits a few things:
- You can surface a state during an interrupt to a user to let them accept an action.
- You can rewind the graph to reproduce or avoid issues.
- You can modify the state to embed your agent into a larger system, or to let the user better control its actions.
The key methods used for this functionality are:
- getState: fetch the values from the target config
- updateState: apply the given values to the target state
Note: this requires passing in a checkpointer.
This works for StateGraph and all its subclasses, such as MessageGraph.
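Before diving into the real API, the semantics of these two methods can be sketched with a toy in-memory checkpointer. This is a hypothetical illustration only (the `saveCheckpoint`/`getState`/`updateState` functions below are stand-ins, not the LangGraph API): each thread id maps to a list of snapshots, `getState` returns the latest one, and `updateState` appends a new snapshot derived from it rather than mutating history.

```typescript
// Hypothetical sketch of checkpoint semantics (NOT the real LangGraph API).
type State = { messages: string[] };

const checkpoints = new Map<string, State[]>();

function saveCheckpoint(threadId: string, state: State): void {
  const history = checkpoints.get(threadId) ?? [];
  history.push(state);
  checkpoints.set(threadId, history);
}

function getState(threadId: string): State | undefined {
  const history = checkpoints.get(threadId);
  return history?.[history.length - 1];
}

function updateState(threadId: string, update: Partial<State>): void {
  const current = getState(threadId) ?? { messages: [] };
  // Apply the update with the same reducer the channel would use
  // (here: concatenating message lists), producing a NEW checkpoint.
  saveCheckpoint(threadId, {
    messages: current.messages.concat(update.messages ?? []),
  });
}

saveCheckpoint("thread-1", { messages: ["Hi I'm Jo."] });
updateState("thread-1", { messages: ["Hello, Jo!"] });
console.log(getState("thread-1")?.messages.length); // 2
```

The key point this models: updating state never overwrites a past checkpoint; it layers a new snapshot on top, which is what makes rewinding and branching possible later in this guide.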
Below is an example.
Note
In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent({ llm, tools, checkpointSaver }) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.
Setup¶
This guide will use OpenAI's GPT-4o model. We will optionally set our API key for LangSmith tracing, which will give us best-in-class observability.
// process.env.OPENAI_API_KEY = "sk_...";
// Optional, add tracing in LangSmith
// process.env.LANGCHAIN_API_KEY = "ls__...";
process.env.LANGCHAIN_CALLBACKS_BACKGROUND = "true";
process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_PROJECT = "Time Travel: LangGraphJS";
Define the state¶
The state is the interface for all of the nodes in our graph.
import { Annotation } from "@langchain/langgraph";
import { BaseMessage } from "@langchain/core/messages";
const StateAnnotation = Annotation.Root({
messages: Annotation<BaseMessage[]>({
reducer: (x, y) => x.concat(y),
}),
});
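The `reducer` above controls how a node's return value is folded into the `messages` channel: new messages are appended rather than replacing the list. A standalone sketch of that behavior (plain strings stand in for `BaseMessage` instances):

```typescript
// The messages channel's reducer concatenates updates instead of
// overwriting the channel. Strings stand in for BaseMessages here.
const reducer = (x: string[], y: string[]) => x.concat(y);

let messages: string[] = [];
messages = reducer(messages, ["user: Hi I'm Jo."]); // first node returns a message
messages = reducer(messages, ["ai: Hello, Jo!"]);   // next node returns another

console.log(messages.length); // 2 - both updates accumulated
```

Because each checkpoint applies updates through this reducer, every step of the conversation is preserved in the state rather than overwritten.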
import { tool } from "@langchain/core/tools";
import { z } from "zod";
const searchTool = tool(async (_) => {
// This is a placeholder for the actual implementation
return "Cold, with a low of 13 ℃";
}, {
name: "search",
description:
"Use to surf the web, fetch current information, check the weather, and retrieve other information.",
schema: z.object({
query: z.string().describe("The query to use in your search."),
}),
});
await searchTool.invoke({ query: "What's the weather like?" });
const tools = [searchTool];
We can now wrap these tools in a simple ToolNode. This object will actually run the tools (functions) whenever our LLM calls them.
import { ToolNode } from "@langchain/langgraph/prebuilt";
const toolNode = new ToolNode(tools);
import { ChatOpenAI } from "@langchain/openai";
const model = new ChatOpenAI({ model: "gpt-4o" });
After you've done this, we should make sure the model knows that it has these tools available to call. We can do this by calling bindTools.
const boundModel = model.bindTools(tools);
Define the graph¶
We can now put it all together. Time travel requires a checkpointer to save the state - otherwise you wouldn't have anything to get or update. We will use the MemorySaver, which "saves" checkpoints in-memory.
import { END, START, StateGraph } from "@langchain/langgraph";
import { AIMessage } from "@langchain/core/messages";
import { RunnableConfig } from "@langchain/core/runnables";
import { MemorySaver } from "@langchain/langgraph";
const routeMessage = (state: typeof StateAnnotation.State) => {
const { messages } = state;
const lastMessage = messages[messages.length - 1] as AIMessage;
// If no tools are called, we can finish (respond to the user)
if (!lastMessage?.tool_calls?.length) {
return END;
}
// Otherwise if there is, we continue and call the tools
return "tools";
};
const callModel = async (
state: typeof StateAnnotation.State,
config?: RunnableConfig,
) => {
const { messages } = state;
const response = await boundModel.invoke(messages, config);
return { messages: [response] };
};
const workflow = new StateGraph(StateAnnotation)
.addNode("agent", callModel)
.addNode("tools", toolNode)
.addEdge(START, "agent")
.addConditionalEdges("agent", routeMessage)
.addEdge("tools", "agent");
// Here we only save in-memory
let memory = new MemorySaver();
const graph = workflow.compile({ checkpointer: memory });
Interacting with the Agent¶
We can now interact with the agent. Between interactions you can get and update state.
let config = { configurable: { thread_id: "conversation-num-1" } };
let inputs = { messages: [{ role: "user", content: "Hi I'm Jo." }] };
for await (
const { messages } of await graph.stream(inputs, {
...config,
streamMode: "values",
})
) {
let msg = messages[messages?.length - 1];
if (msg?.content) {
console.log(msg.content);
} else if (msg?.tool_calls?.length > 0) {
console.log(msg.tool_calls);
} else {
console.log(msg);
}
console.log("-----\n");
}
Hi I'm Jo.
-----
Hello, Jo! How can I assist you today?
-----
See the LangSmith example run here: https://smith.langchain.com/public/b3feb09b-bcd2-4ad5-ad1d-414106148448/r
Here you can see the "agent" node ran, and then our edge returned __end__, so the graph stopped execution there.
Let's check the current graph state.
let checkpoint = await graph.getState(config);
checkpoint.values;
{ messages: [ { role: 'user', content: "Hi I'm Jo." }, AIMessage { "id": "chatcmpl-A3FGf3k3QQo9q0QjT6Oc5h1XplkHr", "content": "Hello, Jo! How can I assist you today?", "additional_kwargs": {}, "response_metadata": { "tokenUsage": { "completionTokens": 12, "promptTokens": 68, "totalTokens": 80 }, "finish_reason": "stop", "system_fingerprint": "fp_fde2829a40" }, "tool_calls": [], "invalid_tool_calls": [] } ] }
The current state is the two messages we've seen above: 1. the HumanMessage we sent in, 2. the AIMessage we received from the model.
The next value is empty since the graph has terminated (transitioned to __end__).
checkpoint.next;
[]
Let's get it to execute a tool¶
When we call the graph again, it will create a checkpoint after each internal execution step. Let's get it to run a tool, then look at the checkpoint.
inputs = { messages: [{ role: "user", content: "What's the weather like in SF currently?" }] };
for await (
const { messages } of await graph.stream(inputs, {
...config,
streamMode: "values",
})
) {
let msg = messages[messages?.length - 1];
if (msg?.content) {
console.log(msg.content);
} else if (msg?.tool_calls?.length > 0) {
console.log(msg.tool_calls);
} else {
console.log(msg);
}
console.log("-----\n");
}
What's the weather like in SF currently?
-----
[ { name: 'search', args: { query: 'current weather in San Francisco' }, type: 'tool_call', id: 'call_ZtmtDOyEXDCnXDgowlit5dSd' } ]
-----
Cold, with a low of 13 ℃
-----
The current weather in San Francisco is cold, with a low of 13°C.
-----
See the trace of the above execution here: https://smith.langchain.com/public/0ef426fd-0da1-4c02-a50b-64ae1e68338e/r
We can see it planned the tool execution (the "agent" node), then our routeMessage edge returned "tools", so we proceeded to the "tools" node, which executed the tool. The "agent" node then emitted the final response, which made routeMessage return END. Let's see how we can have more control over this behavior.
Pause before tools¶
If you notice below, we now will add interruptBefore: ["tools"] - this means that before any tools are run, we pause. This is a great moment to allow the user to correct and update the state! This is very useful when you want to have a human-in-the-loop to validate (and potentially change) the action to take.
memory = new MemorySaver();
const graphWithInterrupt = workflow.compile({
checkpointer: memory,
interruptBefore: ["tools"],
});
inputs = { messages: [{ role: "user", content: "What's the weather like in SF currently?" }] };
for await (
const { messages } of await graphWithInterrupt.stream(inputs, {
...config,
streamMode: "values",
})
) {
let msg = messages[messages?.length - 1];
if (msg?.content) {
console.log(msg.content);
} else if (msg?.tool_calls?.length > 0) {
console.log(msg.tool_calls);
} else {
console.log(msg);
}
console.log("-----\n");
}
What's the weather like in SF currently?
-----
[ { name: 'search', args: { query: 'current weather in San Francisco' }, type: 'tool_call', id: 'call_OsKnTv2psf879eeJ9vx5GeoY' } ]
-----
Get State¶
You can fetch the latest graph checkpoint using getState(config).
let snapshot = await graphWithInterrupt.getState(config);
snapshot.next;
[ 'tools' ]
Resume¶
You can resume by running the graph with a null input. The checkpoint is loaded, and with no new input it will execute as if no interrupt had occurred.
for await (
const { messages } of await graphWithInterrupt.stream(null, {
...snapshot.config,
streamMode: "values",
})
) {
let msg = messages[messages?.length - 1];
if (msg?.content) {
console.log(msg.content);
} else if (msg?.tool_calls?.length > 0) {
console.log(msg.tool_calls);
} else {
console.log(msg);
}
console.log("-----\n");
}
Cold, with a low of 13 ℃
-----
Currently, it is cold in San Francisco, with a temperature around 13°C (55°F).
-----
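Why does passing `null` pick up right where the graph left off? Because the checkpoint already records which node runs next. The mechanism can be sketched with a toy interruptible runner (a hypothetical model only; `run`, `Checkpoint`, and `INTERRUPT_BEFORE` below are illustrative names, not the LangGraph API):

```typescript
// Hypothetical sketch of resume semantics (NOT the real LangGraph API):
// the checkpoint stores the index of the next step, so a null-input run
// simply continues from that position instead of starting over.
type Checkpoint = { nextStep: number; log: string[] };

const steps: Array<(log: string[]) => void> = [
  (log) => log.push("agent: plan tool call"),
  (log) => log.push("tools: run search"),
  (log) => log.push("agent: final answer"),
];

const INTERRUPT_BEFORE = 1; // pause before the "tools" step

function run(input: string | null, saved?: Checkpoint): Checkpoint {
  // New input starts a fresh run; null input resumes from the checkpoint.
  const cp: Checkpoint = input !== null
    ? { nextStep: 0, log: [input] }
    : { nextStep: saved!.nextStep, log: [...saved!.log] };
  const resuming = input === null;
  for (let i = cp.nextStep; i < steps.length; i++) {
    // A fresh run stops at the interrupt; a resume continues past it.
    if (!resuming && i === INTERRUPT_BEFORE) {
      cp.nextStep = i; // checkpoint with the interrupted step still pending
      return cp;
    }
    steps[i](cp.log);
    cp.nextStep = i + 1;
  }
  return cp;
}

const paused = run("user: weather in SF?");        // stops before "tools"
const finished = run(null, paused);                // resumes seamlessly
console.log(paused.nextStep, finished.log.length); // 1 4
```

In the real graph, `snapshot.config` plays the role of `saved` here: it identifies the stored checkpoint, and the `null` input tells the runtime to continue from it rather than begin a new turn.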
Check full history¶
Let's browse the history of this thread, from newest to oldest.
let toReplay;
const states = await graphWithInterrupt.getStateHistory(config);
for await (const state of states) {
console.log(state);
console.log("--");
if (state.values?.messages?.length === 2) {
toReplay = state;
}
}
if (!toReplay) {
throw new Error("No state to replay");
}
{ values: { messages: [ [Object], AIMessage { "id": "chatcmpl-A3FGhKzOZs0GYZ2yalNOCQZyPgbcp", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_OsKnTv2psf879eeJ9vx5GeoY", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 72, "totalTokens": 89 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_fde2829a40" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_OsKnTv2psf879eeJ9vx5GeoY" } ], "invalid_tool_calls": [] }, ToolMessage { "content": "Cold, with a low of 13 ℃", "name": "search", "additional_kwargs": {}, "response_metadata": {}, "tool_call_id": "call_OsKnTv2psf879eeJ9vx5GeoY" }, AIMessage { "id": "chatcmpl-A3FGiYripPKtQLnAK1H3hWLSXQfOD", "content": "Currently, it is cold in San Francisco, with a temperature around 13°C (55°F).", "additional_kwargs": {}, "response_metadata": { "tokenUsage": { "completionTokens": 21, "promptTokens": 105, "totalTokens": 126 }, "finish_reason": "stop", "system_fingerprint": "fp_fde2829a40" }, "tool_calls": [], "invalid_tool_calls": [] } ] }, next: [], tasks: [], metadata: { source: 'loop', writes: { agent: [Object] }, step: 3 }, config: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-9c3a-6bd1-8003-d7f030ff72b2' } }, createdAt: '2024-09-03T04:17:20.653Z', parentConfig: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-9516-6200-8002-43d2c6dc603f' } } } -- { values: { messages: [ [Object], AIMessage { "id": "chatcmpl-A3FGhKzOZs0GYZ2yalNOCQZyPgbcp", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_OsKnTv2psf879eeJ9vx5GeoY", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 72, "totalTokens": 89 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_fde2829a40" }, 
"tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_OsKnTv2psf879eeJ9vx5GeoY" } ], "invalid_tool_calls": [] }, ToolMessage { "content": "Cold, with a low of 13 ℃", "name": "search", "additional_kwargs": {}, "response_metadata": {}, "tool_call_id": "call_OsKnTv2psf879eeJ9vx5GeoY" } ] }, next: [ 'agent' ], tasks: [ { id: '612efffa-3b16-530f-8a39-fd01c31e7b8b', name: 'agent', interrupts: [] } ], metadata: { source: 'loop', writes: { tools: [Object] }, step: 2 }, config: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-9516-6200-8002-43d2c6dc603f' } }, createdAt: '2024-09-03T04:17:19.904Z', parentConfig: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-9455-6410-8001-1c78a97f63e6' } } } -- { values: { messages: [ [Object], AIMessage { "id": "chatcmpl-A3FGhKzOZs0GYZ2yalNOCQZyPgbcp", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_OsKnTv2psf879eeJ9vx5GeoY", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 72, "totalTokens": 89 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_fde2829a40" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_OsKnTv2psf879eeJ9vx5GeoY" } ], "invalid_tool_calls": [] } ] }, next: [ 'tools' ], tasks: [ { id: '767116b0-55b6-5af4-8f74-ce45fb6e31ed', name: 'tools', interrupts: [] } ], metadata: { source: 'loop', writes: { agent: [Object] }, step: 1 }, config: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-9455-6410-8001-1c78a97f63e6' } }, createdAt: '2024-09-03T04:17:19.825Z', parentConfig: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-8c4b-6261-8000-c51e5807fbcd' } } } -- { values: { messages: [ [Object] ] 
}, next: [ 'agent' ], tasks: [ { id: '5b0ed7d1-1bb7-5d78-b4fc-7a8ed40e7291', name: 'agent', interrupts: [] } ], metadata: { source: 'loop', writes: null, step: 0 }, config: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-8c4b-6261-8000-c51e5807fbcd' } }, createdAt: '2024-09-03T04:17:18.982Z', parentConfig: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-8c4b-6260-ffff-6ec582916c42' } } } -- { values: {}, next: [ '__start__' ], tasks: [ { id: 'a4250d5c-d025-5da1-b588-cae2b3f4a8c7', name: '__start__', interrupts: [] } ], metadata: { source: 'input', writes: { messages: [Array] }, step: -1 }, config: { configurable: { thread_id: 'conversation-num-1', checkpoint_ns: '', checkpoint_id: '1ef69ab6-8c4b-6260-ffff-6ec582916c42' } }, createdAt: '2024-09-03T04:17:18.982Z', parentConfig: undefined } --
Replay a past state¶
To replay from this place we just need to pass its config back to the agent.
for await (
const { messages } of await graphWithInterrupt.stream(null, {
...toReplay.config,
streamMode: "values",
})
) {
let msg = messages[messages?.length - 1];
if (msg?.content) {
console.log(msg.content);
} else if (msg?.tool_calls?.length > 0) {
console.log(msg.tool_calls);
} else {
console.log(msg);
}
console.log("-----\n");
}
Cold, with a low of 13 ℃
-----
The current weather in San Francisco is cold, with a low of 13°C.
-----
Branch off a past state¶
Using the graph's checkpoints, you can also branch off a past state. To do this, first update a previous checkpoint: updating the state creates a new snapshot by applying the update to the chosen checkpoint.
const tool_calls =
toReplay.values.messages[toReplay.values.messages.length - 1].tool_calls;
const branchConfig = await graphWithInterrupt.updateState(
toReplay.config,
{
messages: [
{ role: "tool", content: "It's sunny out, with a high of 38 ℃.", tool_call_id: tool_calls[0].id },
],
},
// Updates are applied "as if" they were coming from a node. By default,
// the updates will come from the last node to run. In our case, we want to treat
// this update as if it came from the tools node, so that the next node to run will be
// the agent.
"tools",
);
const branchState = await graphWithInterrupt.getState(branchConfig);
console.log(branchState.values);
console.log(branchState.next);
{ messages: [ { role: 'user', content: "What's the weather like in SF currently?" }, AIMessage { "id": "chatcmpl-A3FGhKzOZs0GYZ2yalNOCQZyPgbcp", "content": "", "additional_kwargs": { "tool_calls": [ { "id": "call_OsKnTv2psf879eeJ9vx5GeoY", "type": "function", "function": "[Object]" } ] }, "response_metadata": { "tokenUsage": { "completionTokens": 17, "promptTokens": 72, "totalTokens": 89 }, "finish_reason": "tool_calls", "system_fingerprint": "fp_fde2829a40" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in San Francisco" }, "type": "tool_call", "id": "call_OsKnTv2psf879eeJ9vx5GeoY" } ], "invalid_tool_calls": [] }, { role: 'tool', content: "It's sunny out, with a high of 38 ℃.", tool_call_id: 'call_OsKnTv2psf879eeJ9vx5GeoY' } ] } [ 'agent' ]
Now, you can run from this branch¶
Just use the updated config (containing the new checkpoint ID). The trajectory will follow the new branch.
for await (
const { messages } of await graphWithInterrupt.stream(null, {
...branchConfig,
streamMode: "values",
})
) {
let msg = messages[messages?.length - 1];
if (msg?.content) {
console.log(msg.content);
} else if (msg?.tool_calls?.length > 0) {
console.log(msg.tool_calls);
} else {
console.log(msg);
}
console.log("-----\n");
}
The current weather in San Francisco is sunny, with a high of 38°C.
-----