
How to edit graph state

Human-in-the-loop (HIL) interactions are crucial for agentic systems. Manually updating the graph state is a common HIL interaction pattern, allowing the human to edit an action (e.g., what tool is being called or how it is being called).

We can implement this in LangGraph using a breakpoint: breakpoints allow us to interrupt graph execution before a specific step. At this breakpoint, we can manually update the graph state and then resume from that point to continue.
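
Concretely, the pattern looks like this. This is only a minimal sketch with placeholder names (builder, someNode, someKey, and inputs are hypothetical); the full runnable versions follow below and use stream instead of invoke.

import { MemorySaver } from "@langchain/langgraph";

// 1) Compile the graph with a checkpointer and a breakpoint before a node
const graph = builder.compile({
  checkpointer: new MemorySaver(),
  interruptBefore: ["someNode"], // hypothetical node name
});

// 2) Run until the graph pauses right before "someNode"
const config = { configurable: { thread_id: "1" } };
await graph.invoke(inputs, config);

// 3) Manually edit the saved state...
await graph.updateState(config, { someKey: "new value" }); // hypothetical state key

// 4) ...then resume from the breakpoint by passing null as the input
await graph.invoke(null, config);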


Setup

First, we need to install the required packages:

npm install @langchain/langgraph @langchain/anthropic @langchain/core zod

Next, we need to set an API key for Anthropic (the LLM we will use):

export ANTHROPIC_API_KEY=your-api-key

Optionally, we can set an API key for LangSmith tracing, which will give us best-in-class observability.

export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_CALLBACKS_BACKGROUND="true"
export LANGCHAIN_API_KEY=your-api-key
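
If you are running this in a notebook rather than a shell, you can also set these variables programmatically. This is just a sketch of that alternative; replace the placeholder values with your own keys.

// Hypothetical programmatic alternative to the shell exports above
process.env.ANTHROPIC_API_KEY = "your-api-key";
process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_CALLBACKS_BACKGROUND = "true";
process.env.LANGCHAIN_API_KEY = "your-api-key";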

Simple Usage

Let's look at very basic usage of this.

Below, we do three things:

1) We specify a breakpoint using interruptBefore on the specified step (node).

2) We set up a checkpointer to save the state of the graph up until this node.

3) We use .updateState to update the state of the graph.

import { StateGraph, START, END, Annotation } from "@langchain/langgraph";
import { MemorySaver } from "@langchain/langgraph";

const GraphState = Annotation.Root({
  input: Annotation<string>
});

const step1 = (state: typeof GraphState.State) => {
  console.log("---Step 1---");
  return state;
}

const step2 = (state: typeof GraphState.State) => {
  console.log("---Step 2---");
  return state;
}

const step3 = (state: typeof GraphState.State) => {
  console.log("---Step 3---");
  return state;
}


const builder = new StateGraph(GraphState)
  .addNode("step1", step1)
  .addNode("step2", step2)
  .addNode("step3", step3)
  .addEdge(START, "step1")
  .addEdge("step1", "step2")
  .addEdge("step2", "step3")
  .addEdge("step3", END);


// Set up memory
const graphStateMemory = new MemorySaver()

const graph = builder.compile({
  checkpointer: graphStateMemory,
  interruptBefore: ["step2"]
});
import * as tslab from "tslab";

const drawableGraphGraphState = graph.getGraph();
const graphStateImage = await drawableGraphGraphState.drawMermaidPng();
const graphStateArrayBuffer = await graphStateImage.arrayBuffer();

await tslab.display.png(new Uint8Array(graphStateArrayBuffer));

// Input
const initialInput = { input: "hello world" };

// Thread
const graphStateConfig = { configurable: { thread_id: "1" }, streamMode: "values" as const };

// Run the graph until the first interruption
for await (const event of await graph.stream(initialInput, graphStateConfig)) {
    console.log(`--- ${event.input} ---`);
}

// Will log when the graph is interrupted, after step 1 (i.e., before step 2).
console.log("--- GRAPH INTERRUPTED ---");
--- hello world ---
---Step 1---
--- hello world ---
--- GRAPH INTERRUPTED ---
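
If you want to confirm exactly where the graph is paused, the state snapshot returned by getState also exposes a next field listing the node(s) that will run on resume; here it should be ["step2"]. A minimal sketch (output not shown):

// Optional sanity check: which node(s) will execute when we resume?
const snapshot = await graph.getState(graphStateConfig);
console.log(snapshot.next); // expected: [ 'step2' ]
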
Now, we can manually update our graph state:

console.log("Current state!")
const currState = await graph.getState(graphStateConfig);
console.log(currState.values)

await graph.updateState(graphStateConfig, { input: "hello universe!" })

console.log("---\n---\nUpdated state!")
const updatedState = await graph.getState(graphStateConfig);
console.log(updatedState.values)
Current state!
{ input: 'hello world' }
---
---
Updated state!
{ input: 'hello universe!' }

// Continue the graph execution
for await (const event of await graph.stream(null, graphStateConfig)) {
    console.log(`--- ${event.input} ---`);
}
---Step 2---
--- hello universe! ---
---Step 3---
--- hello universe! ---

Agent

In the context of agents, updating state is useful for things like editing tool calls.

To show this, we will build a relatively simple ReAct-style agent that does tool calling.

We will use Anthropic's models and a fake tool (just for demonstration purposes).

// Set up the tool
import { ChatAnthropic } from "@langchain/anthropic";
import { tool } from "@langchain/core/tools";
import { StateGraph, START, END, MessagesAnnotation } from "@langchain/langgraph";
import { MemorySaver } from "@langchain/langgraph";
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { AIMessage } from "@langchain/core/messages";
import { z } from "zod";

const search = tool((_) => {
  return "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.";
}, {
  name: "search",
  description: "Call to surf the web.",
  schema: z.string(),
})

const tools = [search]
const toolNode = new ToolNode(tools)

// Set up the model
const model = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" })
const modelWithTools = model.bindTools(tools)


// Define nodes and conditional edges

// Define the function that determines whether to continue or not
function shouldContinue(state: typeof MessagesAnnotation.State): "action" | typeof END {
  const lastMessage = state.messages[state.messages.length - 1];
  // If there is no function call, then we finish
  if (lastMessage && !(lastMessage as AIMessage).tool_calls?.length) {
      return END;
  }
  // Otherwise if there is, we continue
  return "action";
}

// Define the function that calls the model
async function callModel(state: typeof MessagesAnnotation.State): Promise<Partial<typeof MessagesAnnotation.State>> {
  const messages = state.messages;
  const response = await modelWithTools.invoke(messages);
  // We return an object with a messages property, because this will get added to the existing list
  return { messages: [response] };
}

// Define a new graph
const workflow = new StateGraph(MessagesAnnotation)
  // Define the two nodes we will cycle between
  .addNode("agent", callModel)
  .addNode("action", toolNode)
  // We now add a conditional edge
  .addConditionalEdges(
      // First, we define the start node. We use `agent`.
      // This means these are the edges taken after the `agent` node is called.
      "agent",
      // Next, we pass in the function that will determine which node is called next.
      shouldContinue
  )
  // We now add a normal edge from `action` to `agent`.
  // This means that after `action` is called, `agent` node is called next.
  .addEdge("action", "agent")
  // Set the entrypoint as `agent`
  // This means that this node is the first one called
  .addEdge(START, "agent");

// Setup memory
const memory = new MemorySaver();

// Finally, we compile it!
// This compiles it into a LangChain Runnable,
// meaning you can use it as you would any other runnable
const app = workflow.compile({
  checkpointer: memory,
  interruptBefore: ["action"]
});
import * as tslab from "tslab";

const drawableGraph = app.getGraph();
const image = await drawableGraph.drawMermaidPng();
const arrayBuffer = await image.arrayBuffer();

await tslab.display.png(new Uint8Array(arrayBuffer));

Interacting with the Agent

We can now interact with the agent and see that it stops before calling a tool.

// Thread
const config = { configurable: { thread_id: "3" }, streamMode: "values" as const };

for await (const event of await app.stream({
    messages: [{ role: "human", content: "search for the weather in sf now" }]
}, config)) {
    const recentMsg = event.messages[event.messages.length - 1];
    console.log(`================================ ${recentMsg._getType()} Message (1) =================================`)
    console.log(recentMsg.content);
}
================================ human Message (1) =================================
search for the weather in sf now
================================ ai Message (1) =================================
[
  {
    type: 'text',
    text: 'Certainly! I can help you search for the current weather in San Francisco. Let me use the search function to find that information for you.'
  },
  {
    type: 'tool_use',
    id: 'toolu_0141zTpknasyWkrjTV6eKeT6',
    name: 'search',
    input: { input: 'current weather in San Francisco' }
  }
]
Edit

We can now update the state accordingly. Let's modify the tool call so that its query is "current weather in SF".

// First, lets get the current state
const currentState = await app.getState(config);

// Let's now get the last message in the state
// This is the one with the tool calls that we want to update
let lastMessage = currentState.values.messages[currentState.values.messages.length - 1]

// Let's now update the args for that tool call
lastMessage.tool_calls[0].args = { query: "current weather in SF" }

// Let's now call `updateState` to pass in this message in the `messages` key
// This will get treated as any other update to the state
// It will get passed to the reducer function for the `messages` key
// That reducer function will use the ID of the message to update it
// It's important that it has the right ID! Otherwise it would get appended
// as a new message
await app.updateState(config, { messages: lastMessage });
{
  configurable: {
    thread_id: '3',
    checkpoint_id: '1ef5e785-4298-6b71-8002-4a6ceca964db'
  }
}
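
Equivalently, instead of mutating the message pulled from state, you could construct a fresh message that reuses the original message's ID; the reducer for the messages key will then replace the existing entry rather than append a new one. The following is only a sketch of that alternative and is not run in this guide.

import { AIMessage } from "@langchain/core/messages";

// Hypothetical alternative to the in-place mutation above:
// keeping the same `id` is what makes the reducer replace instead of append.
const replacement = new AIMessage({
  id: lastMessage.id,
  content: lastMessage.content,
  tool_calls: [
    {
      id: lastMessage.tool_calls[0].id,
      name: "search",
      args: { query: "current weather in SF" },
    },
  ],
});

await app.updateState(config, { messages: [replacement] });
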
Let's now check the current state of the app to make sure it was updated accordingly:

const newState = await app.getState(config);
const updatedStateToolCalls = newState.values.messages[newState.values.messages.length -1 ].tool_calls
console.log(updatedStateToolCalls)
[
  {
    name: 'search',
    args: { query: 'current weather in SF' },
    id: 'toolu_0141zTpknasyWkrjTV6eKeT6',
    type: 'tool_call'
  }
]
Resume

We can now call the agent again with no inputs to continue, i.e. run the tool as requested. We can see from the logs that it passes the updated args to the tool.

for await (const event of await app.stream(null, config)) {
    console.log(event)
    const recentMsg = event.messages[event.messages.length - 1];
    console.log(`================================ ${recentMsg._getType()} Message (1) =================================`)
    if (recentMsg._getType() === "tool") {
        console.log({
            name: recentMsg.name,
            content: recentMsg.content
        })
    } else if (recentMsg._getType() === "ai") {
        console.log(recentMsg.content)
    }
}
{
  messages: [
    HumanMessage {
      "id": "7c69c1f3-914b-4236-b2ca-ef250e72cb7a",
      "content": "search for the weather in sf now",
      "additional_kwargs": {},
      "response_metadata": {}
    },
    AIMessage {
      "id": "msg_0152mx7AweoRWa67HFsfyaif",
      "content": [
        {
          "type": "text",
          "text": "Certainly! I can help you search for the current weather in San Francisco. Let me use the search function to find that information for you."
        },
        {
          "type": "tool_use",
          "id": "toolu_0141zTpknasyWkrjTV6eKeT6",
          "name": "search",
          "input": {
            "input": "current weather in San Francisco"
          }
        }
      ],
      "additional_kwargs": {
        "id": "msg_0152mx7AweoRWa67HFsfyaif",
        "type": "message",
        "role": "assistant",
        "model": "claude-3-5-sonnet-20240620",
        "stop_reason": "tool_use",
        "stop_sequence": null,
        "usage": {
          "input_tokens": 380,
          "output_tokens": 84
        }
      },
      "response_metadata": {
        "id": "msg_0152mx7AweoRWa67HFsfyaif",
        "model": "claude-3-5-sonnet-20240620",
        "stop_reason": "tool_use",
        "stop_sequence": null,
        "usage": {
          "input_tokens": 380,
          "output_tokens": 84
        },
        "type": "message",
        "role": "assistant"
      },
      "tool_calls": [
        {
          "name": "search",
          "args": {
            "query": "current weather in SF"
          },
          "id": "toolu_0141zTpknasyWkrjTV6eKeT6",
          "type": "tool_call"
        }
      ],
      "invalid_tool_calls": []
    },
    ToolMessage {
      "id": "ccf0d56f-477f-408a-b809-6900a48379e0",
      "content": "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.",
      "name": "search",
      "additional_kwargs": {},
      "response_metadata": {},
      "tool_call_id": "toolu_0141zTpknasyWkrjTV6eKeT6"
    }
  ]
}
================================ tool Message (1) =================================
{
  name: 'search',
  content: "It's sunny in San Francisco, but you better look out if you're a Gemini 😈."
}
{
  messages: [
    HumanMessage {
      "id": "7c69c1f3-914b-4236-b2ca-ef250e72cb7a",
      "content": "search for the weather in sf now",
      "additional_kwargs": {},
      "response_metadata": {}
    },
    AIMessage {
      "id": "msg_0152mx7AweoRWa67HFsfyaif",
      "content": [
        {
          "type": "text",
          "text": "Certainly! I can help you search for the current weather in San Francisco. Let me use the search function to find that information for you."
        },
        {
          "type": "tool_use",
          "id": "toolu_0141zTpknasyWkrjTV6eKeT6",
          "name": "search",
          "input": {
            "input": "current weather in San Francisco"
          }
        }
      ],
      "additional_kwargs": {
        "id": "msg_0152mx7AweoRWa67HFsfyaif",
        "type": "message",
        "role": "assistant",
        "model": "claude-3-5-sonnet-20240620",
        "stop_reason": "tool_use",
        "stop_sequence": null,
        "usage": {
          "input_tokens": 380,
          "output_tokens": 84
        }
      },
      "response_metadata": {
        "id": "msg_0152mx7AweoRWa67HFsfyaif",
        "model": "claude-3-5-sonnet-20240620",
        "stop_reason": "tool_use",
        "stop_sequence": null,
        "usage": {
          "input_tokens": 380,
          "output_tokens": 84
        },
        "type": "message",
        "role": "assistant"
      },
      "tool_calls": [
        {
          "name": "search",
          "args": {
            "query": "current weather in SF"
          },
          "id": "toolu_0141zTpknasyWkrjTV6eKeT6",
          "type": "tool_call"
        }
      ],
      "invalid_tool_calls": []
    },
    ToolMessage {
      "id": "ccf0d56f-477f-408a-b809-6900a48379e0",
      "content": "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.",
      "name": "search",
      "additional_kwargs": {},
      "response_metadata": {},
      "tool_call_id": "toolu_0141zTpknasyWkrjTV6eKeT6"
    },
    AIMessage {
      "id": "msg_01YJXesUpaB5PfhgmRBCwnnb",
      "content": "Based on the search results, I can provide you with information about the current weather in San Francisco:\n\nThe weather in San Francisco is currently sunny. This means it's a clear day with plenty of sunshine. It's a great day to be outdoors or engage in activities that benefit from good weather.\n\nHowever, I should note that the search result included an unusual comment about Gemini zodiac signs. This appears to be unrelated to the weather and might be part of a joke or a reference to something else. For accurate and detailed weather information, I would recommend checking a reliable weather service or website for San Francisco.\n\nIs there anything else you'd like to know about the weather in San Francisco or any other information you need?",
      "additional_kwargs": {
        "id": "msg_01YJXesUpaB5PfhgmRBCwnnb",
        "type": "message",
        "role": "assistant",
        "model": "claude-3-5-sonnet-20240620",
        "stop_reason": "end_turn",
        "stop_sequence": null,
        "usage": {
          "input_tokens": 498,
          "output_tokens": 154
        }
      },
      "response_metadata": {
        "id": "msg_01YJXesUpaB5PfhgmRBCwnnb",
        "model": "claude-3-5-sonnet-20240620",
        "stop_reason": "end_turn",
        "stop_sequence": null,
        "usage": {
          "input_tokens": 498,
          "output_tokens": 154
        },
        "type": "message",
        "role": "assistant"
      },
      "tool_calls": [],
      "invalid_tool_calls": [],
      "usage_metadata": {
        "input_tokens": 498,
        "output_tokens": 154,
        "total_tokens": 652
      }
    }
  ]
}
================================ ai Message (1) =================================
Based on the search results, I can provide you with information about the current weather in San Francisco:

The weather in San Francisco is currently sunny. This means it's a clear day with plenty of sunshine. It's a great day to be outdoors or engage in activities that benefit from good weather.

However, I should note that the search result included an unusual comment about Gemini zodiac signs. This appears to be unrelated to the weather and might be part of a joke or a reference to something else. For accurate and detailed weather information, I would recommend checking a reliable weather service or website for San Francisco.

Is there anything else you'd like to know about the weather in San Francisco or any other information you need?