Setup
First, we need to install the required packages:
npm install @langchain/langgraph @langchain/anthropic @langchain/core zod
Next, we need to set an API key for Anthropic (the LLM we will use):
export ANTHROPIC_API_KEY=your-api-key
Optionally, we can set an API key for LangSmith tracing, which will give us best-in-class observability.
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_CALLBACKS_BACKGROUND="true"
export LANGCHAIN_API_KEY=your-api-key
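Before running the examples, it can help to fail fast when a required variable is missing. A small optional check, assuming a Node.js environment (the `missingEnvVars` helper below is our own, not part of any library):

```typescript
// Report which of the required environment variables are not set.
function missingEnvVars(
  required: string[],
  env: Record<string, string | undefined> = process.env,
): string[] {
  return required.filter((name) => !env[name]);
}

const missing = missingEnvVars(["ANTHROPIC_API_KEY"]);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(", ")}`);
}
```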
Simple Usage
Let's look at very basic usage of this.
Below, we do two things: we specify a breakpoint by compiling the graph with `interruptBefore` a given step, and we set up a checkpointer to save the state of the graph up to that point.
In [1]:
import { StateGraph, START, END, Annotation } from "@langchain/langgraph";
import { MemorySaver } from "@langchain/langgraph";
const GraphState = Annotation.Root({
input: Annotation<string>
});
const step1 = (state: typeof GraphState.State) => {
console.log("---Step 1---");
return state;
}
const step2 = (state: typeof GraphState.State) => {
console.log("---Step 2---");
return state;
}
const step3 = (state: typeof GraphState.State) => {
console.log("---Step 3---");
return state;
}
const builder = new StateGraph(GraphState)
.addNode("step1", step1)
.addNode("step2", step2)
.addNode("step3", step3)
.addEdge(START, "step1")
.addEdge("step1", "step2")
.addEdge("step2", "step3")
.addEdge("step3", END);
// Set up memory
const graphStateMemory = new MemorySaver()
const graph = builder.compile({
checkpointer: graphStateMemory,
interruptBefore: ["step2"]
});
In [2]:
import * as tslab from "tslab";
const drawableGraphGraphState = graph.getGraph();
const graphStateImage = await drawableGraphGraphState.drawMermaidPng();
const graphStateArrayBuffer = await graphStateImage.arrayBuffer();
await tslab.display.png(new Uint8Array(graphStateArrayBuffer));
In [3]:
// Input
const initialInput = { input: "hello world" };
// Thread
const graphStateConfig = { configurable: { thread_id: "1" }, streamMode: "values" as const };
// Run the graph until the first interruption
for await (const event of await graph.stream(initialInput, graphStateConfig)) {
console.log(`--- ${event.input} ---`);
}
// Will log when the graph is interrupted, before step 2 runs.
console.log("--- GRAPH INTERRUPTED ---");
--- hello world --- ---Step 1--- --- hello world --- --- GRAPH INTERRUPTED ---
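Conceptually, compiling with `interruptBefore` pauses the run just before the named node, and the checkpointer persists state so the same thread can resume later. A minimal, framework-free sketch of that pause/resume behavior (an illustration of the concept only, not how LangGraph is implemented):

```typescript
type State = { input: string };
type Step = (state: State) => State;

// Toy sequential runner mimicking interruptBefore: it executes steps in
// order, but stops *before* any step whose name is listed, returning the
// index it paused at so a later call can resume from there.
function runUntilInterrupt(
  steps: [string, Step][],
  state: State,
  interruptBefore: string[],
  startAt = 0,
): { state: State; pausedAt: number | null; executed: string[] } {
  const executed: string[] = [];
  for (let i = startAt; i < steps.length; i++) {
    const [name, step] = steps[i];
    // Only pause if we are not resuming exactly at this step.
    if (interruptBefore.includes(name) && i !== startAt) {
      return { state, pausedAt: i, executed };
    }
    state = step(state);
    executed.push(name);
  }
  return { state, pausedAt: null, executed };
}

const steps: [string, Step][] = [
  ["step1", (s) => s],
  ["step2", (s) => s],
  ["step3", (s) => s],
];

// First run: stops before "step2", after running "step1".
const first = runUntilInterrupt(steps, { input: "hello world" }, ["step2"]);
// While paused, the state can be inspected and edited, then the run resumed.
const resumed = runUntilInterrupt(
  steps,
  { input: "hello universe!" },
  ["step2"],
  first.pausedAt!,
);
```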
Now, we can manually update our graph state:
In [4]:
console.log("Current state!")
const currState = await graph.getState(graphStateConfig);
console.log(currState.values)
await graph.updateState(graphStateConfig, { input: "hello universe!" })
console.log("---\n---\nUpdated state!")
const updatedState = await graph.getState(graphStateConfig);
console.log(updatedState.values)
Current state! { input: 'hello world' } --- --- Updated state! { input: 'hello universe!' }
In [5]:
// Continue the graph execution
for await (const event of await graph.stream(null, graphStateConfig)) {
console.log(`--- ${event.input} ---`);
}
---Step 2--- --- hello universe! --- ---Step 3--- --- hello universe! ---
Agent
In the context of agents, updating state is useful for things like editing tool calls.
To show this, we will build a relatively simple ReAct-style agent that does tool calling.
We will use Anthropic's models and a fake tool (just for demo purposes).
In [6]:
// Set up the tool
import { ChatAnthropic } from "@langchain/anthropic";
import { tool } from "@langchain/core/tools";
import { StateGraph, START, END, MessagesAnnotation } from "@langchain/langgraph";
import { MemorySaver } from "@langchain/langgraph";
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { AIMessage } from "@langchain/core/messages";
import { z } from "zod";
const search = tool((_) => {
return "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.";
}, {
name: "search",
description: "Call to surf the web.",
schema: z.string(),
})
const tools = [search]
const toolNode = new ToolNode(tools)
// Set up the model
const model = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" })
const modelWithTools = model.bindTools(tools)
// Define nodes and conditional edges
// Define the function that determines whether to continue or not
function shouldContinue(state: typeof MessagesAnnotation.State): "action" | typeof END {
const lastMessage = state.messages[state.messages.length - 1];
// If there is no function call, then we finish
if (lastMessage && !(lastMessage as AIMessage).tool_calls?.length) {
return END;
}
// Otherwise if there is, we continue
return "action";
}
// Define the function that calls the model
async function callModel(state: typeof MessagesAnnotation.State): Promise<Partial<typeof MessagesAnnotation.State>> {
const messages = state.messages;
const response = await modelWithTools.invoke(messages);
// We return an object with a messages property, because this will get added to the existing list
return { messages: [response] };
}
// Define a new graph
const workflow = new StateGraph(MessagesAnnotation)
// Define the two nodes we will cycle between
.addNode("agent", callModel)
.addNode("action", toolNode)
// We now add a conditional edge
.addConditionalEdges(
// First, we define the start node. We use `agent`.
// This means these are the edges taken after the `agent` node is called.
"agent",
// Next, we pass in the function that will determine which node is called next.
shouldContinue
)
// We now add a normal edge from `action` to `agent`.
// This means that after `action` is called, `agent` node is called next.
.addEdge("action", "agent")
// Set the entrypoint as `agent`
// This means that this node is the first one called
.addEdge(START, "agent");
// Setup memory
const memory = new MemorySaver();
// Finally, we compile it!
// This compiles it into a LangChain Runnable,
// meaning you can use it as you would any other runnable
const app = workflow.compile({
checkpointer: memory,
interruptBefore: ["action"]
});
In [7]:
import * as tslab from "tslab";
const drawableGraph = app.getGraph();
const image = await drawableGraph.drawMermaidPng();
const arrayBuffer = await image.arrayBuffer();
await tslab.display.png(new Uint8Array(arrayBuffer));
Interacting with the Agent
We can now interact with the agent, and see that it stops before calling a tool.
In [8]:
// Thread
const config = { configurable: { thread_id: "3" }, streamMode: "values" as const };
for await (const event of await app.stream({
messages: [{ role: "human", content: "search for the weather in sf now" }]
}, config)) {
const recentMsg = event.messages[event.messages.length - 1];
console.log(`================================ ${recentMsg._getType()} Message (1) =================================`)
console.log(recentMsg.content);
}
================================ human Message (1) ================================= search for the weather in sf now ================================ ai Message (1) ================================= [ { type: 'text', text: 'Certainly! I can help you search for the current weather in San Francisco. Let me use the search function to find that information for you.' }, { type: 'tool_use', id: 'toolu_0141zTpknasyWkrjTV6eKeT6', name: 'search', input: { input: 'current weather in San Francisco' } } ]
Edit
We can now update the state accordingly. Let's modify the tool call so that it has the query "current weather in SF".
In [9]:
// First, let's get the current state
const currentState = await app.getState(config);
// Let's now get the last message in the state
// This is the one with the tool calls that we want to update
let lastMessage = currentState.values.messages[currentState.values.messages.length - 1]
// Let's now update the args for that tool call
lastMessage.tool_calls[0].args = { query: "current weather in SF" }
// Let's now call `updateState` to pass in this message in the `messages` key
// This will get treated as any other update to the state
// It will get passed to the reducer function for the `messages` key
// That reducer function will use the ID of the message to update it
// It's important that it has the right ID! Otherwise it would get appended
// as a new message
await app.updateState(config, { messages: lastMessage });
{ configurable: { thread_id: '3', checkpoint_id: '1ef5e785-4298-6b71-8002-4a6ceca964db' } }
Let's now check the current state of the app to make sure it got updated accordingly.
In [10]:
const newState = await app.getState(config);
const updatedStateToolCalls = newState.values.messages[newState.values.messages.length - 1].tool_calls
console.log(updatedStateToolCalls)
[ { name: 'search', args: { query: 'current weather in SF' }, id: 'toolu_0141zTpknasyWkrjTV6eKeT6', type: 'tool_call' } ]
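The edit above works because the reducer behind the `messages` key in MessagesAnnotation matches messages by ID: an update whose ID already exists replaces that message in place, while an unknown ID gets appended as a new message. A simplified sketch of that merge rule (a conceptual illustration, not the library's actual reducer):

```typescript
interface Msg {
  id: string;
  content: string;
}

// Simplified messages reducer: replace a message with a matching ID,
// otherwise append the update as a new message.
function mergeMessages(existing: Msg[], updates: Msg[]): Msg[] {
  const merged = [...existing];
  for (const update of updates) {
    const idx = merged.findIndex((m) => m.id === update.id);
    if (idx >= 0) {
      merged[idx] = update; // same ID: edit in place
    } else {
      merged.push(update); // new ID: appended
    }
  }
  return merged;
}

const history: Msg[] = [
  { id: "a", content: "search for the weather in sf now" },
  { id: "b", content: "tool call: current weather in San Francisco" },
];

// Reusing ID "b" edits that message; a fresh ID would append instead.
const edited = mergeMessages(history, [
  { id: "b", content: "tool call: current weather in SF" },
]);
```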
Resume
We can now call the agent again with no inputs to continue, i.e. run the tool as requested. We can see from the logs that it passes the updated args to the tool.
In [11]:
for await (const event of await app.stream(null, config)) {
console.log(event)
const recentMsg = event.messages[event.messages.length - 1];
console.log(`================================ ${recentMsg._getType()} Message (1) =================================`)
if (recentMsg._getType() === "tool") {
console.log({
name: recentMsg.name,
content: recentMsg.content
})
} else if (recentMsg._getType() === "ai") {
console.log(recentMsg.content)
}
}
{ messages: [ HumanMessage { "id": "7c69c1f3-914b-4236-b2ca-ef250e72cb7a", "content": "search for the weather in sf now", "additional_kwargs": {}, "response_metadata": {} }, AIMessage { "id": "msg_0152mx7AweoRWa67HFsfyaif", "content": [ { "type": "text", "text": "Certainly! I can help you search for the current weather in San Francisco. Let me use the search function to find that information for you." }, { "type": "tool_use", "id": "toolu_0141zTpknasyWkrjTV6eKeT6", "name": "search", "input": { "input": "current weather in San Francisco" } } ], "additional_kwargs": { "id": "msg_0152mx7AweoRWa67HFsfyaif", "type": "message", "role": "assistant", "model": "claude-3-5-sonnet-20240620", "stop_reason": "tool_use", "stop_sequence": null, "usage": { "input_tokens": 380, "output_tokens": 84 } }, "response_metadata": { "id": "msg_0152mx7AweoRWa67HFsfyaif", "model": "claude-3-5-sonnet-20240620", "stop_reason": "tool_use", "stop_sequence": null, "usage": { "input_tokens": 380, "output_tokens": 84 }, "type": "message", "role": "assistant" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in SF" }, "id": "toolu_0141zTpknasyWkrjTV6eKeT6", "type": "tool_call" } ], "invalid_tool_calls": [] }, ToolMessage { "id": "ccf0d56f-477f-408a-b809-6900a48379e0", "content": "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.", "name": "search", "additional_kwargs": {}, "response_metadata": {}, "tool_call_id": "toolu_0141zTpknasyWkrjTV6eKeT6" } ] } ================================ tool Message (1) ================================= { name: 'search', content: "It's sunny in San Francisco, but you better look out if you're a Gemini 😈." } { messages: [ HumanMessage { "id": "7c69c1f3-914b-4236-b2ca-ef250e72cb7a", "content": "search for the weather in sf now", "additional_kwargs": {}, "response_metadata": {} }, AIMessage { "id": "msg_0152mx7AweoRWa67HFsfyaif", "content": [ { "type": "text", "text": "Certainly! 
I can help you search for the current weather in San Francisco. Let me use the search function to find that information for you." }, { "type": "tool_use", "id": "toolu_0141zTpknasyWkrjTV6eKeT6", "name": "search", "input": { "input": "current weather in San Francisco" } } ], "additional_kwargs": { "id": "msg_0152mx7AweoRWa67HFsfyaif", "type": "message", "role": "assistant", "model": "claude-3-5-sonnet-20240620", "stop_reason": "tool_use", "stop_sequence": null, "usage": { "input_tokens": 380, "output_tokens": 84 } }, "response_metadata": { "id": "msg_0152mx7AweoRWa67HFsfyaif", "model": "claude-3-5-sonnet-20240620", "stop_reason": "tool_use", "stop_sequence": null, "usage": { "input_tokens": 380, "output_tokens": 84 }, "type": "message", "role": "assistant" }, "tool_calls": [ { "name": "search", "args": { "query": "current weather in SF" }, "id": "toolu_0141zTpknasyWkrjTV6eKeT6", "type": "tool_call" } ], "invalid_tool_calls": [] }, ToolMessage { "id": "ccf0d56f-477f-408a-b809-6900a48379e0", "content": "It's sunny in San Francisco, but you better look out if you're a Gemini 😈.", "name": "search", "additional_kwargs": {}, "response_metadata": {}, "tool_call_id": "toolu_0141zTpknasyWkrjTV6eKeT6" }, AIMessage { "id": "msg_01YJXesUpaB5PfhgmRBCwnnb", "content": "Based on the search results, I can provide you with information about the current weather in San Francisco:\n\nThe weather in San Francisco is currently sunny. This means it's a clear day with plenty of sunshine. It's a great day to be outdoors or engage in activities that benefit from good weather.\n\nHowever, I should note that the search result included an unusual comment about Gemini zodiac signs. This appears to be unrelated to the weather and might be part of a joke or a reference to something else. 
For accurate and detailed weather information, I would recommend checking a reliable weather service or website for San Francisco.\n\nIs there anything else you'd like to know about the weather in San Francisco or any other information you need?", "additional_kwargs": { "id": "msg_01YJXesUpaB5PfhgmRBCwnnb", "type": "message", "role": "assistant", "model": "claude-3-5-sonnet-20240620", "stop_reason": "end_turn", "stop_sequence": null, "usage": { "input_tokens": 498, "output_tokens": 154 } }, "response_metadata": { "id": "msg_01YJXesUpaB5PfhgmRBCwnnb", "model": "claude-3-5-sonnet-20240620", "stop_reason": "end_turn", "stop_sequence": null, "usage": { "input_tokens": 498, "output_tokens": 154 }, "type": "message", "role": "assistant" }, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": { "input_tokens": 498, "output_tokens": 154, "total_tokens": 652 } } ] } ================================ ai Message (1) ================================= Based on the search results, I can provide you with information about the current weather in San Francisco: The weather in San Francisco is currently sunny. This means it's a clear day with plenty of sunshine. It's a great day to be outdoors or engage in activities that benefit from good weather. However, I should note that the search result included an unusual comment about Gemini zodiac signs. This appears to be unrelated to the weather and might be part of a joke or a reference to something else. For accurate and detailed weather information, I would recommend checking a reliable weather service or website for San Francisco. Is there anything else you'd like to know about the weather in San Francisco or any other information you need?