How to use LangGraph.js in web environments¶
LangGraph.js uses the async_hooks API to more conveniently allow for tracing and callback propagation within nodes. This API is supported in many environments, such as Node.js, Deno, Cloudflare Workers, and the Edge runtime, but not in all of them, such as within web browsers.

To allow usage of LangGraph.js in environments where the async_hooks API is unavailable, we've added a separate @langchain/langgraph/web entrypoint. This entrypoint exports everything the main @langchain/langgraph entrypoint does, but will not initialize or even import async_hooks. Here's a simple example:
// Import from "@langchain/langgraph/web"
import {
  END,
  START,
  StateGraph,
  Annotation,
} from "@langchain/langgraph/web";
import { BaseMessage, HumanMessage } from "@langchain/core/messages";

const GraphState = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (x, y) => x.concat(y),
  }),
});

const nodeFn = async (_state: typeof GraphState.State) => {
  return { messages: [new HumanMessage("Hello from the browser!")] };
};

// Define a new graph
const workflow = new StateGraph(GraphState)
  .addNode("node", nodeFn)
  .addEdge(START, "node")
  .addEdge("node", END);

const app = workflow.compile({});

// Use the Runnable
const finalState = await app.invoke({ messages: [] });

console.log(finalState.messages[finalState.messages.length - 1].content);
Hello from the browser!
Other entrypoints, such as @langchain/langgraph/prebuilt, can be used in either environment.
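For example, prebuilt helpers like ToolNode import cleanly alongside the web entrypoint. The following is a minimal sketch rather than part of the example above: the `add` tool is hypothetical, and it assumes you have zod available for the schema.

// Prebuilt helpers do not rely on async_hooks, so this also works in the browser.
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// A hypothetical, secret-free tool used purely for illustration.
const addTool = tool(
  async ({ a, b }: { a: number; b: number }) => String(a + b),
  {
    name: "add",
    description: "Adds two numbers together",
    schema: z.object({ a: z.number(), b: z.number() }),
  },
);

// The resulting node can be added to a graph built from "@langchain/langgraph/web".
const toolNode = new ToolNode([addTool]);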
Note

If you are using LangGraph.js on the frontend, make sure you are not exposing any private keys! For chat models, this means you need to use something like WebLLM that can run client-side without any authentication.
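As a rough sketch of what that could look like, the snippet below uses the ChatWebLLM integration from @langchain/community. It assumes @langchain/community and its @mlc-ai/web-llm peer dependency are installed; the model id and the initialize progress callback shown here are illustrative, so check the current integration docs before relying on them.

// Assumption: @langchain/community and @mlc-ai/web-llm are installed.
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";
import { HumanMessage } from "@langchain/core/messages";

// The model id is an example; pick any model supported by your WebLLM version.
const model = new ChatWebLLM({
  model: "Phi-3-mini-4k-instruct-q4f16_1-MLC",
  chatOptions: { temperature: 0.5 },
});

// Downloads and prepares the model weights in the browser; no API key involved.
await model.initialize((progress) => console.log(progress));

const response = await model.invoke([new HumanMessage("What is 1 + 1?")]);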
Passing config¶
The lack of async_hooks support in web browsers means that if you are calling a Runnable within a node (for example, when calling a chat model), you need to manually pass a config object through to properly support tracing, .streamEvents() to stream intermediate steps, and other callback-related functionality. This config object is passed in as the second argument of each node, and should be passed as the second argument of any Runnable method you call.

To illustrate this, let's set up our graph again as before, but with a Runnable within our node. First, we'll avoid passing config through into the nested function, then try to use .streamEvents() to see the intermediate results of the nested function:
// Import from "@langchain/langgraph/web"
import {
  END,
  START,
  StateGraph,
  Annotation,
} from "@langchain/langgraph/web";
import { BaseMessage, HumanMessage } from "@langchain/core/messages";
import { RunnableLambda } from "@langchain/core/runnables";
import { type StreamEvent } from "@langchain/core/tracers/log_stream";

const GraphState2 = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (x, y) => x.concat(y),
  }),
});

const nodeFn2 = async (_state: typeof GraphState2.State) => {
  // Note that we do not pass any `config` through here
  const nestedFn = RunnableLambda.from(async (input: string) => {
    return new HumanMessage(`Hello from ${input}!`);
  }).withConfig({ runName: "nested" });
  const responseMessage = await nestedFn.invoke("a nested function");
  return { messages: [responseMessage] };
};

// Define a new graph
const workflow2 = new StateGraph(GraphState2)
  .addNode("node", nodeFn2)
  .addEdge(START, "node")
  .addEdge("node", END);

const app2 = workflow2.compile({});

// Stream intermediate steps from the graph
const eventStream2 = app2.streamEvents(
  { messages: [] },
  { version: "v2" },
  { includeNames: ["nested"] },
);

const events2: StreamEvent[] = [];
for await (const event of eventStream2) {
  console.log(event);
  events2.push(event);
}

console.log(`Received ${events2.length} events from the nested function`);
console.log(`Received ${events2.length} events from the nested function`);
Received 0 events from the nested function
We can see that we receive no events.

Next, let's try redeclaring the graph with a node that passes config through correctly:
// Import from "@langchain/langgraph/web"
import {
  END,
  START,
  StateGraph,
  Annotation,
} from "@langchain/langgraph/web";
import { BaseMessage, HumanMessage } from "@langchain/core/messages";
import { type RunnableConfig, RunnableLambda } from "@langchain/core/runnables";
import { type StreamEvent } from "@langchain/core/tracers/log_stream";

const GraphState3 = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (x, y) => x.concat(y),
  }),
});

// Note the second argument here.
const nodeFn3 = async (
  _state: typeof GraphState3.State,
  config?: RunnableConfig,
) => {
  // If you need to nest deeper, remember to pass `_config` when invoking
  const nestedFn = RunnableLambda.from(
    async (input: string, _config?: RunnableConfig) => {
      return new HumanMessage(`Hello from ${input}!`);
    },
  ).withConfig({ runName: "nested" });
  const responseMessage = await nestedFn.invoke("a nested function", config);
  return { messages: [responseMessage] };
};

// Define a new graph
const workflow3 = new StateGraph(GraphState3)
  .addNode("node", nodeFn3)
  .addEdge(START, "node")
  .addEdge("node", END);

const app3 = workflow3.compile({});

// Stream intermediate steps from the graph
const eventStream3 = app3.streamEvents(
  { messages: [] },
  { version: "v2" },
  { includeNames: ["nested"] },
);

const events3: StreamEvent[] = [];
for await (const event of eventStream3) {
  console.log(event);
  events3.push(event);
}

console.log(`Received ${events3.length} events from the nested function`);
console.log(`Received ${events3.length} events from the nested function`);
{
  event: "on_chain_start",
  data: { input: { messages: [] } },
  name: "nested",
  tags: [],
  run_id: "22747451-a2fa-447b-b62f-9da19a539b2f",
  metadata: {
    langgraph_step: 1,
    langgraph_node: "node",
    langgraph_triggers: [ "start:node" ],
    langgraph_task_idx: 0,
    __pregel_resuming: false,
    checkpoint_id: "1ef62793-f065-6840-fffe-cdfb4cbb1248",
    checkpoint_ns: "node"
  }
}
{
  event: "on_chain_end",
  data: {
    output: HumanMessage {
      "content": "Hello from a nested function!",
      "additional_kwargs": {},
      "response_metadata": {}
    }
  },
  run_id: "22747451-a2fa-447b-b62f-9da19a539b2f",
  name: "nested",
  tags: [],
  metadata: {
    langgraph_step: 1,
    langgraph_node: "node",
    langgraph_triggers: [ "start:node" ],
    langgraph_task_idx: 0,
    __pregel_resuming: false,
    checkpoint_id: "1ef62793-f065-6840-fffe-cdfb4cbb1248",
    checkpoint_ns: "node"
  }
}
Received 2 events from the nested function
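This time we receive both events from the nested Runnable. The same rule applies when calling a chat model or any other Runnable inside a node: forward the node's config as the second argument so that tracing and .streamEvents() can observe the nested call. Below is a minimal sketch under the assumption that `model` is some browser-safe chat model instance (for example one created with WebLLM as mentioned above); the `declare const` line is only there to keep the snippet self-contained.

import { type RunnableConfig } from "@langchain/core/runnables";
import { type BaseMessage } from "@langchain/core/messages";
import { type BaseChatModel } from "@langchain/core/language_models/chat_models";

// Assumption: `model` is any chat model instance that is safe to run client-side.
declare const model: BaseChatModel;

const callModel = async (
  state: { messages: BaseMessage[] },
  config?: RunnableConfig,
) => {
  // Forwarding `config` is what lets tracing and .streamEvents()
  // see this nested model call in environments without async_hooks.
  const response = await model.invoke(state.messages, config);
  return { messages: [response] };
};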