How to configure multiple streaming modes at the same time

This guide covers how to configure multiple streaming modes at the same time.
Setup

First we need to install the required packages:

```bash
npm install @langchain/langgraph @langchain/openai @langchain/core
```
Next, we need to set an API key for OpenAI (the LLM we will use):

```bash
export OPENAI_API_KEY=your-api-key
```
Optionally, we can set an API key for LangSmith tracing, which will give us best-in-class observability:

```bash
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_CALLBACKS_BACKGROUND="true"
export LANGCHAIN_API_KEY=your-api-key
```
Define the graph

For this guide we will use a prebuilt ReAct agent.
```typescript
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

const model = new ChatOpenAI({
  model: "gpt-4o",
});

const getWeather = tool((input) => {
  if (["sf", "san francisco", "san francisco, ca"].includes(input.location.toLowerCase())) {
    return "It's 60 degrees and foggy.";
  } else {
    return "It's 90 degrees and sunny.";
  }
}, {
  name: "get_weather",
  description: "Call to get the current weather.",
  schema: z.object({
    location: z.string().describe("Location to get the weather for."),
  }),
});

const graph = createReactAgent({ llm: model, tools: [getWeather] });
```
Stream multiple modes

To get multiple types of streamed chunks, pass an array of values under the `streamMode` key in the second argument of `.stream()`:
```typescript
let inputs = { messages: [{ role: "user", content: "what's the weather in sf?" }] };

let stream = await graph.stream(inputs, {
  streamMode: ["updates", "debug"],
});

for await (const chunk of stream) {
  console.log(`Receiving new event of type: ${chunk[0]}`);
  console.log(chunk[1]);
  console.log("\n====\n");
}
```
```
Receiving new event of type: debug
{
  type: 'task',
  timestamp: '2024-08-30T20:58:58.404Z',
  step: 1,
  payload: {
    id: '768110dd-6004-59f3-8671-6ca699cccd71',
    name: 'agent',
    input: { messages: [Array] },
    triggers: [ 'start:agent' ],
    interrupts: []
  }
}

====

Receiving new event of type: updates
{
  agent: {
    messages: [
      AIMessage {
        "id": "chatcmpl-A22zqTwumhtW8TMjQ1FxlzCEMBk0R",
        "content": "",
        "additional_kwargs": {
          "tool_calls": [
            {
              "id": "call_HAfilebE1q9E9OQHOlL3JYHP",
              "type": "function",
              "function": "[Object]"
            }
          ]
        },
        "response_metadata": {
          "tokenUsage": {
            "completionTokens": 15,
            "promptTokens": 59,
            "totalTokens": 74
          },
          "finish_reason": "tool_calls",
          "system_fingerprint": "fp_157b3831f5"
        },
        "tool_calls": [
          {
            "name": "get_weather",
            "args": { "location": "San Francisco" },
            "type": "tool_call",
            "id": "call_HAfilebE1q9E9OQHOlL3JYHP"
          }
        ],
        "invalid_tool_calls": [],
        "usage_metadata": {
          "input_tokens": 59,
          "output_tokens": 15,
          "total_tokens": 74
        }
      }
    ]
  }
}

====

Receiving new event of type: debug
{
  type: 'task_result',
  timestamp: '2024-08-30T20:58:59.072Z',
  step: 1,
  payload: {
    id: '768110dd-6004-59f3-8671-6ca699cccd71',
    name: 'agent',
    result: [ [Array] ]
  }
}

====

Receiving new event of type: debug
{
  type: 'task',
  timestamp: '2024-08-30T20:58:59.074Z',
  step: 2,
  payload: {
    id: '76459c18-5621-5893-9b93-13bc1db3ba6d',
    name: 'tools',
    input: { messages: [Array] },
    triggers: [ 'branch:agent:shouldContinue:tools' ],
    interrupts: []
  }
}

====

Receiving new event of type: updates
{
  tools: {
    messages: [
      ToolMessage {
        "content": "It's 60 degrees and foggy.",
        "name": "get_weather",
        "additional_kwargs": {},
        "response_metadata": {},
        "tool_call_id": "call_HAfilebE1q9E9OQHOlL3JYHP"
      }
    ]
  }
}

====

Receiving new event of type: debug
{
  type: 'task_result',
  timestamp: '2024-08-30T20:58:59.076Z',
  step: 2,
  payload: {
    id: '76459c18-5621-5893-9b93-13bc1db3ba6d',
    name: 'tools',
    result: [ [Array] ]
  }
}

====

Receiving new event of type: debug
{
  type: 'task',
  timestamp: '2024-08-30T20:58:59.077Z',
  step: 3,
  payload: {
    id: '565d8a53-1057-5d83-bda8-ba3fada24b70',
    name: 'agent',
    input: { messages: [Array] },
    triggers: [ 'tools' ],
    interrupts: []
  }
}

====

Receiving new event of type: updates
{
  agent: {
    messages: [
      AIMessage {
        "id": "chatcmpl-A22zrdeobsBzkiES0C6Twh3p7I344",
        "content": "The weather in San Francisco right now is 60 degrees and foggy.",
        "additional_kwargs": {},
        "response_metadata": {
          "tokenUsage": {
            "completionTokens": 16,
            "promptTokens": 90,
            "totalTokens": 106
          },
          "finish_reason": "stop",
          "system_fingerprint": "fp_157b3831f5"
        },
        "tool_calls": [],
        "invalid_tool_calls": [],
        "usage_metadata": {
          "input_tokens": 90,
          "output_tokens": 16,
          "total_tokens": 106
        }
      }
    ]
  }
}

====

Receiving new event of type: debug
{
  type: 'task_result',
  timestamp: '2024-08-30T20:58:59.640Z',
  step: 3,
  payload: {
    id: '565d8a53-1057-5d83-bda8-ba3fada24b70',
    name: 'agent',
    result: [ [Array] ]
  }
}

====
```
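As the loop above shows, when `streamMode` is an array each yielded chunk is a two-element tuple: the mode name (`chunk[0]`) followed by that mode's payload (`chunk[1]`). A minimal sketch of routing such tuples to mode-specific handling, using a hypothetical `handleChunk` helper (the tuple shapes mirror the output above, but the helper itself is not part of the LangGraph API):

```typescript
// Each multi-mode chunk is a [mode, payload] tuple.
type StreamChunk = [string, unknown];

function handleChunk([mode, payload]: StreamChunk): string {
  switch (mode) {
    case "updates":
      // "updates" payloads have one key per node that produced a state update.
      return `update from node(s): ${Object.keys(payload as object).join(", ")}`;
    case "debug":
      // "debug" payloads carry a `type` field such as "task" or "task_result".
      return `debug event: ${(payload as { type: string }).type}`;
    default:
      return `unhandled mode: ${mode}`;
  }
}

// Routing two sample chunks shaped like the output above:
console.log(handleChunk(["updates", { agent: { messages: [] } }]));
// → update from node(s): agent
console.log(handleChunk(["debug", { type: "task", step: 1 }]));
// → debug event: task
```

Inside the `for await` loop this becomes `handleChunk(chunk)`, which keeps the per-mode logic in one place as you add further modes to the array.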