How to call tools using ToolNode
This guide covers how to use LangGraph's prebuilt ToolNode for tool calling.
ToolNode is a LangChain Runnable that takes graph state (with a list of messages) as input and outputs a state update with the result of tool calls. It is designed to work well out-of-the-box with LangGraph's prebuilt ReAct agent, but can also work with any StateGraph, as long as its state has a messages key with an appropriate reducer (see MessagesState).
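For reference, here is a minimal sketch of such a custom state, assuming the standard add_messages reducer; the CustomState name is purely illustrative:
from typing import Annotated
from typing_extensions import TypedDict
from langchain_core.messages import AnyMessage
from langgraph.graph.message import add_messages
# Illustrative custom state: ToolNode only needs a "messages" key whose
# reducer appends new messages to the existing list (here, add_messages).
class CustomState(TypedDict):
    messages: Annotated[list[AnyMessage], add_messages]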
Setup
First, let's install the required packages and set our API keys:
%pip install -U langgraph langchain_anthropic
import getpass
import os
def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")
_set_env("ANTHROPIC_API_KEY")
Set up LangSmith for LangGraph development
Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph. Read more about how to get started here.
Define tools
from langchain_core.messages import AIMessage
from langchain_core.tools import tool
from langgraph.prebuilt import ToolNode
API reference: AIMessage | tool | ToolNode
@tool
def get_weather(location: str):
    """Call to get the current weather."""
    if location.lower() in ["sf", "san francisco"]:
        return "It's 60 degrees and foggy."
    else:
        return "It's 90 degrees and sunny."
@tool
def get_coolest_cities():
    """Get a list of coolest cities"""
    return "nyc, sf"
tools = [get_weather, get_coolest_cities]
tool_node = ToolNode(tools)
Manually call ToolNode
ToolNode operates on graph state with a list of messages. It expects the last message in the list to be an AIMessage with a tool_calls parameter.
Let's first take a look at how to invoke the tool node manually:
message_with_single_tool_call = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "get_weather",
            "args": {"location": "sf"},
            "id": "tool_call_id",
            "type": "tool_call",
        }
    ],
)
tool_node.invoke({"messages": [message_with_single_tool_call]})
{'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='tool_call_id')]}
Note that typically you don't need to create an AIMessage manually; it will be automatically generated by any LangChain chat model that supports tool calling.
You can also do parallel tool calling with ToolNode if you pass multiple tool calls to the AIMessage's tool_calls parameter:
message_with_multiple_tool_calls = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "get_coolest_cities",
            "args": {},
            "id": "tool_call_id_1",
            "type": "tool_call",
        },
        {
            "name": "get_weather",
            "args": {"location": "sf"},
            "id": "tool_call_id_2",
            "type": "tool_call",
        },
    ],
)
tool_node.invoke({"messages": [message_with_multiple_tool_calls]})
{'messages': [ToolMessage(content='nyc, sf', name='get_coolest_cities', tool_call_id='tool_call_id_1'),
  ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='tool_call_id_2')]}
Use with chat models
For this example, we'll use a small chat model from Anthropic. To use chat models for tool calling, we first need to ensure that the model is aware of the available tools. We do this by calling the .bind_tools method on the ChatAnthropic model:
from typing import Literal
from langchain_anthropic import ChatAnthropic
from langgraph.graph import StateGraph, MessagesState
from langgraph.prebuilt import ToolNode
model_with_tools = ChatAnthropic(
    model="claude-3-haiku-20240307", temperature=0
).bind_tools(tools)
API reference: ChatAnthropic | StateGraph | ToolNode
model_with_tools.invoke("what's the weather in sf?").tool_calls
[{'name': 'get_weather',
  'args': {'location': 'San Francisco'},
  'id': 'toolu_01Fwm7dg1mcJU43Fkx2pqgm8',
  'type': 'tool_call'}]
As you can see, the AI message generated by the chat model already has tool_calls populated, so we can pass it directly to ToolNode:
tool_node.invoke({"messages": [model_with_tools.invoke("what's the weather in sf?")]})
{'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='toolu_01LFvAVT3xJMeZS6kbWwBGZK')]}
ReAct agent
Next, let's see how to use ToolNode within a LangGraph graph. Let's set up a graph implementation of a ReAct agent. This agent takes some query as input, then repeatedly calls tools until it has enough information to resolve the query. We'll use ToolNode and the Anthropic model with tools that we just defined:
from typing import Literal
from langgraph.graph import StateGraph, MessagesState, START, END
def should_continue(state: MessagesState):
    messages = state["messages"]
    last_message = messages[-1]
    if last_message.tool_calls:
        return "tools"
    return END
def call_model(state: MessagesState):
    messages = state["messages"]
    response = model_with_tools.invoke(messages)
    return {"messages": [response]}
workflow = StateGraph(MessagesState)
# Define the two nodes we will cycle between
workflow.add_node("agent", call_model)
workflow.add_node("tools", tool_node)
workflow.add_edge(START, "agent")
workflow.add_conditional_edges("agent", should_continue, ["tools", END])
workflow.add_edge("tools", "agent")
app = workflow.compile()
API reference: StateGraph | START | END
from IPython.display import Image, display
try:
    display(Image(app.get_graph().draw_mermaid_png()))
except Exception:
    # This requires some extra dependencies and is optional
    pass
Let's try it out!
# example with a single tool call
for chunk in app.stream(
    {"messages": [("human", "what's the weather in sf?")]}, stream_mode="values"
):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================
what's the weather in sf?
================================== Ai Message ==================================
[{'text': "Okay, let's check the weather in San Francisco:", 'type': 'text'}, {'id': 'toolu_01LdmBXYeccWKdPrhZSwFCDX', 'input': {'location': 'San Francisco'}, 'name': 'get_weather', 'type': 'tool_use'}]
Tool Calls:
get_weather (toolu_01LdmBXYeccWKdPrhZSwFCDX)
Call ID: toolu_01LdmBXYeccWKdPrhZSwFCDX
Args:
location: San Francisco
================================= Tool Message =================================
Name: get_weather
It's 60 degrees and foggy.
================================== Ai Message ==================================
The weather in San Francisco is currently 60 degrees with foggy conditions.
# example with multiple tool calls in succession
for chunk in app.stream(
    {"messages": [("human", "what's the weather in the coolest cities?")]},
    stream_mode="values",
):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================
what's the weather in the coolest cities?
================================== Ai Message ==================================
[{'text': "Okay, let's find out the weather in the coolest cities:", 'type': 'text'}, {'id': 'toolu_01LFZUWTccyveBdaSAisMi95', 'input': {}, 'name': 'get_coolest_cities', 'type': 'tool_use'}]
Tool Calls:
get_coolest_cities (toolu_01LFZUWTccyveBdaSAisMi95)
Call ID: toolu_01LFZUWTccyveBdaSAisMi95
Args:
================================= Tool Message =================================
Name: get_coolest_cities
nyc, sf
================================== Ai Message ==================================
[{'text': "Now let's get the weather for those cities:", 'type': 'text'}, {'id': 'toolu_01RHPQBhT1u6eDnPqqkGUpsV', 'input': {'location': 'nyc'}, 'name': 'get_weather', 'type': 'tool_use'}]
Tool Calls:
get_weather (toolu_01RHPQBhT1u6eDnPqqkGUpsV)
Call ID: toolu_01RHPQBhT1u6eDnPqqkGUpsV
Args:
location: nyc
================================= Tool Message =================================
Name: get_weather
It's 90 degrees and sunny.
================================== Ai Message ==================================
[{'id': 'toolu_01W5sFGF8PfgYzdY4CqT5c6e', 'input': {'location': 'sf'}, 'name': 'get_weather', 'type': 'tool_use'}]
Tool Calls:
get_weather (toolu_01W5sFGF8PfgYzdY4CqT5c6e)
Call ID: toolu_01W5sFGF8PfgYzdY4CqT5c6e
Args:
location: sf
================================= Tool Message =================================
Name: get_weather
It's 60 degrees and foggy.
================================== Ai Message ==================================
Based on the results, it looks like the weather in the coolest cities is:
- New York City: 90 degrees and sunny
- San Francisco: 60 degrees and foggy
So the weather in the coolest cities is a mix of warm and cool temperatures, with some sunny and some foggy conditions.
ToolNode can also handle errors during tool execution. You can enable/disable this by setting handle_tool_errors=True (enabled by default). See our guide on handling errors in ToolNode here.
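As a minimal sketch of that behavior, assuming handle_tool_errors works as described above (the divide tool and error_tolerant_node names are purely illustrative, and the exact error message is determined by ToolNode):
from langchain_core.messages import AIMessage
from langchain_core.tools import tool
from langgraph.prebuilt import ToolNode
@tool
def divide(a: float, b: float) -> float:
    """Divide a by b."""
    return a / b
# With handle_tool_errors=True (the default), a failing tool call is returned
# to the model as an error ToolMessage instead of raising and halting the graph.
error_tolerant_node = ToolNode([divide], handle_tool_errors=True)
message_with_bad_call = AIMessage(
    content="",
    tool_calls=[
        {"name": "divide", "args": {"a": 1, "b": 0}, "id": "tool_call_id", "type": "tool_call"}
    ],
)
# Expect a ToolMessage describing the ZeroDivisionError rather than an exception.
error_tolerant_node.invoke({"messages": [message_with_bad_call]})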