How to pass runtime values to tools¶
Sometimes you may want a tool-calling LLM to populate a subset of a tool function's arguments while you provide the other values at runtime. If you're using LangChain-style tools, an easy way to handle this is by annotating function parameters with InjectedToolArg. This annotation excludes the parameter from being shown to the LLM.
In LangGraph applications you might want to pass the graph state or shared memory (the store) to tools at runtime. These kinds of stateful tools are useful when a tool's output is affected by past agent steps (e.g., if you're using a sub-agent as a tool and want to pass the message history to it), or when a tool's input needs to be validated against context from past agent steps.
In this guide we'll demonstrate how to do so using LangGraph's prebuilt ToolNode.
Prerequisites
This guide targets **LangChain tool calling**, so it assumes you're familiar with LangChain tools and how tool calling works.
You can still use tool calling in LangGraph via your provider SDK without losing any of LangGraph's core functionality. The core technique in the examples below is to annotate an argument as "injected", meaning it will be injected by your program and should never be seen or populated by the LLM. Let the following code snippet serve as a tl;dr:
from typing import Annotated

from langchain_core.runnables import RunnableConfig
from langchain_core.tools import InjectedToolArg
from langgraph.store.base import BaseStore

from langgraph.prebuilt import InjectedState, InjectedStore


# Can be sync or async; @tool decorator not required
async def my_tool(
    # These arguments are populated by the LLM
    some_arg: str,
    another_arg: float,
    # The config: RunnableConfig is always available in LangChain calls
    # This is not exposed to the LLM
    config: RunnableConfig,
    # The following three are specific to the prebuilt ToolNode
    # (and `create_react_agent` by extension). If you are invoking the
    # tool on its own (in your own node), then you would need to provide these yourself.
    store: Annotated[BaseStore, InjectedStore],
    # This passes in the full state. ("State" is your graph's state schema,
    # defined elsewhere in your application.)
    state: Annotated[State, InjectedState],
    # You can also inject single fields from your state if you only need part of it
    messages: Annotated[list, InjectedState("messages")],
    # The following is not compatible with create_react_agent or ToolNode
    # You can also exclude other arguments from being shown to the model.
    # These must be provided manually and are useful if you call the tools/functions in your own node
    # some_other_arg=Annotated["MyPrivateClass", InjectedToolArg],
):
    """Call my_tool to have an impact on the real world.

    Args:
        some_arg: a very important argument
        another_arg: another argument the LLM will provide
    """  # The docstring becomes the description for your tool and is passed to the model
    print(some_arg, another_arg, config, store, state, messages)
    # Config, some_other_arg, store, and state are all "hidden" from
    # LangChain models when passed to bind_tools or with_structured_output
    return "... some response"
Setup¶
First, let's install the required packages
%%capture --no-stderr
%pip install --quiet -U langgraph langchain-openai
Next, we need to set an API key for OpenAI (the chat model we will use).
import getpass
import os


def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")


_set_env("OPENAI_API_KEY")
OPENAI_API_KEY: ········
Pass graph state to tools¶
Let's first take a look at how to give our tools access to the graph state. We'll need to define our graph state:
from typing import List

# this is the state schema used by the prebuilt create_react_agent we'll be using below
from langgraph.prebuilt.chat_agent_executor import AgentState
from langchain_core.documents import Document


class State(AgentState):
    docs: List[str]
Define the tools¶
We'll want our tool to take graph state as an input, but we don't want the model to try to generate this input when calling the tool. We can use the InjectedState annotation to mark arguments as required graph state (or some field of the graph state). These arguments will not be generated by the model. When using ToolNode, the graph state will automatically be passed in to the relevant tools and arguments.
In this example, we'll create a get_context tool that returns the Documents relevant to a question.
Using Pydantic with LangChain
This notebook uses Pydantic v2 BaseModel, which requires langchain-core >= 0.3. Using langchain-core < 0.3 will result in errors due to mixing of Pydantic v1 and v2 BaseModels.
from typing import List, Tuple

from typing_extensions import Annotated

from langchain_core.messages import ToolMessage
from langchain_core.tools import tool
from langgraph.prebuilt import InjectedState


@tool
def get_context(question: str, state: Annotated[dict, InjectedState]):
    """Get relevant context for answering the question."""
    return "\n\n".join(doc for doc in state["docs"])
If we look at the input schema of this tool, we'll see that state is still listed:
get_context.get_input_schema().schema()
{'description': 'Get relevant context for answering the question.', 'properties': {'question': {'title': 'Question', 'type': 'string'}, 'state': {'title': 'State', 'type': 'object'}}, 'required': ['question', 'state'], 'title': 'get_context', 'type': 'object'}
However, if we look at the tool-call schema (i.e., what is passed to the model for tool calling), state has been removed:
get_context.tool_call_schema.schema()
{'description': 'Get relevant context for answering the question.', 'properties': {'question': {'title': 'Question', 'type': 'string'}}, 'required': ['question'], 'title': 'get_context', 'type': 'object'}
Define the graph¶
In this example we will use the prebuilt ReAct agent. We'll first need to define our model and a tool-calling node (ToolNode):
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import ToolNode, create_react_agent
from langgraph.checkpoint.memory import MemorySaver
model = ChatOpenAI(model="gpt-4o", temperature=0)
tools = [get_context]
# ToolNode will automatically take care of injecting state into tools
tool_node = ToolNode(tools)
checkpointer = MemorySaver()
graph = create_react_agent(model, tools, state_schema=State, checkpointer=checkpointer)
Use it!¶
docs = [
    "FooBar company just raised 1 Billion dollars!",
    "FooBar company was founded in 2019",
]
inputs = {
    "messages": [{"type": "user", "content": "what's the latest news about FooBar"}],
    "docs": docs,
}
config = {"configurable": {"thread_id": "1"}}
for chunk in graph.stream(inputs, config, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's the latest news about FooBar
================================== Ai Message ==================================
Tool Calls:
  get_context (call_UkqfR7z2cLJQjhatUpDeEa5H)
 Call ID: call_UkqfR7z2cLJQjhatUpDeEa5H
  Args:
    question: latest news about FooBar
================================= Tool Message =================================
Name: get_context

FooBar company just raised 1 Billion dollars!

FooBar company was founded in 2019
================================== Ai Message ==================================

The latest news about FooBar is that the company has just raised 1 billion dollars.
Pass shared memory (store) to the graph¶
LangGraph also lets tools access memory that is shared across conversations through the Store API. Below we'll store some documents in an InMemoryStore, namespaced by user ID.
Note
Support for the Store API and InjectedStore was added in LangGraph v0.2.34. The InjectedStore annotation requires langchain-core >= 0.3.8.
from langgraph.store.memory import InMemoryStore

doc_store = InMemoryStore()

namespace = ("documents", "1")  # user ID
doc_store.put(
    namespace, "doc_0", {"doc": "FooBar company just raised 1 Billion dollars!"}
)

namespace = ("documents", "2")  # user ID
doc_store.put(namespace, "doc_1", {"doc": "FooBar company was founded in 2019"})
Define the tools¶
from langchain_core.runnables import RunnableConfig
from langgraph.store.base import BaseStore

from langgraph.prebuilt import InjectedStore


@tool
def get_context(
    question: str,
    config: RunnableConfig,
    store: Annotated[BaseStore, InjectedStore()],
) -> str:
    """Get relevant context for answering the question."""
    user_id = config.get("configurable", {}).get("user_id")
    docs = [item.value["doc"] for item in store.search(("documents", user_id))]
    return "\n\n".join(doc for doc in docs)
We can also verify that the tool-calling model will ignore the store arg of the get_context tool:
get_context.tool_call_schema.schema()
{'description': 'Get relevant context for answering the question.', 'properties': {'question': {'title': 'Question', 'type': 'string'}}, 'required': ['question'], 'title': 'get_context', 'type': 'object'}
Define the graph¶
Let's update our ReAct agent:
tools = [get_context]
# ToolNode will automatically take care of injecting Store into tools
tool_node = ToolNode(tools)
checkpointer = MemorySaver()
# NOTE: we need to pass our store to `create_react_agent` to make sure our graph is aware of it
graph = create_react_agent(model, tools, checkpointer=checkpointer, store=doc_store)
Use it!¶
Let's try running our graph with a "user_id" in the config.
messages = [{"type": "user", "content": "what's the latest news about FooBar"}]
config = {"configurable": {"thread_id": "1", "user_id": "1"}}
for chunk in graph.stream({"messages": messages}, config, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's the latest news about FooBar
================================== Ai Message ==================================
Tool Calls:
  get_context (call_ocyHBpGgF3LPFOgRKURBfkGG)
 Call ID: call_ocyHBpGgF3LPFOgRKURBfkGG
  Args:
    question: latest news about FooBar
================================= Tool Message =================================
Name: get_context

FooBar company just raised 1 Billion dollars!
================================== Ai Message ==================================

The latest news about FooBar is that the company has just raised 1 billion dollars.
We can see that the tool retrieved only the correct document for user "1" when looking up the information in the store. Let's now try it again for a different user:
messages = [{"type": "user", "content": "what's the latest news about FooBar"}]
config = {"configurable": {"thread_id": "2", "user_id": "2"}}
for chunk in graph.stream({"messages": messages}, config, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================

what's the latest news about FooBar
================================== Ai Message ==================================
Tool Calls:
  get_context (call_zxO9KVlL8UxFQUMb8ETeHNvs)
 Call ID: call_zxO9KVlL8UxFQUMb8ETeHNvs
  Args:
    question: latest news about FooBar
================================= Tool Message =================================
Name: get_context

FooBar company was founded in 2019
================================== Ai Message ==================================

FooBar company was founded in 2019. If you need more specific or recent news, please let me know!
We can see that the tool pulled in a different document this time.