{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# How to call tools using ToolNode\n", "\n", "This guide covers how to use LangGraph's prebuilt [`ToolNode`](https://langchain-ai.github.io/langgraph/reference/prebuilt/#toolnode) for tool calling.\n", "\n", "`ToolNode` is a LangChain Runnable that takes graph state (with a list of messages) as input and outputs a state update with the result of tool calls. It is designed to work well out of the box with LangGraph's prebuilt [ReAct agent](https://langchain-ai.github.io/langgraph/how-tos/create-react-agent/), but it can also work with any `StateGraph` as long as its state has a `messages` key with an appropriate reducer (see [`MessagesState`](https://github.com/langchain-ai/langgraph/blob/e3ef9adac7395e5c0943c22bbc8a4a856b103aa3/libs/langgraph/langgraph/graph/message.py#L150))." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Setup\n", "\n", "First, let's install the required packages and set our API keys:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "%%capture --no-stderr\n", "%pip install --quiet -U langgraph langchain_anthropic" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import getpass\n", "import os\n", "\n", "\n", "def _set_env(var: str):\n", "    if not os.environ.get(var):\n", "        os.environ[var] = getpass.getpass(f\"{var}: \")\n", "\n", "\n", "_set_env(\"ANTHROPIC_API_KEY\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
<div class=\"admonition tip\">\n", "    <p class=\"admonition-title\">Set up <a href=\"https://smith.langchain.com\">LangSmith</a> for LangGraph development</p>\n", "    <p style=\"padding-top: 5px;\">\n", "        Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph — read more about how to get started <a href=\"https://docs.smith.langchain.com\">here</a>.\n", "    </p>\n", "</div>" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Define tools" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "from langchain_core.messages import AIMessage\n", "from langchain_core.tools import tool\n", "\n", "from langgraph.prebuilt import ToolNode" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "@tool\n", "def get_weather(location: str):\n", "    \"\"\"Call to get the current weather.\"\"\"\n", "    if location.lower() in [\"sf\", \"san francisco\"]:\n", "        return \"It's 60 degrees and foggy.\"\n", "    else:\n", "        return \"It's 90 degrees and sunny.\"\n", "\n", "\n", "@tool\n", "def get_coolest_cities():\n", "    \"\"\"Get a list of coolest cities\"\"\"\n", "    return \"nyc, sf\"" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "tools = [get_weather, get_coolest_cities]\n", "tool_node = ToolNode(tools)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Manually call `ToolNode`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`ToolNode` operates on graph state with a list of messages. It expects the last message in the list to be an `AIMessage` with its `tool_calls` parameter populated.\n", "\n", "Let's first see how to invoke the tool node manually:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'messages': [ToolMessage(content=\"It's 60 degrees and foggy.\", name='get_weather', tool_call_id='tool_call_id')]}" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "message_with_single_tool_call = AIMessage(\n", "    content=\"\",\n", "    tool_calls=[\n", "        {\n", "            \"name\": \"get_weather\",\n", "            \"args\": {\"location\": \"sf\"},\n", "            \"id\": \"tool_call_id\",\n", "            \"type\": \"tool_call\",\n", "        }\n", "    ],\n", ")\n", "\n", "tool_node.invoke({\"messages\": [message_with_single_tool_call]})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note that you typically don't need to create an `AIMessage` manually; it is generated automatically by any LangChain chat model that supports tool calling.\n", "\n", "You can also do parallel tool calling with `ToolNode` by passing multiple tool calls in the `AIMessage`'s `tool_calls` parameter:" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'messages': [ToolMessage(content='nyc, sf', name='get_coolest_cities', tool_call_id='tool_call_id_1'),\n", " ToolMessage(content=\"It's 60 degrees and foggy.\", name='get_weather', tool_call_id='tool_call_id_2')]}" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "message_with_multiple_tool_calls = AIMessage(\n", "    content=\"\",\n", "    tool_calls=[\n", "        {\n", "            \"name\": \"get_coolest_cities\",\n", "            \"args\": {},\n", "            \"id\": \"tool_call_id_1\",\n", "            \"type\": \"tool_call\",\n", "        },\n", "        {\n", "            \"name\": \"get_weather\",\n", "            \"args\": {\"location\": \"sf\"},\n", "            \"id\": \"tool_call_id_2\",\n", "            \"type\": \"tool_call\",\n", "        },\n", "    ],\n", ")\n", "\n", "tool_node.invoke({\"messages\": [message_with_multiple_tool_calls]})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using with chat models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We'll be using a small chat model from Anthropic in our example. 
To use chat models with tool calling, we first need to ensure that the model is aware of the available tools. We do this by calling the `.bind_tools` method on the `ChatAnthropic` model." ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "from typing import Literal\n", "\n", "from langchain_anthropic import ChatAnthropic\n", "from langgraph.graph import StateGraph, MessagesState\n", "from langgraph.prebuilt import ToolNode\n", "\n", "\n", "model_with_tools = ChatAnthropic(\n", "    model=\"claude-3-haiku-20240307\", temperature=0\n", ").bind_tools(tools)" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "[{'name': 'get_weather',\n", " 'args': {'location': 'San Francisco'},\n", " 'id': 'toolu_01Fwm7dg1mcJU43Fkx2pqgm8',\n", " 'type': 'tool_call'}]" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "model_with_tools.invoke(\"what's the weather in sf?\").tool_calls" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As you can see, the AI message generated by the chat model already has `tool_calls` populated, so we can just pass it directly to `ToolNode`:" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'messages': [ToolMessage(content=\"It's 60 degrees and foggy.\", name='get_weather', tool_call_id='toolu_01LFvAVT3xJMeZS6kbWwBGZK')]}" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "tool_node.invoke({\"messages\": [model_with_tools.invoke(\"what's the weather in sf?\")]})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ReAct Agent" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "Next, let's see how to use `ToolNode` inside a LangGraph graph. Let's set up a graph implementation of the [ReAct agent](https://langchain-ai.github.io/langgraph/concepts/agentic_concepts/#react-agent). This agent takes a query as input, then repeatedly calls tools until it has gathered enough information to resolve the query. 
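If you don't need to customize the loop, LangGraph also provides this pattern as a prebuilt agent; a minimal, non-executed sketch (reusing the `tools` list defined above) might look like:\n", "\n", "```python\n", "# sketch of the prebuilt alternative; below we build the equivalent graph by hand instead\n", "from langchain_anthropic import ChatAnthropic\n", "from langgraph.prebuilt import create_react_agent\n", "\n", "prebuilt_app = create_react_agent(ChatAnthropic(model=\"claude-3-haiku-20240307\"), tools)\n", "```\n", "\n", "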
We'll be using `ToolNode` and the Anthropic model with tools we just defined" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "from typing import Literal\n", "\n", "from langgraph.graph import StateGraph, MessagesState, START, END\n", "\n", "\n", "def should_continue(state: MessagesState):\n", " messages = state[\"messages\"]\n", " last_message = messages[-1]\n", " if last_message.tool_calls:\n", " return \"tools\"\n", " return END\n", "\n", "\n", "def call_model(state: MessagesState):\n", " messages = state[\"messages\"]\n", " response = model_with_tools.invoke(messages)\n", " return {\"messages\": [response]}\n", "\n", "\n", "workflow = StateGraph(MessagesState)\n", "\n", "# Define the two nodes we will cycle between\n", "workflow.add_node(\"agent\", call_model)\n", "workflow.add_node(\"tools\", tool_node)\n", "\n", "workflow.add_edge(START, \"agent\")\n", "workflow.add_conditional_edges(\"agent\", should_continue, [\"tools\", END])\n", "workflow.add_edge(\"tools\", \"agent\")\n", "\n", "app = workflow.compile()" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "image/jpeg": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAD5ANYDASIAAhEBAxEB/8QAHQABAAICAwEBAAAAAAAAAAAAAAYHAwUCBAgBCf/EAFIQAAEEAQIDAgUOCQkGBwAAAAEAAgMEBQYRBxIhEzEWFyJBlAgUFTJRVVZhcXSy0dLTIzY3QlSBkZOVGDVDUnWCkrO0JCUncpahMzRTZLHB8P/EABsBAQEAAwEBAQAAAAAAAAAAAAABAgMFBAYH/8QAMxEBAAECAQkFCAIDAAAAAAAAAAECEQMEEiExQVFSkdEUM2FxoQUTFSNiscHhgZIi8PH/2gAMAwEAAhEDEQA/AP1TREQEREBERAWG1cr0o+exPHXZ/WleGj9pWju37uevz47FTGlVrnkt5NrQ5zX/APpQhwLS4d7nuBa3cNAc4u5Ptbh/p+F5llxcF+ydua1fb65mcR5y9+5/Z0W+KKae8n+IW293fCrC++9D0ln1p4VYX34oeks+tPBXC+89D0Zn1J4K4X3noejM+pX5Pj6LoPCrC+/FD0ln1p4VYX34oeks+tPBXC+89D0Zn1J4K4X3noejM+pPk+PoaDwqwvvxQ9JZ9aeFWF9+KHpLPrTwVwvvPQ9GZ9SeCuF956HozPqT5Pj6Gg8KsL78UPSWfWu5UyFW+0uq2YbLR3mGQOA/Yun4K4X3noejM+pdS1oHTluQSuw1OGdp3bYrRCGZp+KRmzh+op8mds+n6TQ36KMR2bmkZ4Yb9qbJYeVwjZen5e1quJ2a2UgAOYegD9twdubfcuEnWuujN8YJgREWtBERAREQEREBERAREQEREBajV2Yfp/S+VyMQDpq1Z8kTXdxft5IP69lt1HuIVOW9onMxwtMkza7pWMaNy5zPLAA90luy24MROJTFWq8LGtsNP4ePAYapQjPN2LPLk88khO73n43OLnE+6StisNO1FeqQWYHc8MzGyMd7rSNwf2FZlhVMzVM1a0FEuIHFbS3C6LHv1JkzSfkJHRVIIa01madzW8z+SKFj3kNHUnbYbjchS1Up6pWhUfBp3Jx4/WDdSY59mTEZzR2ON2ahK6NocyaIBwdHL0Ba5paeXqW9CsR2cp6pjT+N4q6b0m2tetUc3hfZeHJ1cdbnB55IWwtDY4XeS5sjnOkJAZs0O5S4KQWuP2gqOuW6Qs571vnX2m0WxS052wmw4bthE5j7LtDuNm8+53A2VUx5fWendd8Ltfax0nlrtuxpGzicxDp6g+4+neklrTDnij3LWu7J43G4aehPnUA4t4/Wep5tTDMYbX+W1Bj9VwW8fUxsEwwsOJguRSRyRtjIjsSGJpJGz5ec9GgDoHpi3x20TT1je0ocpYsahozR17VCnjbVh8DpI2yMLzHE4NYWvb5ZPLuSN9wQNXwF4943jngrNyrRu465XsWY5K89KyyMRssSRRubNJExj3OawOcxpJYSWuAIXW4S6fu4zjFxpyVrG2KkGSy2PdVtzQOY21GzHQNJY4jZ7Wv529NwDzDv3Wr9THYyGl8PlNCZjT2axuSxeUylr19YovbQswy3pJY3Q2NuR
5c2Zp5Qdxyu3A2QXgiIg6+QoV8rQs0rcTZ6tmN0MsT+57HDZwPyglajQ1+e/puEWpe3t1JZqM0p33kfDK6IvO/9bk5v1rfqM8PG9pp+S4N+S/dtXI+YbbxyTvdGdvjZyn9a9FPc1X3x+V2JMiIvOgiIgIiICIiAiIgIiICIiAiIgilOdmg3mjb2iwDnl1O315Km53MMp7mN3J5H9G7bMOxDe0x6r4RaG1/kY8lqPSWEz95sQhZayFGKeQRgkhoc4E8u7nHb4ypa9jZGOY9oexw2LXDcEe4VGn8PsdCScbZyGFB/osdbfHEPc2iO8bf1NH/YL0TVRiaa5tPO/wDv8stEo8fU28KC0N8W+luUEkD2Jg2B8/5vxBSbR/DvS3D2GzFpjT2M0/FZc107MbUZAJSNwC4NA323Pf7qw+BNj4VZ799D90ngTY+FWe/fQ/dJ7vD4/SUtG9KEUX8CbHwqz376H7pRO9jstX4q4PTzNU5j2OuYW/flJlh7TtYZ6bGbfg/a8tiTfp38vUed7vD4/SS0b1qLS6s0XgNd4xuO1HhaGdx7ZBM2rka7Z4w8AgO5XAjcBxG/xldHwJsfCrPfvofuk8CbHwqz376H7pPd4fH6SWje0DfU3cKWBwbw40u0PGzgMTB1G4Ox8n3QP2LZ6Z4K6A0Zl4srgNF4HDZOIObHco4+KGVocNnAOa0EbgkFdzwJsfCrPfvoful98AKdh3+8MhlcqzffsbV14iPysZytcPicCEzMONdfKP8AhaHHK5Dwu7fDYqXnqP5ochkYXeRCzqHRRuHfKe7p7QbuJB5WuksEEdaCOGFjYoo2hjGMGwa0DYADzBfKtWGlXjr14Y68EbQ1kUTQ1rQO4ADoAsqwrriYzadUEiIi1IIiICIiAiIgIiICIiAiIgIiICIiAiIgKvssW+P7SwJPN4MZfYebb11jd/P8nm/WPPYKr/K7+P7S3Vu3gxl+hA3/APNY3u8+3ydO7fzILAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQFXuWA/lA6VPM0HwXzHk7dT/ALXjOu+3d+vzj9VhKvctt/KC0r1PN4L5jYcv/u8Z5/8A9/2QWEiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIonf1ZkbVyxBg6NazFXkMMtu7O6JhkG4c1gaxxdykbE9ADuBuQdtuHh1Yk2pW10sRQj2d1h+gYP0ub7tPZ3WH6Bg/S5vu1v7LXvjnBZN14D1j6vbK6e9URXxNrhXO7UOJjuadGPizAd28s9is5r2O9b78p9bjbYeUHg+YL2L7O6w/QMH6XN92qgz3qf5tQ+qDw/Fqxj8MMzjqvYmoLEhinmaOWKdx7PfnY07D/lZ/V6uy1745wWelkUI9ndYfoGD9Lm+7T2d1h+gYP0ub7tOy1745wWTdFCPZ3WH6Bg/S5vu1li1flsW5kmdoU4qBcGvtUbD5OwJOwc9jmDyN9t3AnbfcjYFwk5LibLT/ADBZMkRF5EEREBERAREQEREBERAREQEREBERAVeaGO+BeT3m/eJ+M+upVYarzQv8wP8An13/AFUq9+T93V5x+V2JAiItiCIiAiLo2M5j6uXqYua7BHkrcckteo6QCWVjOXnc1veQ3mbufNzD3UHeUd4jnbh7qg9Nxi7RG43/AKJykSjnEj8neqf7Ktf5LluwO9o84+7KnXCxGe1HyLkuLPaN+RclxmIiIgIiICIiAiIgIiICIiAiIgIiICrzQv8AMD/n13/VSqw1Xmhf5gf8+u/6qVe/J+7q84/K7EgXkPiHrLUMOqb+t9KXNSMw2K1ZVw1qfI6gIpTO9dx1rEEOOEZa6Pdzm9o5zXhwLhuAvXirXOepw4dajyOSvZDTgnnyMxtWGtuWGRmc7bzsjbIGRzdP/FYGv7/K6lWqJnUig+Kua1DndU8QMQNRarp68hzFSrpvT2IsWIaU+NeIfwjhFs0hwNkvlc4FnJ0LdgDtci/iXxc1xxGOCuWKR0/lX4jHMg1XLi2U+SGNzJpKrKsrbAe55fvI4gjyQG8u5kHE/wBTzq3VeuM7k9PS4fADJyxyx52tm8tWu1ntjYwymrFIK80gDBsTyggNDgdtzZ2p+AGhda5o5jOYQXctJCyC1aiszV/XjWDZonZE9rZQPceHbDp3LDNmbiqW4jU+tNea+xeoNW5vGXcLpfEWew0/k5a1aO/JDZ7WVnLsS3ni6NOzXD2zSQNtHgqLuK3ELgJn83lMvDksroqzasyY7KT0w+ZgqOJAie0DmMji4Do4BoO4a3b0xDofCQZzN5iOly5HNVoad+btX/hoog8Rt5ebZuwlf1aATzdSdhtHstwH0Nm9OadwdvCE4/T0XY4oQ3LEU1WPkDC1szJBIQWgAguPNsN99llmyJ8o5xI/J3qn+yrX+S5SMDYAKOcSPyd6p/sq1/kuXqwO9o84+7KnXCxGe0b8i5Liz2jfkXJcZiIiICIiAiIgIiICIiAiIgIiICIiAq80L/MD/n13/VSqw1XczMhpjMzY7HYufO0rEs9thqPa19RznCSSKQyFrBu6YFg5g4tcQG7Rlx92TzGbVRe0zadOjVfqsarJCi0nstnvgZlfSqX36ey2e+BmV9Kpffr05n1R/aOq2btFpPZbPfAzK+lUvv1F7vGOtj+IWP0PYwd+LVWQqPu1scZ6vNJCzfmdzdtyjucdidyGkgbApmfVH9o6llhotJ7LZ74GZX0ql9+nstnvgZlfSqX36Zn1R/aOpZu1HOJH5O9U/wBlWv8AJcux7LZ74GZX0ql9+sWQx+e1Tj56EmElxVWaMtsOtWYjJIzY7xs7NzgHO9rzEgNDidiRsc8O2HXFdVUWib646kRabrBZ7RvyLktZhs/Xy7WRFrqWSFeKxYxdl7PXNVsnNyiRrHOA6se3mBLSWO5XHZbNcViIiICIiAiIgIiICIiAiIgIiICL45wY0ucQ1oG5J7gtDG+xqew2SOSaliIJz7URublIzF0IduS2Lmee7lc50QIPZn8IHGfIWdSiatiZZadMxwyszkXZSRSgyeXHCNyS7kad3lvKO0YW85Dg3bY3FU8PDJDRqxVIpJpLD2xMDQ6SR5fI87d7nOcST5ySs1atDSrRV68TIIImCOOKJoa1jQNg0AdAAOmyyoCIiAvzx4g+pl43Z71XVTWVbUWlaufnM2ZxcbrtoxQVKksEQgeRX84sRggAg7v3Pu/ocq/yHLNx8wHKGl1fTOR5zueZoktUeXp3bHsnf4flQWAiIgIiINbmcFBmIXDtZqVrZoZepuDJ4w17XgB2x8kuY3dpBa4dHAgkLpw5y5jrorZuGGIWrskNCxSEkkb4gznZ2/k7Qv6Pb1cWuLAQ4OkEY3y+OaHtLXAOaRsQe4oPqKMCrNoam31jBLa05SqNiZjasTprUJEnVzCXbvYI3H8GAXARAMDiQ1SSOVkrS5j2vaCW7tO43B2I/UQR+pBzREQEREBERAREQEREBEWK1P61rTTcj5ezYX8kY3c7Yb7AecoNBZEOsr1zHu5J8JUdJTyVK5j+eO690bHB
jXv8l0bQ883K1wL9m8wMcjDJFodBx8mi8I7tcpMZKkcxfmz/ALbu9ocRMB0DxzbFo6AjYdAFvkBERAREQFX3DgnVeodQa435qOREWOxDt9w+jAXkTjrttLLLM4Ee2jbCfc256ltS8QsrY0pjJnR4iu8Mz+Qhc5ruXYO9ZROHdI8Edo4Hdkbths+RrmTqvXiqQRwQRshhiaGMjjaGtY0DYAAdwA8yDIiIgIiICIiAo9fqeC5tZShEGUS+W7kadeo+eaw7kA54g078/kAloa4v67DmO5kKIMdexHbrxTwvEkUrQ9jx3OaRuCsi0OBgmxeZy2O7C++kXNvQ3bdgTRudM+TtII9zzNDCwO5T0AlaGnYcrd8gIiICIiAiIgIi0uY1tp7T9oVsnnMdj7JHN2Nm0xj9vd5Sd9lnTRVXNqYvK2u3SKLeNLR3wpxHpsf1qM8S7/DbivoTM6Sz+o8VNispB2MoZfja9pBDmPad/bNe1rhv03aNwR0W3s+NwTylc2dzY6F4gaXhlqaMOpN9TUnS0his7kInZicQlw7Z8fNzvD42CVr9vKjc157yp8vzi9RTwXo8FfVE6vv6jzeLkx+Hpmticp65YIrhmcPwkZ323EbXBw72l+x+P3p40tHfCnEemx/WnZ8bgnlJmzuSlFFvGlo74U4j02P608aWjvhTiPTY/rTs+NwTykzZ3JSobns7kNQZeTTmm5ewkiLRlczy8zcewjfsotxyvsub3NO4ia4SPB3jjm1GS4jVdZ51ml9LZypA+WPnt5eKeNzoWEe0rNduJZj7uxZGOrtzysdOsHg6Gm8XDjsbWbVpw8xbG0kkuc4ue9zjuXOc5znOc4lznOJJJJK1VUVUTauLJaz5gcDQ0xiK2MxlcVqVcEMZzFxJJLnOc5xLnvc4lznuJc5ziSSSStgiLBBERAREQEREBERBHrVH/iDjbjcZPJ/uu1E/JNsbRQ/ha5bC6L85z/KcHfmiJw/OUhVMZT1QHCqHibh3y690xzwYvIQvu+EtVsNdxmp7wyR9p1kfyktcerRDIPzlc6AiIgIiICIiDpZq47H4e9aYAXwQSStB91rSR/8ACiOkqkdbAUpAOaezEyeeZ3V80jmgue4nqSSf1d3cFJ9VfixmPmc30Co9pr8XMV80i+gF0MDRhT5rsbJERZoIiICIiDq5LG1stTkrWoxJE/49i0jqHNI6tcDsQ4dQQCOq7+g8pPmtF4O9af2tmenE+WTbbndyjd23m3PXb41iWHhZ+TnTnzGL6KxxdODPhMfaei7EpREXOQREQERRvXWs4NFYgWHRizcnf2VWrzcvav7ySfM1o3JPuDYbkgHZh4dWLXFFEXmRucnlqOEqOt5G5XoVW+2ntStjYPlc4gKMS8YdHQvLTnIXEdN445Hj9oaQqPydq1ncj7IZWw6/e68skg8mIb+1jb3Mb0HQdTsCST1WNfW4XsPDin5tc38P3cvC8fHNo336b6PL9hPHNo336b6PL9hUci3fA8m4qucdC8KC4kep00nqn1Y2O1JXuRnh7kpPZjKuEUgbHYYd3wcu3N+FfynoNgHu9xe7vHNo336b6PL9hUcifA8m4qucdC8Lx8c2jffpvo8v2F9Zxk0a923s3G343wyNH7S1UaifA8m4qucdC8PS2H1BjNQ13T4vIVchE08rnVpWyBp9w7HofiK2C8sQGSlejvUp5KN+P2lquQ17fiPQhw6DyXAg7dQVevDfXw1jSmr22sgy9MNE8bPaytPdKweZpIII72kEdRsTxcu9l1ZLT7yib0+sLr1JkiIuEjV6q/FjMfM5voFR7TX4uYr5pF9AKQ6q/FjMfM5voFR7TX4uYr5pF9ALo4Pcz5/hdjvWHSMgkdCxsswaSxjncoc7boCdjt18+xXnbhbx61RjOCuY1nrzFRWK9S9bgqzY+6JrN2f2Qkrx1hD2MbWbO5I2u5jzAcxDeq9Grz3DwC1dLoHUugp8jhYsA6/Nl8DloTK65DZN4XImzxFoZyteXNJa8kjboFJvsRIG+qEn0tazNTiHpg6QtUMLLn4vWuQbkI7NaJwbK1rwxm0rXOYOTbY842cQsFfjfnZ7FXEan0dNo6bUGLt2sJZjybbTnvih7V0UoaxphlDDzgAuHku8rcLW5ngRqji5kM3e4i3MNRdPp2xp+hU086WaOHt3NdJZe+VrCXbxx7MA2AB3J713cdwo11q/VWmsjr+/gmVNNU7UNRmBMz33LE8Brunl7RrRGBGX7MbzdXnyugU/yGj0lxxzGmuGHBbGRYt2q9UarwjJmz5XLCoyR8UETpOad7Xl8rzINm7Eu2cSRsvQmPmns0K01msadmSJr5a5eH9k8gEs5h0Ox3G46HZefrHBbXzuCGB4e2KOhdRV8fUkx0kmV9ctHZsa1lWxHyscWTNAcXAefbleFdmg9P29KaJwGFv5KTMXsdQgqT5CbfnsvZGGukO5J3cQT1JPXqSrTfaN6sPCz8nOnPmMX0VmWHhZ+TnTnzGL6KuL3M+cfaV2JSiIucgiIgKguLOSdkuIliBziYsbVjgjae5rpPwjyPlHZA/8gV+qguLONdjOIc87mkRZOrHPG89znx/g3gfIOyP98Lvexc3tWnXaben4uuyUWRdfI34sXRntziUwwsL3iGF8r9h7jGAucfiAJUVHFvT5/os5/wBO5D7hfb1YlFGiqYhrTJzg1pJIAHUk+ZUnS9VBh7uQqPZBjzhLdtlSKdmagde8p/I2R1MeWGFxB9sXBp3LQp2zijp++9tXsc0e3PZ7P0/fY079OrjAAB17ydlHuH2hNXaDix+n2v0/e0zQkc2K9M2UX3V9yWsLAOTmG4HPzdw9ruvJiV111U+5q0bbWndb8qxT8br9eHKZKTSxbp7F5mTD3L/sg3tGltgQiVkXJ5Td3NJBc0jcgcwG56/EzihmJsPrmjpfCTXIMLRniu5pt8VjVnMBftCNiXvja5rjsW7HoDus+R4TZe3w61hgGWaQuZjOzZOu9z39m2J9tkwDzybh3K0jYAjfz+dYNQ8NNYV/DnH6cs4WTCaqE00gybpmTVbEsAikLeRpD2u5Wnrtsfd8+iqcozbTfTHhfb+hY+i55bWjsFNNI+aaShA98kji5znGNpJJPeSfOtwoLj9b4rRuMoYO+3KSXcfWhrTOp4W9PEXNjaCWyMhLXD4wVn8bunj/AEWd/wCnch9wvbTi4cRETVF/NEzW20VknYfXuAsscWiac0pQPz2StIA/xiN391RvC5qtn8dHdqCw2B5IAtVpa8nQ7HdkjWuHd5x1Uk0TjXZnXuArMbzNgnN2Uj8xkbSQf8ZjH95TKJonArmrVafsyp1vSCIi/MFavVX4sZj5nN9AqPaa/FzFfNIvoBSnM03ZHEXqjCA+eCSIE+YuaR/9qIaSuR2MDThB5LNaFkFiB3R8MjWgOY4HqCD+0bEdCF0MDThTHiuxuERFmgiIgIiICw8LPyc6c+YxfRWPJ5StiKj7NqURxt6Ad7nuPQNa0dXOJIAaNySQB1K2GhMXPhNGYSjaZ2dmCnEyWPffkfyjdu/n2PT
f4lji6MGfGY+09V2N6iIucgiIgKOa50ZBrXDis+QVrcL+1q2uXmMT+7qOm7SNwRv3HoQQCJGi2YeJVhVxXRNpgeXcrUtafyHrDLVzj7nXla87slH9aN/c8d3d1G43DT0WNenMli6WZqPq36kF6s/20NmJsjD8rSCFGJeEGjpXFxwNdpPXaNz2D9gIC+twvbmHNPzaJv4fstCikV5eJvRvvHF+9k+0nib0b7xxfvZPtLd8cybhq5R1LQo1FeXib0b7xxfvZPtJ4m9G+8cX72T7SfHMm4auUdS0KNRXl4m9G+8cX72T7S+s4O6NY7f2Cgd8T3vcP2F2yfHMm4auUdS0b1F1hLkLzKNGCS/ff7WrXAc8/GeuzR1HlOIA36lXtw40ENG0Zp7T2T5e3ymeRntI2j2sTD3loJJ3PVxJOwGzWyLEYLG4CuYMZQrY+EncsrRNjDj7p2HU/GV31xMu9qVZXT7uiLU+srq1CIi4aC0uY0Vp/UNgWMpg8bkZwOUS2qkcjwPc3cCdlukWVNdVE3pm0mpFvFXoz4J4T+HxfZTxV6M+CeE/h8X2VKUW7tGNxzzlbzvRbxV6M+CeE/h8X2U8VejPgnhP4fF9lSlE7Rjcc85LzvRbxV6M+CeE/h8X2U8VejPgnhP4fF9lSlE7Rjcc85LzvaPFaG05grLbOOwGMoWG78s1apHG9u/fsQNxut4iLVVXVXN6pumsREWAIiICIiAiIgIiICIiAiIgIiICIiD/2Q==", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from IPython.display import Image, display\n", "\n", "try:\n", " display(Image(app.get_graph().draw_mermaid_png()))\n", "except Exception:\n", " # This requires some extra dependencies and is optional\n", " pass" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's try it out!" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "================================\u001b[1m Human Message \u001b[0m=================================\n", "\n", "what's the weather in sf?\n", "==================================\u001b[1m Ai Message \u001b[0m==================================\n", "\n", "[{'text': \"Okay, let's check the weather in San Francisco:\", 'type': 'text'}, {'id': 'toolu_01LdmBXYeccWKdPrhZSwFCDX', 'input': {'location': 'San Francisco'}, 'name': 'get_weather', 'type': 'tool_use'}]\n", "Tool Calls:\n", " get_weather (toolu_01LdmBXYeccWKdPrhZSwFCDX)\n", " Call ID: toolu_01LdmBXYeccWKdPrhZSwFCDX\n", " Args:\n", " location: San Francisco\n", "=================================\u001b[1m Tool Message \u001b[0m=================================\n", "Name: get_weather\n", "\n", "It's 60 degrees and foggy.\n", "==================================\u001b[1m Ai Message \u001b[0m==================================\n", "\n", "The weather in San Francisco is currently 60 degrees with foggy conditions.\n" ] } ], "source": [ "# example with a single tool call\n", "for chunk in app.stream(\n", " {\"messages\": [(\"human\", \"what's the weather in sf?\")]}, stream_mode=\"values\"\n", "):\n", " chunk[\"messages\"][-1].pretty_print()" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "================================\u001b[1m Human Message \u001b[0m=================================\n", "\n", "what's the weather in the coolest cities?\n", "==================================\u001b[1m Ai Message \u001b[0m==================================\n", "\n", "[{'text': \"Okay, let's find out the weather in the coolest cities:\", 'type': 'text'}, {'id': 'toolu_01LFZUWTccyveBdaSAisMi95', 'input': {}, 'name': 'get_coolest_cities', 'type': 'tool_use'}]\n", "Tool Calls:\n", " get_coolest_cities (toolu_01LFZUWTccyveBdaSAisMi95)\n", " Call ID: toolu_01LFZUWTccyveBdaSAisMi95\n", " Args:\n", "=================================\u001b[1m Tool Message \u001b[0m=================================\n", "Name: get_coolest_cities\n", "\n", "nyc, sf\n", "==================================\u001b[1m Ai Message \u001b[0m==================================\n", "\n", "[{'text': \"Now let's get the weather for those cities:\", 'type': 'text'}, {'id': 
'toolu_01RHPQBhT1u6eDnPqqkGUpsV', 'input': {'location': 'nyc'}, 'name': 'get_weather', 'type': 'tool_use'}]\n", "Tool Calls:\n", " get_weather (toolu_01RHPQBhT1u6eDnPqqkGUpsV)\n", " Call ID: toolu_01RHPQBhT1u6eDnPqqkGUpsV\n", " Args:\n", " location: nyc\n", "=================================\u001b[1m Tool Message \u001b[0m=================================\n", "Name: get_weather\n", "\n", "It's 90 degrees and sunny.\n", "==================================\u001b[1m Ai Message \u001b[0m==================================\n", "\n", "[{'id': 'toolu_01W5sFGF8PfgYzdY4CqT5c6e', 'input': {'location': 'sf'}, 'name': 'get_weather', 'type': 'tool_use'}]\n", "Tool Calls:\n", " get_weather (toolu_01W5sFGF8PfgYzdY4CqT5c6e)\n", " Call ID: toolu_01W5sFGF8PfgYzdY4CqT5c6e\n", " Args:\n", " location: sf\n", "=================================\u001b[1m Tool Message \u001b[0m=================================\n", "Name: get_weather\n", "\n", "It's 60 degrees and foggy.\n", "==================================\u001b[1m Ai Message \u001b[0m==================================\n", "\n", "Based on the results, it looks like the weather in the coolest cities is:\n", "- New York City: 90 degrees and sunny\n", "- San Francisco: 60 degrees and foggy\n", "\n", "So the weather in the coolest cities is a mix of warm and cool temperatures, with some sunny and some foggy conditions.\n" ] } ], "source": [ "# example with multiple tool calls in succession\n", "\n", "for chunk in app.stream(\n", "    {\"messages\": [(\"human\", \"what's the weather in the coolest cities?\")]},\n", "    stream_mode=\"values\",\n", "):\n", "    chunk[\"messages\"][-1].pretty_print()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`ToolNode` can also handle errors during tool execution. You can enable or disable this behavior by setting `handle_tool_errors` (enabled by default). See our guide on handling errors in `ToolNode` [here](https://langchain-ai.github.io/langgraph/how-tos/tool-calling-errors/)." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.9" } }, "nbformat": 4, "nbformat_minor": 4 }