{ "cells": [ { "cell_type": "markdown", "id": "b23ced4e-dc29-43be-9f94-0c36bb181b8a", "metadata": {}, "source": [ "# How to stream LLM tokens (without LangChain LLMs)" ] }, { "cell_type": "markdown", "id": "7044eeb8-4074-4f9c-8a62-962488744557", "metadata": {}, "source": [ "In this example we will stream tokens from the language model powering an agent. We'll be using OpenAI client library directly, without using LangChain chat models. We will also use a ReAct agent as an example." ] }, { "cell_type": "markdown", "id": "a37f60af-43ea-4aa6-847a-df8cc47065f5", "metadata": {}, "source": [ "## Setup\n", "\n", "First, let's install the required packages and set our API keys" ] }, { "cell_type": "code", "execution_count": 1, "id": "47f79af8-58d8-4a48-8d9a-88823d88701f", "metadata": {}, "outputs": [], "source": [ "%%capture --no-stderr\n", "%pip install -U langgraph openai" ] }, { "cell_type": "code", "execution_count": null, "id": "0cf6b41d-7fcb-40b6-9a72-229cdd00a094", "metadata": {}, "outputs": [], "source": [ "import getpass\n", "import os\n", "\n", "\n", "def _set_env(var: str):\n", " if not os.environ.get(var):\n", " os.environ[var] = getpass.getpass(f\"{var}: \")\n", "\n", "\n", "_set_env(\"OPENAI_API_KEY\")" ] }, { "cell_type": "markdown", "id": "1c5bc618", "metadata": {}, "source": [ "
<div class=\"admonition tip\">\n", "    <p class=\"admonition-title\">Set up <a href=\"https://smith.langchain.com\">LangSmith</a> for LangGraph development</p>\n", "    <p style=\"padding-top: 5px;\">\n", "        Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph — read more about how to get started <a href=\"https://docs.smith.langchain.com\">here</a>.\n", "    </p>\n", "</div>