
How to Manage User Profiles

User profiles help your LLM maintain consistent, up-to-date information about the user across conversations. Unlike semantic memory collections, which track an evolving body of knowledge, a profile focuses on maintaining a concise, structured representation of the user (or of the agent itself):

  • Personal information (name, language, timezone)
  • Communication preferences (formality, verbosity, expertise)
  • Interaction highlights (last conversation, frequent topics, key relationships)

This guide shows how to automatically extract and maintain user profiles from conversations.
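Conceptually, a profile is a single record that is updated in place rather than appended to: new facts overwrite old fields, while fields the latest conversation did not mention are left untouched. A minimal sketch of that merge behavior with plain dicts (the helper below is illustrative, not part of the langmem API):

```python
def merge_profile(existing: dict, updates: dict) -> dict:
    """Merge newly extracted fields into the existing profile.

    Non-None values in `updates` overwrite the stored ones; fields
    the latest conversation did not mention are preserved.
    """
    merged = dict(existing)
    for key, value in updates.items():
        if value is not None:
            merged[key] = value
    return merged


profile = {"name": "Alice", "language": None, "timezone": "America/Los_Angeles"}
# A later conversation only reveals the user's language:
profile = merge_profile(profile, {"language": "Spanish"})
print(profile)
# {'name': 'Alice', 'language': 'Spanish', 'timezone': 'America/Los_Angeles'}
```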

Basic Usage

API: create_memory_manager

from langmem import create_memory_manager
from pydantic import BaseModel
from typing import Optional


# Define profile structure
class UserProfile(BaseModel):
    """Represents the full representation of a user."""
    name: Optional[str] = None
    language: Optional[str] = None
    timezone: Optional[str] = None


# Configure extraction
manager = create_memory_manager(
    "anthropic:claude-3-5-sonnet-latest",
    schemas=[UserProfile], # (optional) customize schema (1)
    instructions="Extract user profile information",
    enable_inserts=False,  # Profiles update in-place (2)
)

# First conversation
conversation1 = [{"role": "user", "content": "I'm Alice from California"}]
memories = manager.invoke({"messages": conversation1})
print(memories[0])
# ExtractedMemory(id='profile-1', content=UserProfile(
#    name='Alice',
#    language=None,
#    timezone='America/Los_Angeles'
# ))

# Second conversation updates existing profile
conversation2 = [{"role": "user", "content": "I speak Spanish too!"}]
update = manager.invoke({"messages": conversation2, "existing": memories})
print(update[0])
# ExtractedMemory(id='profile-1', content=UserProfile(
#    name='Alice',
#    language='Spanish',  # Updated
#    timezone='America/Los_Angeles'
# ))
  1. You can define your profile with a Pydantic model or a JSON schema. This ensures type safety for the stored data and guides the model toward the kinds of data that matter for your application.

  2. Unlike semantic memory extraction, we set enable_inserts=False, meaning the manager will always maintain exactly one instance of the memory.

    For more information on profiles, see Semantic Memory
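As note 1 above mentions, a JSON schema can stand in for the Pydantic model. For illustration, a JSON-schema equivalent of the UserProfile model might look like the dict below (whether plain schema dicts are accepted in place of models may depend on your library version; check the API reference):

```python
# A hand-written JSON-schema counterpart of the UserProfile Pydantic model:
# three optional (nullable) string fields.
user_profile_schema = {
    "title": "UserProfile",
    "description": "Represents the full representation of a user.",
    "type": "object",
    "properties": {
        "name": {"type": ["string", "null"]},
        "language": {"type": ["string", "null"]},
        "timezone": {"type": ["string", "null"]},
    },
}
```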

Using LangGraph's Long-term Memory Store

To maintain profiles across conversations, use create_memory_store_manager

API: init_chat_model | entrypoint | create_memory_store_manager

from langchain.chat_models import init_chat_model
from langgraph.func import entrypoint
from langgraph.store.memory import InMemoryStore
from langgraph.config import get_config
from langmem import create_memory_store_manager

# Set up store and models (1)
store = InMemoryStore(
    index={
        "dims": 1536,
        "embed": "openai:text-embedding-3-small",
    }
)
my_llm = init_chat_model("anthropic:claude-3-5-sonnet-latest")

# Create profile manager (2)
manager = create_memory_store_manager(
    "anthropic:claude-3-5-sonnet-latest",
    namespace=("users", "{user_id}", "profile"),  # Isolate profiles by user
    schemas=[UserProfile],
    enable_inserts=False,  # Update existing profile only
)

@entrypoint(store=store)
def chat(messages: list):
    # Get user's profile for personalization
    configurable = get_config()["configurable"]
    results = store.search(
        ("users", configurable["user_id"], "profile")
    )
    profile = ""  # Empty string so the prompt stays clean when no profile exists yet
    if results:
        profile = f"""<User Profile>:

{results[0].value}
</User Profile>
"""

    # Use profile in system message
    response = my_llm.invoke([
        {
            "role": "system",
            "content": f"""You are a helpful assistant.{profile}"""
        },
        *messages
    ])

    # Update profile with any new information
    manager.invoke({"messages": messages})
    return response

# Example usage
await chat.ainvoke(
    [{"role": "user", "content": "I'm Alice from California"}],
    config={"configurable": {"user_id": "user-123"}}
)

await chat.ainvoke(
    [{"role": "user", "content": "I just passed the N1 exam!"}],
    config={"configurable": {"user_id": "user-123"}}
)

print(store.search(("users", "user-123", "profile")))
  1. Use AsyncPostgresStore instead of InMemoryStore in production

  2. Namespace patterns let you organize profiles, for example:

    # Individual users
    ("users", "user-123", "profile")
    
    # Teams/departments
    ("users", "team-sales", "profile")
    
    # Roles
    ("users", "admin-1", "profile")
    

For more on store configuration, see Storage System
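The "{user_id}" placeholder in the namespace tuple is filled in from the run's configurable values, which is what keeps each user's profile isolated. A quick sketch of how such a template resolves (resolve_namespace below is an illustrative helper, not a langmem function):

```python
def resolve_namespace(template: tuple, configurable: dict) -> tuple:
    """Substitute "{key}" placeholders in each part of a namespace template."""
    return tuple(part.format(**configurable) for part in template)


# Mirrors namespace=("users", "{user_id}", "profile") combined with
# config={"configurable": {"user_id": "user-123"}} from the example above.
ns = resolve_namespace(
    ("users", "{user_id}", "profile"),
    {"user_id": "user-123"},
)
print(ns)
# ('users', 'user-123', 'profile')
```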
