How to Manage User Profiles¶
User profiles help your large language model (LLM) keep consistent, up-to-date information about a user across many conversations. Unlike semantic memory collections, which track an evolving body of knowledge, a profile maintains a single concise, structured representation of the user (or of the agent itself). A profile typically captures things like:
- Personal background (name, language, timezone)
- Communication preferences (formality, verbosity, level of expertise)
- Interaction highlights (last conversation, common topics, key relationships)
This guide shows how to automatically extract and maintain a user profile from conversations.
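As a point of reference, the categories above could be captured in a richer Pydantic schema along these lines. This is only an illustrative sketch (field names such as formality and last_conversation_summary are hypothetical, not part of langmem); the rest of this guide uses a smaller UserProfile for brevity:

from typing import Optional
from pydantic import BaseModel

class RichUserProfile(BaseModel):
    """Hypothetical profile schema covering the categories listed above."""
    # Personal background
    name: Optional[str] = None
    language: Optional[str] = None
    timezone: Optional[str] = None
    # Communication preferences
    formality: Optional[str] = None        # e.g. "casual" or "formal"
    verbosity: Optional[str] = None        # e.g. "brief" or "detailed"
    expertise_level: Optional[str] = None  # e.g. "beginner" or "expert"
    # Interaction highlights
    last_conversation_summary: Optional[str] = None
    common_topics: Optional[list[str]] = None
    key_relationships: Optional[list[str]] = None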
Basic Usage¶
from langmem import create_memory_manager
from pydantic import BaseModel
from typing import Optional

# Define profile structure
class UserProfile(BaseModel):
    """Represents the full representation of a user."""
    name: Optional[str] = None
    language: Optional[str] = None
    timezone: Optional[str] = None

# Configure extraction
manager = create_memory_manager(
    "anthropic:claude-3-5-sonnet-latest",
    schemas=[UserProfile],  # (optional) customize schema (1)
    instructions="Extract user profile information",
    enable_inserts=False,  # Profiles update in-place (2)
)

# First conversation
conversation1 = [{"role": "user", "content": "I'm Alice from California"}]
memories = manager.invoke({"messages": conversation1})
print(memories[0])
# ExtractedMemory(id='profile-1', content=UserProfile(
#     name='Alice',
#     language=None,
#     timezone='America/Los_Angeles'
# ))

# Second conversation updates existing profile
conversation2 = [{"role": "user", "content": "I speak Spanish too!"}]
update = manager.invoke({"messages": conversation2, "existing": memories})
print(update[0])
# ExtractedMemory(id='profile-1', content=UserProfile(
#     name='Alice',
#     language='Spanish',  # Updated
#     timezone='America/Los_Angeles'
# ))
- You can define your profile with a Pydantic model or a JSON schema. This keeps the stored data type-safe and tells the model what kind of information matters to your application. A JSON Schema version of the same profile is sketched after this list.
- Unlike semantic memory extraction, we set enable_inserts to False, which means the manager will only ever maintain a single instance of this memory and update it in place. For more on profiles, see Semantic Memory.
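For reference, here is roughly what the same profile could look like when defined as a JSON schema instead of a Pydantic model. This is a hand-written sketch of an equivalent schema (the exact dict is an assumption, not output generated by langmem); per the note above, schemas accepts either form:

# JSON Schema equivalent of the UserProfile Pydantic model above (hand-written sketch)
user_profile_schema = {
    "title": "UserProfile",
    "description": "Represents the full representation of a user.",
    "type": "object",
    "properties": {
        "name": {"type": ["string", "null"]},
        "language": {"type": ["string", "null"]},
        "timezone": {"type": ["string", "null"]},
    },
}

manager = create_memory_manager(
    "anthropic:claude-3-5-sonnet-latest",
    schemas=[user_profile_schema],  # dict schema instead of a Pydantic model
    instructions="Extract user profile information",
    enable_inserts=False,
)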
Using LangGraph's Long-term Memory Store¶
To maintain profiles across conversations, use create_memory_store_manager:

API: init_chat_model | entrypoint | create_memory_store_manager
from langchain.chat_models import init_chat_model
from langgraph.func import entrypoint
from langgraph.store.memory import InMemoryStore
from langgraph.config import get_config
from langmem import create_memory_store_manager

# Set up store and models (1)
store = InMemoryStore(
    index={
        "dims": 1536,
        "embed": "openai:text-embedding-3-small",
    }
)
my_llm = init_chat_model("anthropic:claude-3-5-sonnet-latest")

# Create profile manager (2)
manager = create_memory_store_manager(
    "anthropic:claude-3-5-sonnet-latest",
    namespace=("users", "{user_id}", "profile"),  # Isolate profiles by user
    schemas=[UserProfile],
    enable_inserts=False,  # Update existing profile only
)

@entrypoint(store=store)
def chat(messages: list):
    # Get user's profile for personalization
    configurable = get_config()["configurable"]
    results = store.search(
        ("users", configurable["user_id"], "profile")
    )
    # Default to an empty string (not None) so the system prompt stays clean
    # when no profile exists yet
    profile = ""
    if results:
        profile = f"""<User Profile>:
{results[0].value}
</User Profile>
"""

    # Use profile in system message
    response = my_llm.invoke([
        {
            "role": "system",
            "content": f"""You are a helpful assistant.{profile}"""
        },
        *messages
    ])

    # Update profile with any new information
    manager.invoke({"messages": messages})
    return response

# Example usage (run in an async context, e.g. a notebook)
await chat.ainvoke(
    [{"role": "user", "content": "I'm Alice from California"}],
    config={"configurable": {"user_id": "user-123"}}
)
await chat.ainvoke(
    [{"role": "user", "content": "I just passed the N1 exam!"}],
    config={"configurable": {"user_id": "user-123"}}
)
print(store.search(("users", "user-123", "profile")))
- For production use, replace InMemoryStore with AsyncPostgresStore (or another persistent store); a minimal sketch follows this list.
- The namespace pattern (here ("users", "{user_id}", "profile")) lets you organize profiles, for example keeping a separate profile per user. For more on store configuration, see the Storage System guide.
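A minimal sketch of the production setup mentioned above, assuming the langgraph-checkpoint-postgres package is installed and a Postgres instance is reachable at the (hypothetical) connection string below:

from langgraph.store.postgres import AsyncPostgresStore

# Hypothetical connection string; point this at your own database.
DB_URI = "postgresql://user:password@localhost:5432/langmem"

async def main():
    async with AsyncPostgresStore.from_conn_string(
        DB_URI,
        index={
            "dims": 1536,
            "embed": "openai:text-embedding-3-small",
        },
    ) as store:
        await store.setup()  # Create the required tables/indexes on first run
        # Use this store in place of InMemoryStore, e.g. @entrypoint(store=store)
        ...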