How to use BaseChatMessageHistory with LangGraph
This guide assumes familiarity with the following concepts.
We recommend that new LangChain applications use the built-in LangGraph persistence to implement memory.
In some situations, users may need to keep using an existing persistence solution for chat message history.
Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.
Setup
%%capture --no-stderr
%pip install --upgrade --quiet langchain-anthropic langgraph
import os
from getpass import getpass

if "ANTHROPIC_API_KEY" not in os.environ:
    os.environ["ANTHROPIC_API_KEY"] = getpass()
ChatMessageHistory
Message histories need to be parameterized by a conversation ID, or perhaps by a 2-tuple of (user ID, conversation ID).
Many LangChain chat message histories will have either a session_id or some namespace to allow keeping track of different conversations. Please refer to your specific implementation to check how it is parameterized.
The built-in InMemoryChatMessageHistory does not contain such a parameterization, so we will create a dict to keep track of the message histories.
import uuid
from langchain_core.chat_history import InMemoryChatMessageHistory
chats_by_session_id = {}
def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:
    chat_history = chats_by_session_id.get(session_id)
    if chat_history is None:
        chat_history = InMemoryChatMessageHistory()
        chats_by_session_id[session_id] = chat_history
    return chat_history
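The same pattern extends to histories keyed by a (user ID, conversation ID) 2-tuple. A minimal stdlib-only sketch, using a plain list as a stand-in for a chat message history object (the names `histories_by_key` and `get_history` are ours, not part of LangChain):

```python
# Sketch: parameterize histories by a (user_id, conversation_id) 2-tuple.
# A plain list stands in for a chat message history object here.
histories_by_key: dict[tuple[str, str], list] = {}

def get_history(user_id: str, conversation_id: str) -> list:
    key = (user_id, conversation_id)
    if key not in histories_by_key:
        histories_by_key[key] = []
    return histories_by_key[key]

# Two conversations for the same user are tracked independently.
get_history("bob", "conv-1").append("hi! I'm bob")
print(len(get_history("bob", "conv-1")))  # 1
print(len(get_history("bob", "conv-2")))  # 0
```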
Usage with LangGraph
Next, we will set up a basic chatbot using LangGraph. If you're not familiar with LangGraph, you should look at the following Quick Start tutorial.
We will create a LangGraph node for the chat model, and manually manage the conversation history, taking into account the conversation ID passed as part of the RunnableConfig.
The conversation ID can be passed as part of the RunnableConfig (as we do here), or as part of the graph state.
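The session-ID check performed inside the node below can also be isolated into a small helper; a stdlib-only sketch (the `require_session_id` name is ours, not part of LangGraph):

```python
def require_session_id(config: dict) -> str:
    # Enforce the expected config shape:
    # {"configurable": {"session_id": "some_value"}}
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: "
            "{'configurable': {'session_id': 'some_value'}}"
        )
    return config["configurable"]["session_id"]

print(require_session_id({"configurable": {"session_id": "abc123"}}))  # abc123
```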
import uuid
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_core.runnables import RunnableConfig
from langgraph.graph import START, MessagesState, StateGraph
# Define a new graph
builder = StateGraph(state_schema=MessagesState)
# Define a chat model
model = ChatAnthropic(model="claude-3-haiku-20240307")
# Define the function that calls the model
def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: "
            "{'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}
# Define the single node in the graph
builder.add_edge(START, "model")
builder.add_node("model", call_model)
graph = builder.compile()
# Here, we'll create a unique session ID to identify the conversation
session_id = str(uuid.uuid4())
config = {"configurable": {"session_id": session_id}}
input_message = HumanMessage(content="hi! I'm bob")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    event["messages"][-1].pretty_print()

# Here, let's confirm that the AI remembers our name!
input_message = HumanMessage(content="what was my name?")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    event["messages"][-1].pretty_print()
================================ Human Message =================================
hi! I'm bob
================================== Ai Message ==================================
Hello Bob! It's nice to meet you. I'm Claude, an AI assistant created by Anthropic. How are you doing today?
================================ Human Message =================================
what was my name?
================================== Ai Message ==================================
You introduced yourself as Bob when you said "hi! I'm bob".
If using langgraph >= 0.2.28, this also supports streaming LLM content token by token.
from langchain_core.messages import AIMessageChunk
first = True

for msg, metadata in graph.stream(
    {"messages": input_message}, config, stream_mode="messages"
):
    if msg.content and not isinstance(msg, HumanMessage):
        print(msg.content, end="|", flush=True)
You| sai|d your| name was Bob.|
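Each chunk yielded under stream_mode="messages" carries a fragment of the reply, so the full text is recovered by concatenating the fragments. A stdlib-only sketch with a hypothetical token stream standing in for the chunks above:

```python
# Hypothetical token stream standing in for the chunks yielded by
# graph.stream(..., stream_mode="messages").
tokens = ["You", " sai", "d your", " name was Bob."]

full_reply = ""
for token in tokens:
    full_reply += token
    print(token, end="|", flush=True)  # mirrors the per-token printing above

print()
print(full_reply)  # You said your name was Bob.
```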
Usage with RunnableWithMessageHistory
This how-to guide used the messages and add_messages interface of BaseChatMessageHistory directly.
Alternatively, you can use RunnableWithMessageHistory, as LCEL can be used inside any LangGraph node.
To do that, replace the following code:
def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: "
            "{'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}
with the corresponding instance of RunnableWithMessageHistory defined in your current application.
from langchain_core.runnables.history import RunnableWithMessageHistory

runnable = RunnableWithMessageHistory(...)  # From existing code

def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # RunnableWithMessageHistory takes care of reading the message history
    # and updating it with the new human message and ai response.
    ai_message = runnable.invoke(state["messages"], config)
    return {"messages": ai_message}
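To illustrate what this buys the node, here is a stdlib-only stand-in (the `FakeHistoryRunnable` class is ours, purely for illustration) that, like RunnableWithMessageHistory, reads the stored history for a session, produces a reply, and persists both the input and the reply:

```python
class FakeHistoryRunnable:
    """Stand-in mimicking read-then-persist behavior: the caller no longer
    touches the history store directly, as in the node above."""

    def __init__(self):
        self.store: dict[str, list[str]] = {}  # session_id -> messages

    def invoke(self, new_messages: list[str], config: dict) -> str:
        session_id = config["configurable"]["session_id"]
        history = self.store.setdefault(session_id, [])
        # A real runnable would call the model on history + new_messages.
        reply = f"echo: {new_messages[-1]}"
        history.extend(new_messages)
        history.append(reply)
        return reply

fake_runnable = FakeHistoryRunnable()
config = {"configurable": {"session_id": "s1"}}
print(fake_runnable.invoke(["hi! I'm bob"], config))  # echo: hi! I'm bob
print(len(fake_runnable.store["s1"]))  # 2 (input + reply persisted)
```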