ChatAI21
Overview
This notebook covers how to get started with AI21 chat models. Note that different chat models support different parameters. See the AI21 documentation to learn more about the parameters of your chosen model, and see all of AI21's LangChain components.
Integration details
| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| --- | --- | --- | --- | --- | --- | --- |
| ChatAI21 | langchain-ai21 | ❌ | beta | ✅ |  |  |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ |
Setup
Credentials
We need to get an AI21 API key and set the AI21_API_KEY environment variable:
import os
from getpass import getpass
if "AI21_API_KEY" not in os.environ:
os.environ["AI21_API_KEY"] = getpass()
If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
# os.environ["LANGCHAIN_TRACING_V2"] = "true"
# os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
Installation
!pip install -qU langchain-ai21
Instantiation
Now we can instantiate our model object and generate chat completions:
from langchain_ai21 import ChatAI21
llm = ChatAI21(model="jamba-instruct", temperature=0)
API reference: ChatAI21
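The overview notes that supported parameters vary by model. As a rough sketch of passing extra generation parameters at construction time (the max_tokens and top_p names here are assumptions drawn from common ChatAI21 settings; confirm them against the API reference for your chosen model):

from langchain_ai21 import ChatAI21

# Hypothetical configuration -- parameter names are assumptions; check the
# ChatAI21 API reference for what your chosen model actually accepts.
llm = ChatAI21(
    model="jamba-instruct",
    temperature=0,
    max_tokens=256,  # assumed: cap on generated tokens
    top_p=0.9,       # assumed: nucleus sampling cutoff
)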
Invocation
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg
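The features table above marks token-level streaming and native async as supported. A minimal sketch using the standard LangChain Runnable methods (.stream() and .ainvoke(); nothing AI21-specific is assumed beyond what the table states):

# Stream tokens as they are generated.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)

# Native async: in a notebook you can await .ainvoke() directly.
ai_msg = await llm.ainvoke(messages)
ai_msg.content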
Chaining
We can chain our model with a prompt template like this:
from langchain_core.prompts import ChatPromptTemplate
prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API reference: ChatPromptTemplate
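The features table also marks structured output as supported. A minimal sketch assuming the generic with_structured_output() helper exposed by LangChain chat models (the Translation schema is illustrative; see the ChatAI21 API reference for the exact level of support):

from pydantic import BaseModel, Field

class Translation(BaseModel):
    """A translation produced by the model."""

    text: str = Field(description="The translated sentence")
    language: str = Field(description="The target language")

# Assumes the generic with_structured_output() interface; behaviour can vary
# by model, so consult the ChatAI21 API reference.
structured_llm = llm.with_structured_output(Translation)
structured_llm.invoke("Translate 'I love programming.' into German.")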
Tool calls / function calling
This example shows how to use tool calling with AI21 models:
import os
from getpass import getpass

from langchain_ai21.chat_models import ChatAI21
from langchain_core.messages import HumanMessage, SystemMessage, ToolMessage
from langchain_core.tools import tool
from langchain_core.utils.function_calling import convert_to_openai_tool

if "AI21_API_KEY" not in os.environ:
    os.environ["AI21_API_KEY"] = getpass()


@tool
def get_weather(location: str, date: str) -> str:
    """Provide the weather for the specified location on the given date."""
    if location == "New York" and date == "2024-12-05":
        return "25 celsius"
    elif location == "New York" and date == "2024-12-06":
        return "27 celsius"
    elif location == "London" and date == "2024-12-05":
        return "22 celsius"
    return "32 celsius"


llm = ChatAI21(model="jamba-1.5-mini")

llm_with_tools = llm.bind_tools([convert_to_openai_tool(get_weather)])

chat_messages = [
    SystemMessage(
        content="You are a helpful assistant. You can use the provided tools "
        "to assist with various tasks and provide accurate information"
    )
]

human_messages = [
    HumanMessage(
        content="What is the forecast for the weather in New York on December 5, 2024?"
    ),
    HumanMessage(content="And what about the 2024-12-06?"),
    HumanMessage(content="OK, thank you."),
    HumanMessage(content="What is the expected weather in London on December 5, 2024?"),
]

for human_message in human_messages:
    print(f"User: {human_message.content}")
    chat_messages.append(human_message)
    response = llm_with_tools.invoke(chat_messages)
    chat_messages.append(response)
    if response.tool_calls:
        tool_call = response.tool_calls[0]
        if tool_call["name"] == "get_weather":
            weather = get_weather.invoke(
                {
                    "location": tool_call["args"]["location"],
                    "date": tool_call["args"]["date"],
                }
            )
            chat_messages.append(
                ToolMessage(content=weather, tool_call_id=tool_call["id"])
            )
            llm_answer = llm_with_tools.invoke(chat_messages)
            print(f"Assistant: {llm_answer.content}")
    else:
        print(f"Assistant: {response.content}")
API reference
For detailed documentation of all ChatAI21 features and configurations, head to the API reference: https://langchain-python.dev.org.tw/api_reference/ai21/chat_models/langchain_ai21.chat_models.ChatAI21.html