IBM watsonx.ai
WatsonxLLM is a wrapper for IBM watsonx.ai foundation models.
This example shows how to communicate with watsonx.ai models using LangChain.
Overview
Integration details
Class | Package | Local | Serializable | JS support
---|---|---|---|---
WatsonxLLM | langchain-ibm | ❌ | ❌ | ✅
Setup
To access IBM watsonx.ai models, you need to create an IBM watsonx.ai account, get an API key, and install the langchain-ibm integration package.
Credentials
The cell below defines the credentials required to work with watsonx foundation model inferencing.
Action: Provide the IBM Cloud user API key. For details, see Managing user API keys.
import os
from getpass import getpass
watsonx_api_key = getpass()
os.environ["WATSONX_APIKEY"] = watsonx_api_key
Additionally, you can pass other secrets as environment variables.
import os
os.environ["WATSONX_URL"] = "your service instance url"
os.environ["WATSONX_TOKEN"] = "your token for accessing the CPD cluster"
os.environ["WATSONX_PASSWORD"] = "your password for accessing the CPD cluster"
os.environ["WATSONX_USERNAME"] = "your username for accessing the CPD cluster"
os.environ["WATSONX_INSTANCE_ID"] = "your instance_id for accessing the CPD cluster"
Installation
The LangChain IBM integration lives in the langchain-ibm package.
!pip install -qU langchain-ibm
Instantiation
You might need to adjust model parameters for different models or tasks. For details, refer to the documentation.
parameters = {
"decoding_method": "sample",
"max_new_tokens": 100,
"min_new_tokens": 1,
"temperature": 0.5,
"top_k": 50,
"top_p": 1,
}
Initialize the WatsonxLLM class with the previously set parameters.
Note:
In this example, we will use the project_id and the Dallas URL.
You need to specify the model_id that will be used for inferencing. You can find all the available models in the documentation.
from langchain_ibm import WatsonxLLM
watsonx_llm = WatsonxLLM(
model_id="ibm/granite-13b-instruct-v2",
url="https://us-south.ml.cloud.ibm.com",
project_id="PASTE YOUR PROJECT_ID HERE",
params=parameters,
)
Alternatively, you can use Cloud Pak for Data credentials. For details, see the documentation.
watsonx_llm = WatsonxLLM(
model_id="ibm/granite-13b-instruct-v2",
url="PASTE YOUR URL HERE",
username="PASTE YOUR USERNAME HERE",
password="PASTE YOUR PASSWORD HERE",
instance_id="openshift",
version="4.8",
project_id="PASTE YOUR PROJECT_ID HERE",
params=parameters,
)
Instead of model_id, you can also pass the deployment_id of a previously tuned model. The entire model tuning workflow is described here.
watsonx_llm = WatsonxLLM(
deployment_id="PASTE YOUR DEPLOYMENT_ID HERE",
url="https://us-south.ml.cloud.ibm.com",
project_id="PASTE YOUR PROJECT_ID HERE",
params=parameters,
)
For certain requirements, there is also the option to pass IBM's APIClient object into the WatsonxLLM class.
from ibm_watsonx_ai import APIClient
api_client = APIClient(...)
watsonx_llm = WatsonxLLM(
model_id="ibm/granite-13b-instruct-v2",
watsonx_client=api_client,
)
You can also pass IBM's ModelInference object into the WatsonxLLM class.
from ibm_watsonx_ai.foundation_models import ModelInference
model = ModelInference(...)
watsonx_llm = WatsonxLLM(watsonx_model=model)
Invocation
To obtain completions, you can call the model directly using a string prompt.
# Calling a single prompt
watsonx_llm.invoke("Who is man's best friend?")
"Man's best friend is his dog. Dogs are man's best friend because they are always there for you, they never judge you, and they love you unconditionally. Dogs are also great companions and can help reduce stress levels. "
# Calling multiple prompts
watsonx_llm.generate(
[
"The fastest dog in the world?",
"Describe your chosen dog breed",
]
)
LLMResult(generations=[[Generation(text='The fastest dog in the world is the greyhound. Greyhounds can run up to 45 mph, which is about the same speed as a Usain Bolt.', generation_info={'finish_reason': 'eos_token'})], [Generation(text='The Labrador Retriever is a breed of retriever that was bred for hunting. They are a very smart breed and are very easy to train. They are also very loyal and will make great companions. ', generation_info={'finish_reason': 'eos_token'})]], llm_output={'token_usage': {'generated_token_count': 82, 'input_token_count': 13}, 'model_id': 'ibm/granite-13b-instruct-v2', 'deployment_id': None}, run=[RunInfo(run_id=UUID('750b8a0f-8846-456d-93d0-e039e95b1276')), RunInfo(run_id=UUID('aa4c2a1c-5b08-4fcf-87aa-50228de46db5'))], type='LLMResult')
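The LLMResult above nests one list of Generation objects per input prompt. A small helper can pull the raw strings back out; texts_from_generations is a hypothetical utility, not part of LangChain, and it only assumes each element exposes a .text attribute, as Generation does:

```python
def texts_from_generations(generations):
    """Flatten an LLMResult's generations into one list of strings per prompt."""
    # generations is a list (one entry per prompt) of lists of Generation-like objects
    return [[gen.text for gen in prompt_gens] for prompt_gens in generations]
```

Applied to the result above as texts_from_generations(result.generations), this would return the two answer strings grouped by prompt.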
Streaming the model output
You can stream the model output.
for chunk in watsonx_llm.stream(
"Describe your favorite breed of dog and why it is your favorite."
):
print(chunk, end="")
My favorite breed of dog is a Labrador Retriever. They are my favorite breed because they are my favorite color, yellow. They are also very smart and easy to train.
Chaining
Create a PromptTemplate object which will be responsible for creating a random question.
from langchain_core.prompts import PromptTemplate
template = "Generate a random question about {topic}: Question: "
prompt = PromptTemplate.from_template(template)
Provide a topic and run the chain.
llm_chain = prompt | watsonx_llm
topic = "dog"
llm_chain.invoke(topic)
'What is the origin of the name "Pomeranian"?'
API reference
For detailed documentation of all WatsonxLLM features and configurations, head to the API reference.