OCI Data Science Model Deployment Endpoint
OCI Data Science is a fully managed, serverless platform for data science teams to build, train, and manage machine learning models on Oracle Cloud Infrastructure.
For the latest updates, examples, and experimental features, see the ADS LangChain integration.
This notebook covers how to use LLMs hosted on OCI Data Science Model Deployment.
For authentication, the oracle-ads library is used to automatically load the credentials required to invoke the endpoint.
!pip3 install oracle-ads
Prerequisites
Deploy a model
You can easily deploy, fine-tune, and evaluate foundation models using AI Quick Actions on OCI Data Science Model Deployment. For additional deployment examples, visit the Oracle GitHub samples repository.
Policies
Make sure you have the required policies to access the OCI Data Science Model Deployment endpoint.
Setup
After deploying a model, you must set up the following parameter for the call:
endpoint: The model HTTP endpoint of the deployed model, e.g. https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict.
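The endpoint URI follows a fixed pattern, so it can be assembled from the region identifier and the model deployment OCID. A minimal sketch with hypothetical placeholder values (replace region and md_ocid with the values from your own deployment):

```python
# Hypothetical values for illustration; substitute your own region and
# model deployment OCID from the OCI console.
region = "us-ashburn-1"
md_ocid = "ocid1.datasciencemodeldeployment.oc1.iad.example"

# Assemble the predict endpoint from the fixed URI pattern.
endpoint = f"https://modeldeployment.{region}.oci.customer-oci.com/{md_ocid}/predict"
print(endpoint)
```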
Authentication
You can set up authentication through ads or environment variables. When you are working in an OCI Data Science notebook session, you can leverage resource principals to access other OCI resources. See here for more options.
Example
import ads
from langchain_community.llms import OCIModelDeploymentLLM
# Set authentication through ads
# Use resource principal when operating within an
# OCI service that has resource-principal-based
# authentication configured
ads.set_auth("resource_principal")
# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using the generic class as the entry point, you can
# pass model parameters through model_kwargs during
# instantiation.
llm = OCIModelDeploymentLLM(
endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
model="odsc-llm",
)
# Run the LLM
llm.invoke("Who is the first president of the United States?")
API Reference: OCIModelDeploymentLLM
import ads
from langchain_community.llms import OCIModelDeploymentVLLM
# Set authentication through ads
# Use resource principal when operating within an
# OCI service that has resource-principal-based
# authentication configured
ads.set_auth("resource_principal")
# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using a framework-specific class as the entry point,
# you can pass model parameters in the constructor.
llm = OCIModelDeploymentVLLM(
endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)
# Run the LLM
llm.invoke("Who is the first president of the United States?")
API Reference: OCIModelDeploymentVLLM
import os
from langchain_community.llms import OCIModelDeploymentTGI
# Set authentication through environment variables
# Use API key setup when you are working from a local
# workstation or on a platform that does not support
# resource principals.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"
# Set endpoint through environment variables
# Replace the endpoint uri with your own
os.environ["OCI_LLM_ENDPOINT"] = (
"https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict"
)
# Create an instance of OCI Model Deployment Endpoint
# Using a framework-specific class as the entry point,
# you can pass model parameters in the constructor.
llm = OCIModelDeploymentTGI()
# Run the LLM
llm.invoke("Who is the first president of the United States?")
API Reference: OCIModelDeploymentTGI
Asynchronous calls
await llm.ainvoke("Tell me a joke.")
Streaming calls
for chunk in llm.stream("Tell me a joke."):
print(chunk, end="", flush=True)
API reference
For comprehensive details on all features and configurations, refer to the API reference documentation for each class.