
OCI Data Science Model Deployment Endpoint

OCI Data Science is a fully managed and serverless platform for data science teams to build, train, and manage machine learning models in Oracle Cloud Infrastructure.

For the latest updates, examples, and experimental features, please see the ADS LangChain integration.

This notebook explains how to use LLMs hosted on an OCI Data Science Model Deployment.

For authentication, the oracle-ads library is used to automatically load the credentials required for invoking the endpoint.

!pip3 install oracle-ads

Prerequisites

Deploy model

You can easily deploy, fine-tune, and evaluate foundation models using AI Quick Actions on OCI Data Science Model Deployment. For additional deployment examples, please visit the Oracle GitHub samples repository.

Policies

Make sure to have the required policies to access the OCI Data Science Model Deployment endpoint.
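
For reference, an OCI IAM policy statement granting a user group access to model deployments generally follows the form sketched below. The group name, verb, and compartment here are illustrative assumptions, so confirm the exact statements against the linked policies documentation.

allow group <group_name> to use data-science-model-deployments in compartment <compartment_name>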

Set up

After deploying the model, you must set up the following required parameter for the call:

  • endpoint: The model HTTP endpoint of the deployed model, e.g. https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict.

Authentication

You can set authentication through either ads or environment variables. When you are working in an OCI Data Science Notebook Session, you can leverage resource principals to access other OCI resources. Check out here to see more options.
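
As a quick illustration, here is a minimal sketch of both approaches with ads; pick the one that matches where your code runs. The config location and profile name below are common defaults, not values from this notebook.

import ads

# Resource principal: use when running inside an OCI service
# (e.g. a Notebook Session) that has resource principal based
# authentication configured.
ads.set_auth("resource_principal")

# API key: use from a local workstation. The config location and
# profile name below are illustrative defaults (assumptions).
ads.set_auth(auth="api_key", oci_config_location="~/.oci/config", profile="DEFAULT")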

Examples

import ads
from langchain_community.llms import OCIModelDeploymentLLM

# Set authentication through ads
# Use resource principal when you are operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")

# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using generic class as entry point, you will be able
# to pass model parameters through model_kwargs during
# instantiation.
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
)

# Run the LLM
llm.invoke("Who is the first president of United States?")
API Reference: OCIModelDeploymentLLM
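
Since the generic class accepts model parameters through model_kwargs, sampling options can be passed at instantiation. A minimal sketch, assuming the deployed model supports temperature and max_tokens:

# Hypothetical parameter values for illustration only
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
    model_kwargs={"temperature": 0.2, "max_tokens": 512},
)
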
import ads
from langchain_community.llms import OCIModelDeploymentVLLM

# Set authentication through ads
# Use resource principal when you are operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")

# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using framework specific class as entry point, you will
# be able to pass model parameters in constructor.
llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)

# Run the LLM
llm.invoke("Who is the first president of United States?")
API Reference: OCIModelDeploymentVLLM
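
Because the framework-specific class exposes model parameters as constructor arguments, they can be set directly. A minimal sketch, assuming max_tokens and temperature are among the supported parameters (check the API reference for the full list):

llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    max_tokens=512,  # illustrative value
    temperature=0.2,  # illustrative value
)
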
import os

from langchain_community.llms import OCIModelDeploymentTGI

# Set authentication through environment variables
# Use API key setup when you are working from a local
# workstation or on a platform which does not support
# resource principals.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"

# Set endpoint through environment variables
# Replace the endpoint uri with your own
os.environ["OCI_LLM_ENDPOINT"] = (
"https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict"
)

# Create an instance of OCI Model Deployment Endpoint
# Using framework specific class as entry point, you will
# be able to pass model parameters in constructor.
llm = OCIModelDeploymentTGI()

# Run the LLM
llm.invoke("Who is the first president of United States?")
API Reference: OCIModelDeploymentTGI

Asynchronous calls

await llm.ainvoke("Tell me a joke.")
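
Streaming is also available asynchronously through the standard astream interface; a minimal sketch:

async for chunk in llm.astream("Tell me a joke."):
    print(chunk, end="", flush=True)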

Streaming calls

for chunk in llm.stream("Tell me a joke."):
    print(chunk, end="", flush=True)

API reference

For comprehensive details on all features and configurations, please refer to the API reference documentation for each class.

