
DeepInfra

DeepInfra is a serverless inference service that provides access to a variety of LLMs and embedding models. This notebook goes over how to use LangChain with DeepInfra for language models.

Set the Environment API Key

Make sure to get your API key from DeepInfra. You have to log in and get a new token.

You are given 1 hour of free serverless GPU compute to test different models (see here). You can print your token with deepctl auth token.

# get a new token: https://deepinfra.com/login?from=%2Fdash

from getpass import getpass

DEEPINFRA_API_TOKEN = getpass()
import os

os.environ["DEEPINFRA_API_TOKEN"] = DEEPINFRA_API_TOKEN

Create the DeepInfra instance

You can also use our open-source deepctl tool to manage your model deployments. You can view a list of available parameters here.

from langchain_community.llms import DeepInfra

llm = DeepInfra(model_id="meta-llama/Llama-2-70b-chat-hf")
llm.model_kwargs = {
    "temperature": 0.7,
    "repetition_penalty": 1.2,
    "max_new_tokens": 250,
    "top_p": 0.9,
}
API Reference: DeepInfra
# run inferences directly via wrapper
llm.invoke("Who let the dogs out?")
'This is a question that has puzzled many people'
# run streaming inference
for chunk in llm.stream("Who let the dogs out?"):
    print(chunk)
 Will
Smith
.

Create a Prompt Template

We will create a prompt template for Question and Answer.

from langchain_core.prompts import PromptTemplate

template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate.from_template(template)
API Reference: PromptTemplate

Initiate the LLMChain

from langchain.chains import LLMChain

llm_chain = LLMChain(prompt=prompt, llm=llm)
API Reference: LLMChain

Run the LLMChain

Provide a question and run the LLMChain.

question = "Can penguins reach the North pole?"

llm_chain.run(question)
"Penguins are found in Antarctica and the surrounding islands, which are located at the southernmost tip of the planet. The North Pole is located at the northernmost tip of the planet, and it would be a long journey for penguins to get there. In fact, penguins don't have the ability to fly or migrate over such long distances. So, no, penguins cannot reach the North Pole. "
