Snowflake Cortex
Snowflake Cortex gives you instant access to industry-leading large language models (LLMs) trained by researchers at companies such as Mistral, Reka, Meta, and Google, including Snowflake Arctic, an open enterprise-grade model developed by Snowflake.
This example shows how to use LangChain to interact with Snowflake Cortex.
Installation and setup
First, install the snowflake-snowpark-python library with the command below. Then configure your Snowflake connection credentials, either as environment variables or by passing them directly when instantiating the model.
%pip install --upgrade --quiet snowflake-snowpark-python
import getpass
import os
# First, set the environment variables used to connect to Snowflake.
# You can also pass these credentials directly when instantiating the model.
if os.environ.get("SNOWFLAKE_ACCOUNT") is None:
    os.environ["SNOWFLAKE_ACCOUNT"] = getpass.getpass("Account: ")
if os.environ.get("SNOWFLAKE_USERNAME") is None:
    os.environ["SNOWFLAKE_USERNAME"] = getpass.getpass("Username: ")
if os.environ.get("SNOWFLAKE_PASSWORD") is None:
    os.environ["SNOWFLAKE_PASSWORD"] = getpass.getpass("Password: ")
if os.environ.get("SNOWFLAKE_DATABASE") is None:
    os.environ["SNOWFLAKE_DATABASE"] = getpass.getpass("Database: ")
if os.environ.get("SNOWFLAKE_SCHEMA") is None:
    os.environ["SNOWFLAKE_SCHEMA"] = getpass.getpass("Schema: ")
if os.environ.get("SNOWFLAKE_WAREHOUSE") is None:
    os.environ["SNOWFLAKE_WAREHOUSE"] = getpass.getpass("Warehouse: ")
if os.environ.get("SNOWFLAKE_ROLE") is None:
    os.environ["SNOWFLAKE_ROLE"] = getpass.getpass("Role: ")
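The repeated if/getpass checks above can be collapsed into a small helper. This is only a sketch: the variable names mirror the cell above, and the helper prompt_missing_env is a hypothetical convenience, not part of LangChain or Snowpark.

```python
import getpass
import os

# All the Snowflake settings the model needs (same names as above)
SNOWFLAKE_VARS = [
    "SNOWFLAKE_ACCOUNT",
    "SNOWFLAKE_USERNAME",
    "SNOWFLAKE_PASSWORD",
    "SNOWFLAKE_DATABASE",
    "SNOWFLAKE_SCHEMA",
    "SNOWFLAKE_WAREHOUSE",
    "SNOWFLAKE_ROLE",
]


def prompt_missing_env(names):
    """Prompt (without echo) for any settings not already in the environment."""
    for name in names:
        if os.environ.get(name) is None:
            os.environ[name] = getpass.getpass(f"{name}: ")


# Call prompt_missing_env(SNOWFLAKE_VARS) before instantiating the model.
```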
from langchain_community.chat_models import ChatSnowflakeCortex
from langchain_core.messages import HumanMessage, SystemMessage
# By default, the Cortex-provided model `mistral-large` is used with the `complete` function
chat = ChatSnowflakeCortex()
The cell above assumes your Snowflake credentials are set in environment variables. If you would rather specify them manually, use the following code:
chat = ChatSnowflakeCortex(
    # Change the default cortex model and function
    model="mistral-large",
    cortex_function="complete",
    # Change the default generation parameters
    temperature=0,
    max_tokens=10,
    top_p=0.95,
    # Specify your Snowflake Credentials
    account="YOUR_SNOWFLAKE_ACCOUNT",
    username="YOUR_SNOWFLAKE_USERNAME",
    password="YOUR_SNOWFLAKE_PASSWORD",
    database="YOUR_SNOWFLAKE_DATABASE",
    schema="YOUR_SNOWFLAKE_SCHEMA",
    role="YOUR_SNOWFLAKE_ROLE",
    warehouse="YOUR_SNOWFLAKE_WAREHOUSE",
)
Calling the chat model
We can now call the chat model using the invoke or stream methods.
messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]
chat.invoke(messages)
Streaming
# Sample input prompt
messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]

# Invoke the public stream method and print each chunk as it arrives
print("Stream Method Response:")
for chunk in chat.stream(messages):
    print(chunk.content)
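Note that print adds a newline after every chunk, so the streamed reply appears one fragment per line. To render it as continuous text and keep the full reply, accumulate the chunk contents. The sketch below uses stand-in chunk objects, since real chunks require a live Snowflake session; with a real model the loop would iterate over chat.stream(messages) instead.

```python
# Stand-in chunks with a `content` attribute, mimicking the chunks
# yielded by chat.stream(...) (which needs live Snowflake credentials).
class FakeChunk:
    def __init__(self, content):
        self.content = content


chunks = [
    FakeChunk("Large language models are "),
    FakeChunk("neural networks trained on text."),
]

full_reply = ""
for chunk in chunks:
    print(chunk.content, end="", flush=True)  # no newline between fragments
    full_reply += chunk.content
print()  # final newline once the stream ends
```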