ChatGoodfire

This will help you get started with Goodfire chat models. For detailed documentation of all ChatGoodfire features and configurations, head to the PyPI project page, or go directly to the Goodfire SDK documentation. All Goodfire-specific functionality (e.g. SAE features, variants, etc.) is available via the main goodfire package. This integration is a wrapper around the Goodfire SDK.

Overview

Integration details

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| ChatGoodfire | langchain-goodfire | | | | PyPI - Downloads | PyPI - Version |

Model features

| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |

Setup

To access Goodfire models you'll need to create a Goodfire account, get an API key, and install the langchain-goodfire integration package.

Credentials

Head to Goodfire Settings to sign up for Goodfire and generate an API key. Once you've done this, set the GOODFIRE_API_KEY environment variable.

import getpass
import os

if not os.getenv("GOODFIRE_API_KEY"):
    os.environ["GOODFIRE_API_KEY"] = getpass.getpass("Enter your Goodfire API key: ")
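The snippet above can be wrapped in a small reusable helper that falls back to an interactive prompt only when the variable is unset. This is a minimal sketch using only the standard library; the `ensure_api_key` name is illustrative, not part of any Goodfire or LangChain API:

```python
import getpass
import os


def ensure_api_key(var: str = "GOODFIRE_API_KEY") -> str:
    """Return the API key from the environment, prompting only if it is unset."""
    value = os.environ.get(var)
    if not value:
        value = getpass.getpass(f"Enter your {var}: ")
        os.environ[var] = value  # cache for the rest of the session
    return value
```

Calling `ensure_api_key()` once at the top of a script or notebook keeps the rest of the code free of credential handling.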

If you want automatic tracing of your model calls, you can also uncomment the lines below to set your LangSmith API key:

# os.environ["LANGSMITH_TRACING"] = "true"
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain Goodfire integration lives in the langchain-goodfire package:

%pip install -qU langchain-goodfire
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:

import goodfire
from langchain_goodfire import ChatGoodfire

base_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

llm = ChatGoodfire(
    model=base_variant,
    temperature=0,
    max_completion_tokens=1000,
    seed=42,
)
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = await llm.ainvoke(messages)
ai_msg
AIMessage(content="J'adore la programmation.", additional_kwargs={}, response_metadata={}, id='run-8d43cf35-bce8-4827-8935-c64f8fb78cd0-0', usage_metadata={'input_tokens': 51, 'output_tokens': 39, 'total_tokens': 90})
print(ai_msg.content)
J'adore la programmation.

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
await chain.ainvoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate
AIMessage(content='Ich liebe das Programmieren. How can I help you with programming today?', additional_kwargs={}, response_metadata={}, id='run-03d1a585-8234-46f1-a8df-bf9143fe3309-0', usage_metadata={'input_tokens': 46, 'output_tokens': 46, 'total_tokens': 92})

Goodfire-specific features

To use Goodfire-specific features such as SAE features and variants, you can use the goodfire package directly.

client = goodfire.Client(api_key=os.environ["GOODFIRE_API_KEY"])

pirate_features = client.features.search(
    "assistant should roleplay as a pirate", base_variant
)
pirate_features
pirate_features
FeatureGroup([
   0: "The assistant should adopt the persona of a pirate",
   1: "The assistant should roleplay as a pirate",
   2: "The assistant should engage with pirate-themed content or roleplay as a pirate",
   3: "The assistant should roleplay as a character",
   4: "The assistant should roleplay as a specific character",
   5: "The assistant should roleplay as a game character or NPC",
   6: "The assistant should roleplay as a human character",
   7: "Requests for the assistant to roleplay or pretend to be something else",
   8: "Requests for the assistant to roleplay or pretend to be something",
   9: "The assistant is being assigned a role or persona to roleplay"
])
pirate_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

pirate_variant.set(pirate_features[0], 0.4)
pirate_variant.set(pirate_features[1], 0.3)

await llm.ainvoke("Tell me a joke", model=pirate_variant)
AIMessage(content='Why did the scarecrow win an award? Because he was outstanding in his field! Arrr! Hope that made ye laugh, matey!', additional_kwargs={}, response_metadata={}, id='run-7d8bd30f-7f80-41cb-bdb6-25c29c22a7ce-0', usage_metadata={'input_tokens': 35, 'output_tokens': 60, 'total_tokens': 95})

API reference

For detailed documentation of all ChatGoodfire features and configurations, head to the API reference.
