
ChatOutlines

This will help you get started with Outlines chat models. For detailed documentation of all ChatOutlines features and configurations, head to the API reference.

Outlines is a library for constrained language generation. It lets you use large language models (LLMs) with a variety of backends while applying constraints to the generated output.

Overview

Integration details

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| ChatOutlines | langchain-community | ✅ | ❌ | ❌ | PyPI - Downloads | PyPI - Version |

Model features

| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |

Setup

To access Outlines models you will need an internet connection to download the model weights from Hugging Face. Depending on the backend, you will also need to install the required dependencies (see the Outlines documentation).

Credentials

There is no built-in auth mechanism for Outlines.

Installation

The LangChain Outlines integration lives in the langchain-community package and requires the outlines library:

%pip install -qU langchain-community outlines

Instantiation

Now we can instantiate our model object and generate chat completions:

from langchain_community.chat_models.outlines import ChatOutlines

# For llamacpp backend
model = ChatOutlines(model="TheBloke/phi-2-GGUF/phi-2.Q4_K_M.gguf", backend="llamacpp")

# For vllm backend (not available on Mac)
model = ChatOutlines(model="meta-llama/Llama-3.2-1B", backend="vllm")

# For mlxlm backend (only available on Mac)
model = ChatOutlines(model="mistralai/Ministral-8B-Instruct-2410", backend="mlxlm")

# For huggingface transformers backend
model = ChatOutlines(model="microsoft/phi-2") # defaults to transformers backend
API Reference: ChatOutlines
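The backend comments above can be condensed into a small helper. This is only a sketch, not part of the ChatOutlines API: `pick_backend` is a hypothetical function, and the right choice also depends on which optional dependencies you have installed.

```python
import platform


def pick_backend() -> str:
    """Hypothetical helper: pick a plausible ChatOutlines backend.

    Per the notes above, vllm is not available on Mac, mlxlm is
    Mac-only, and the transformers backend works everywhere.
    """
    if platform.system() == "Darwin":
        return "mlxlm"  # Mac; llamacpp or transformers also work here
    return "vllm"  # elsewhere; transformers is a safe fallback


print(pick_backend())
```

You would then pass the result as `ChatOutlines(model=..., backend=pick_backend())`.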

Invocation

from langchain_core.messages import HumanMessage

messages = [HumanMessage(content="What will the capital of mars be called?")]
response = model.invoke(messages)

response.content
API Reference: HumanMessage

Streaming

ChatOutlines supports streaming of tokens:

messages = [HumanMessage(content="Count to 10 in French:")]

for chunk in model.stream(messages):
    print(chunk.content, end="", flush=True)

Chaining

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | model
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate

Constrained generation

ChatOutlines lets you apply various constraints to the generated output:

Regex constraint

model.regex = r"((25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(25[0-5]|2[0-4]\d|[01]?\d\d?)"

response = model.invoke("What is the IP address of Google's DNS server?")

response.content
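To see concretely what this constraint admits, you can exercise the same pattern with Python's re module, no model required. Every constrained completion is guaranteed to match the pattern in full:

```python
import re

# The same IPv4 pattern assigned to model.regex above.
IPV4 = r"((25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(25[0-5]|2[0-4]\d|[01]?\d\d?)"

assert re.fullmatch(IPV4, "8.8.8.8")
assert re.fullmatch(IPV4, "255.255.255.255")
assert re.fullmatch(IPV4, "999.1.1.1") is None  # octets above 255 are rejected
assert re.fullmatch(IPV4, "not an ip") is None
```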

Type constraints

model.type_constraints = int
response = model.invoke("What is the answer to life, the universe, and everything?")

response.content
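The contract here is that the raw completion must parse as the constrained Python type. The check below is only an after-the-fact illustration of that contract, not how Outlines actually enforces it (Outlines constrains the tokens during generation):

```python
def satisfies_int_constraint(text: str) -> bool:
    """Illustration: would this completion satisfy type_constraints = int?"""
    try:
        int(text.strip())
        return True
    except ValueError:
        return False


assert satisfies_int_constraint("42")
assert satisfies_int_constraint("-7")
assert not satisfies_int_constraint("forty-two")
```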

Pydantic 和 JSON Schema

from pydantic import BaseModel


class Person(BaseModel):
    name: str


model.json_schema = Person
response = model.invoke("Who are the main contributors to LangChain?")
person = Person.model_validate_json(response.content)

person

Context-free grammars

model.grammar = """
?start: expression
?expression: term (("+" | "-") term)*
?term: factor (("*" | "/") factor)*
?factor: NUMBER | "-" factor | "(" expression ")"
%import common.NUMBER
%import common.WS
%ignore WS
"""
response = model.invoke("Give me a complex arithmetic expression:")

response.content

LangChain's Structured Output

You can also use LangChain's Structured Output with ChatOutlines:

from pydantic import BaseModel


class AnswerWithJustification(BaseModel):
    answer: str
    justification: str


_model = model.with_structured_output(AnswerWithJustification)
result = _model.invoke("What weighs more, a pound of bricks or a pound of feathers?")

result

API reference

For detailed documentation of all ChatOutlines features and configurations, head to the API reference: https://langchain-python.dev.org.tw/api_reference/community/chat_models/langchain_community.chat_models.outlines.ChatOutlines.html

Full Outlines documentation:

https://dottxt-ai.github.io/outlines/latest/

