Azure OpenAI
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
In [ ]:
%pip install llama-index-llms-azure-openai
In [ ]:
!pip install llama-index
Prerequisites
Environment Setup
Find your setup information - API base, API key, deployment name (i.e. engine), etc.
To find the necessary setup information, do the following:
- Go to the Azure OpenAI Studio here
- Go to the chat or completions playground (depending on which LLM you are setting up)
- Click "View code" (shown in the image below)
In [ ]:
from IPython.display import Image
Image(filename="./azure_playground.png")
Out[ ]:
- Note down the api_type, api_base, api_version, engine (this should be the same as the "deployment name" from before), and the key
In [ ]:
from IPython.display import Image
Image(filename="./azure_env.png")
Out[ ]:
Configure environment variables
Using Azure deployments of OpenAI models is very similar to using normal OpenAI. You just need to configure a couple more environment variables:
- OPENAI_API_VERSION: set this to 2023-07-01-preview. This may change in the future.
- AZURE_OPENAI_ENDPOINT: your endpoint should look like the following: https://YOUR_RESOURCE_NAME.openai.azure.com/
- OPENAI_API_KEY: your API key
In [ ]:
import os

os.environ["OPENAI_API_KEY"] = "<your-api-key>"
os.environ[
    "AZURE_OPENAI_ENDPOINT"
] = "https://<your-resource-name>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"
Use your LLM
In [ ]:
from llama_index.llms.azure_openai import AzureOpenAI
Unlike normal OpenAI, you need to pass an engine argument in addition to model. The engine is the name of your model deployment that you selected in Azure OpenAI Studio. See the previous section on "find your setup information" for more details.
In [ ]:
llm = AzureOpenAI(
    engine="simon-llm", model="gpt-35-turbo-16k", temperature=0.0
)
Alternatively, you can also skip setting environment variables and pass the parameters in directly via the constructor.
In [ ]:
llm = AzureOpenAI(
    engine="my-custom-llm",
    model="gpt-35-turbo-16k",
    temperature=0.0,
    azure_endpoint="https://<your-resource-name>.openai.azure.com/",
    api_key="<your-api-key>",
    api_version="2023-07-01-preview",
)
Use the complete endpoint for text completion
In [ ]:
response = llm.complete("The sky is a beautiful blue and")
print(response)
the sun is shining brightly. Fluffy white clouds float lazily across the sky, creating a picturesque scene. The vibrant blue color of the sky brings a sense of calm and tranquility. It is a perfect day to be outside, enjoying the warmth of the sun and the gentle breeze. The sky seems to stretch endlessly, reminding us of the vastness and beauty of the world around us. It is a reminder to appreciate the simple pleasures in life and to take a moment to admire the natural wonders that surround us.
In [ ]:
response = llm.stream_complete("The sky is a beautiful blue and")
for r in response:
    print(r.delta, end="")
the sun is shining brightly. Fluffy white clouds float lazily across the sky, creating a picturesque scene. The vibrant blue color of the sky brings a sense of calm and tranquility. It is a perfect day to be outside, enjoying the warmth of the sun and the gentle breeze. The sky seems to stretch endlessly, reminding us of the vastness and beauty of the world around us. It is a reminder to appreciate the simple pleasures in life and to take a moment to pause and admire the natural wonders that surround us.
Use the chat endpoint for conversation
In [ ]:
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with colorful personality."
    ),
    ChatMessage(role="user", content="Hello"),
]

response = llm.chat(messages)
print(response)
assistant: Ahoy there, matey! How be ye on this fine day? I be Captain Jolly Roger, the most colorful pirate ye ever did lay eyes on! What brings ye to me ship?
In [ ]:
response = llm.stream_chat(messages)
for r in response:
    print(r.delta, end="")
Ahoy there, matey! How be ye on this fine day? I be Captain Jolly Roger, the most colorful pirate ye ever did lay eyes on! What brings ye to me ship?
Rather than adding the same parameters to each chat or completion call, you can set them at a per-instance level with additional_kwargs.
In [ ]:
llm = AzureOpenAI(
    engine="simon-llm",
    model="gpt-35-turbo-16k",
    temperature=0.0,
    additional_kwargs={"user": "your_user_id"},
)
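With the user id set once at the instance level, every subsequent call through this llm carries it automatically. A minimal sketch of such a call (assuming the simon-llm deployment above exists and your endpoint, key, and API version are already configured):
In [ ]:
# Hypothetical follow-up call: the "user" value from additional_kwargs above
# is sent along with this request without being passed in again.
response = llm.complete("Say hello from Azure OpenAI.")
print(response)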