Azure AI Studio
LiteLLM supports all models on Azure AI Studio.
Usage
Environment Variables
```python
import os

os.environ["AZURE_AI_API_KEY"] = ""
os.environ["AZURE_AI_API_BASE"] = ""
```
Example Call
```python
from litellm import completion
import os

## set ENV variables
os.environ["AZURE_AI_API_KEY"] = "azure ai key"
os.environ["AZURE_AI_API_BASE"] = "azure ai base url"  # e.g.: https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/

# Cohere command-r-plus call
response = completion(
    model="azure_ai/command-r-plus",
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)
```
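`completion()` returns a response in the OpenAI format, so the generated text and token usage can be read directly. A minimal sketch, assuming the call above succeeded:

```python
# Read the assistant's reply and token usage from the OpenAI-format response
print(response.choices[0].message.content)
print(response.usage)
```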
Add models to your config.yaml
```yaml
model_list:
  - model_name: command-r-plus
    litellm_params:
      model: azure_ai/command-r-plus
      api_key: os.environ/AZURE_AI_API_KEY
      api_base: os.environ/AZURE_AI_API_BASE
```
Start the proxy

```bash
$ litellm --config /path/to/config.yaml --debug
```

Send a request to the LiteLLM Proxy Server
```python
import openai

client = openai.OpenAI(
    api_key="sk-1234",              # pass litellm proxy key, if you're using virtual keys
    base_url="http://0.0.0.0:4000"  # litellm-proxy base url
)

response = client.chat.completions.create(
    model="command-r-plus",
    messages=[
        {
            "role": "system",
            "content": "Be a good human!"
        },
        {
            "role": "user",
            "content": "What do you know about earth?"
        }
    ]
)

print(response)
```

```bash
curl --location 'http://0.0.0.0:4000/chat/completions' \
    --header 'Authorization: Bearer sk-1234' \
    --header 'Content-Type: application/json' \
    --data '{
    "model": "command-r-plus",
    "messages": [
        {
            "role": "system",
            "content": "Be a good human!"
        },
        {
            "role": "user",
            "content": "What do you know about earth?"
        }
    ]
}'
```
Passing additional params - max_tokens, temperature
See all supported litellm.completion params here.
```python
# !pip install litellm
from litellm import completion
import os

## set ENV variables
os.environ["AZURE_AI_API_KEY"] = "azure ai api key"
os.environ["AZURE_AI_API_BASE"] = "azure ai api base"

# command r plus call
response = completion(
    model="azure_ai/command-r-plus",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    max_tokens=20,
    temperature=0.5
)
```
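To check that the cap was applied, you can inspect the usage block on the response. A minimal sketch (exact counts depend on the model):

```python
# completion_tokens should not exceed the max_tokens=20 cap set above
print(response.usage.completion_tokens)
print(response.choices[0].message.content)
```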
Proxy
```yaml
model_list:
  - model_name: command-r-plus
    litellm_params:
      model: azure_ai/command-r-plus
      api_key: os.environ/AZURE_AI_API_KEY
      api_base: os.environ/AZURE_AI_API_BASE
      max_tokens: 20
      temperature: 0.5
```
Start the proxy

```bash
$ litellm --config /path/to/config.yaml
```

Send a request to the LiteLLM Proxy Server
```python
import openai

client = openai.OpenAI(
    api_key="sk-1234",              # pass litellm proxy key, if you're using virtual keys
    base_url="http://0.0.0.0:4000"  # litellm-proxy base url
)

response = client.chat.completions.create(
    model="command-r-plus",
    messages=[
        {
            "role": "user",
            "content": "What kind of LLM are you?"
        }
    ],
)

print(response)
```

```bash
curl --location 'http://0.0.0.0:4000/chat/completions' \
    --header 'Authorization: Bearer sk-1234' \
    --header 'Content-Type: application/json' \
    --data '{
    "model": "command-r-plus",
    "messages": [
        {
            "role": "user",
            "content": "What kind of LLM are you?"
        }
    ]
}'
```
Function Calling
```python
from litellm import completion
import os

# set env vars
os.environ["AZURE_AI_API_KEY"] = "your-api-key"
os.environ["AZURE_AI_API_BASE"] = "your-api-base"

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]
messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]

response = completion(
    model="azure_ai/mistral-large-latest",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)

# Add any assertions here to check the response args
print(response)
assert isinstance(response.choices[0].message.tool_calls[0].function.name, str)
assert isinstance(
    response.choices[0].message.tool_calls[0].function.arguments, str
)
```
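To complete the round trip, you would execute the tool and send its result back to the model. A minimal sketch, assuming a hypothetical local `get_current_weather` stub (not part of the example above):

```python
import json

def get_current_weather(location, unit="fahrenheit"):
    # Hypothetical stub - replace with a real weather lookup
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

# Append the assistant's tool call and the tool result, then ask the model again
messages.append(response.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": get_current_weather(**args),
})

followup = completion(model="azure_ai/mistral-large-latest", messages=messages)
print(followup.choices[0].message.content)
```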
Proxy
```bash
curl http://0.0.0.0:4000/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $YOUR_API_KEY" \
-d '{
"model": "mistral",
"messages": [
{
"role": "user",
"content": "What'\''s the weather like in Boston today?"
}
],
"tools": [
{
"type": "function",
"function": {
"name": "get_current_weather",
"description": "Get the current weather in a given location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "The city and state, e.g. San Francisco, CA"
},
"unit": {
"type": "string",
"enum": ["celsius", "fahrenheit"]
}
},
"required": ["location"]
}
}
}
],
"tool_choice": "auto"
}'
```
Supported Models
LiteLLM supports ALL Azure AI models. Here are a few examples:
| Model Name | Function Call |
|---|---|
| Cohere command-r-plus | `completion(model="azure_ai/command-r-plus", messages)` |
| Cohere command-r | `completion(model="azure_ai/command-r", messages)` |
| mistral-large-latest | `completion(model="azure_ai/mistral-large-latest", messages)` |
| AI21-Jamba-Instruct | `completion(model="azure_ai/ai21-jamba-instruct", messages)` |
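Each Azure AI serverless deployment has its own endpoint URL, so when calling several of the models above you can pass `api_key`/`api_base` per request instead of relying on a single pair of env vars. A minimal sketch; the endpoint URLs and key below are placeholders:

```python
from litellm import completion

# Placeholder endpoints - replace with your own Azure AI deployment URLs and keys
deployments = {
    "azure_ai/command-r-plus": "https://<your-command-r-plus-endpoint>.inference.ai.azure.com/",
    "azure_ai/mistral-large-latest": "https://<your-mistral-large-endpoint>.inference.ai.azure.com/",
}

for model, api_base in deployments.items():
    response = completion(
        model=model,
        api_base=api_base,
        api_key="<key-for-this-deployment>",
        messages=[{"role": "user", "content": "Hello, how are you?"}],
    )
    print(model, response.choices[0].message.content)
```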
Rerank Endpoint
Usage
```python
from litellm import rerank
import os

os.environ["AZURE_AI_API_KEY"] = "sk-.."
os.environ["AZURE_AI_API_BASE"] = "https://.."

query = "What is the capital of the United States?"
documents = [
    "Carson City is the capital city of the American state of Nevada.",
    "The Commonwealth of the Northern Mariana Islands is a group of islands in the Pacific Ocean. Its capital is Saipan.",
    "Washington, D.C. is the capital of the United States.",
    "Capital punishment has existed in the United States since before it was a country.",
]

response = rerank(
    model="azure_ai/rerank-english-v3.0",
    query=query,
    documents=documents,
    top_n=3,
)
print(response)
```
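The rerank response follows the Cohere-style schema: a list of results, each with an index into `documents` and a relevance score. A minimal sketch for reading it; the field names are assumed from that schema:

```python
# Each result points back into the original documents list
for result in response.results:
    print(result["relevance_score"], documents[result["index"]])
```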
LiteLLM provides a Cohere API-compatible `/rerank` endpoint for rerank calls.
Setup
Add this to your litellm proxy config.yaml
```yaml
model_list:
  - model_name: Salesforce/Llama-Rank-V1
    litellm_params:
      model: together_ai/Salesforce/Llama-Rank-V1
      api_key: os.environ/TOGETHERAI_API_KEY
  - model_name: rerank-english-v3.0
    litellm_params:
      model: azure_ai/rerank-english-v3.0
      api_key: os.environ/AZURE_AI_API_KEY
      api_base: os.environ/AZURE_AI_API_BASE
```
Start litellm

```bash
litellm --config /path/to/config.yaml

# RUNNING on http://0.0.0.0:4000
```
Test request
```bash
curl http://0.0.0.0:4000/rerank \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "rerank-english-v3.0",
    "query": "What is the capital of the United States?",
    "documents": [
        "Carson City is the capital city of the American state of Nevada.",
        "The Commonwealth of the Northern Mariana Islands is a group of islands in the Pacific Ocean. Its capital is Saipan.",
        "Washington, D.C. is the capital of the United States.",
        "Capital punishment has existed in the United States since before it was a country."
    ],
    "top_n": 3
  }'
```
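The same test request from Python, as a minimal sketch assuming the proxy configured above is running locally with `sk-1234` as its key:

```python
import requests

response = requests.post(
    "http://0.0.0.0:4000/rerank",
    headers={"Authorization": "Bearer sk-1234", "Content-Type": "application/json"},
    json={
        "model": "rerank-english-v3.0",
        "query": "What is the capital of the United States?",
        "documents": [
            "Carson City is the capital city of the American state of Nevada.",
            "Washington, D.C. is the capital of the United States.",
        ],
        "top_n": 1,
    },
)
print(response.json())
```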