OpenRouter
LiteLLM supports all text/chat/vision models from OpenRouter.
Usage
```python
import os
from litellm import completion

os.environ["OPENROUTER_API_KEY"] = ""

os.environ["OR_SITE_URL"] = ""   # optional
os.environ["OR_APP_NAME"] = ""   # optional

messages = [{"role": "user", "content": "Hello, how are you?"}]

response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
)
```
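Streaming should work the same way through LiteLLM's standard `stream=True` flag. A minimal sketch, assuming the API key is set as above and that the chunks follow LiteLLM's usual OpenAI-style delta format:

```python
import os
from litellm import completion

os.environ["OPENROUTER_API_KEY"] = ""

messages = [{"role": "user", "content": "Write a one-line haiku about routers."}]

# stream=True returns an iterator of OpenAI-style chunks instead of a single response
response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
    stream=True,
)

for chunk in response:
    # content may be None on some chunks (e.g. the final one), so guard before printing
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="")
```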
OpenRouter Completion Models
🚨 LiteLLM supports ALL OpenRouter models. Send `model=openrouter/<your-openrouter-model>` to route the request to OpenRouter. See all OpenRouter models at https://openrouter.ai/models. A worked example follows the table below.
| Model Name | Function Call |
|---|---|
| openrouter/openai/gpt-3.5-turbo | completion('openrouter/openai/gpt-3.5-turbo', messages) |
| openrouter/openai/gpt-3.5-turbo-16k | completion('openrouter/openai/gpt-3.5-turbo-16k', messages) |
| openrouter/openai/gpt-4 | completion('openrouter/openai/gpt-4', messages) |
| openrouter/openai/gpt-4-32k | completion('openrouter/openai/gpt-4-32k', messages) |
| openrouter/anthropic/claude-2 | completion('openrouter/anthropic/claude-2', messages) |
| openrouter/anthropic/claude-instant-v1 | completion('openrouter/anthropic/claude-instant-v1', messages) |
| openrouter/google/palm-2-chat-bison | completion('openrouter/google/palm-2-chat-bison', messages) |
| openrouter/google/palm-2-codechat-bison | completion('openrouter/google/palm-2-codechat-bison', messages) |
| openrouter/meta-llama/llama-2-13b-chat | completion('openrouter/meta-llama/llama-2-13b-chat', messages) |
| openrouter/meta-llama/llama-2-70b-chat | completion('openrouter/meta-llama/llama-2-70b-chat', messages) |
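Any model name from the table can be passed directly to `completion()`. A quick sketch using `openrouter/anthropic/claude-2`, assuming the API key is set as in the Usage section:

```python
import os
from litellm import completion

os.environ["OPENROUTER_API_KEY"] = ""

messages = [{"role": "user", "content": "What is OpenRouter?"}]

# The "openrouter/" prefix tells LiteLLM which provider to route the call to;
# everything after it is the model slug as OpenRouter names it.
response = completion(
    model="openrouter/anthropic/claude-2",
    messages=messages,
)
print(response.choices[0].message.content)
```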
Passing OpenRouter Params - transforms, models, route
Pass `transforms`, `models`, and `route` as arguments to `litellm.completion()`:
```python
import os
from litellm import completion

os.environ["OPENROUTER_API_KEY"] = ""

messages = [{"role": "user", "content": "Hello, how are you?"}]

response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
    transforms=[""],
    route="",
)
```
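The same extra params should also pass through LiteLLM's async entry point, `acompletion`. A sketch under that assumption, using OpenRouter's documented `"middle-out"` transform and `"fallback"` route values as illustrative inputs:

```python
import asyncio
import os
from litellm import acompletion

os.environ["OPENROUTER_API_KEY"] = ""

async def main():
    # Same call shape as completion(); transforms/route are forwarded to OpenRouter.
    response = await acompletion(
        model="openrouter/google/palm-2-chat-bison",
        messages=[{"role": "user", "content": "Hello from asyncio"}],
        transforms=["middle-out"],  # assumption: OpenRouter's prompt-compression transform
        route="fallback",           # assumption: OpenRouter's fallback routing mode
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```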