# Deepseek
LiteLLM supports all Deepseek models. Just set `deepseek/` as a prefix on the model name when sending completion requests.
## API Key

```python
# env variable
os.environ['DEEPSEEK_API_KEY']
```
## Sample Usage

```python
from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

response = completion(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
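The returned object follows the OpenAI chat-completion shape, so the assistant's text can be pulled out with a small helper. A minimal sketch (`reply_text` is an illustrative name, not part of the LiteLLM API):

```python
def reply_text(response):
    # The OpenAI-format response nests the assistant's reply under
    # choices[0].message.content
    return response.choices[0].message.content
```

For example, `print(reply_text(response))` prints just the model's reply instead of the full response object.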
## Sample Usage - Streaming

```python
from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

response = completion(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True,
)
for chunk in response:
    print(chunk)
```
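Each streamed chunk carries a text delta in the OpenAI streaming format. A minimal sketch of stitching the deltas back into the full reply (`collect_stream` is a hypothetical helper; it assumes `delta.content` may be `None` on the final chunk):

```python
def collect_stream(chunks):
    # Concatenate the text deltas from a stream of completion chunks.
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content  # assumed None on the last chunk
        if delta:
            parts.append(delta)
    return "".join(parts)
```

Usage would look like `full_text = collect_stream(response)` in place of the `for` loop above.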
## Supported Models - ALL Deepseek Models Supported!

LiteLLM supports all Deepseek models. Just set `deepseek/` as a prefix on the model name when sending completion requests.
| Model Name | Function Call |
|---|---|
| deepseek-chat | `completion(model="deepseek/deepseek-chat", messages)` |
| deepseek-coder | `completion(model="deepseek/deepseek-coder", messages)` |