
NLP Cloud

LiteLLM supports all LLMs on NLP Cloud.

API Keys

import os 

os.environ["NLP_CLOUD_API_KEY"] = "your-api-key"

Sample Usage

import os
from litellm import completion

# set ENV variables
os.environ["NLP_CLOUD_API_KEY"] = "your-api-key"

messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="dolphin", messages=messages)
print(response)
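
The response object follows the OpenAI format, so the generated text can be read from the first choice. A minimal sketch:

# the reply text lives in the first choice, OpenAI-style
print(response["choices"][0]["message"]["content"])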

Streaming

Just set stream=True when calling completion.

import os
from litellm import completion

# set ENV variables
os.environ["NLP_CLOUD_API_KEY"] = "your-api-key"

messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="dolphin", messages=messages, stream=True)
for chunk in response:
    print(chunk["choices"][0]["delta"]["content"])  # same format as openai
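
Instead of printing each chunk as it arrives, the deltas can be accumulated into the full reply (the stream can only be consumed once, so use one loop or the other). A minimal sketch:

from litellm import completion

messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="dolphin", messages=messages, stream=True)

full_reply = ""
for chunk in response:
    delta = chunk["choices"][0]["delta"]["content"]
    if delta:  # the final chunk's delta content may be empty/None
        full_reply += delta
print(full_reply)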

Non-dolphin models

By default, LiteLLM maps dolphin and chatdolphin to NLP Cloud.

If you want to call any other model on NLP Cloud (e.g. GPT-J, Llama-2, etc.), set NLP Cloud as the provider explicitly by prefixing the model name with nlp_cloud/, as in the example below.

import os
from litellm import completion

# set ENV variables - [OPTIONAL] replace with your NLP Cloud key
os.environ["NLP_CLOUD_API_KEY"] = "your-api-key"

messages = [{"role": "user", "content": "Hey! how's it going?"}]

# e.g. call Llama2 on NLP Cloud
response = completion(model="nlp_cloud/finetuned-llama-2-70b", messages=messages, stream=True)
for chunk in response:
    print(chunk["choices"][0]["delta"]["content"])  # same format as openai
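
Common OpenAI-style parameters such as max_tokens and temperature can also be passed to completion, and LiteLLM maps them to the provider's equivalents where supported. A minimal sketch with illustrative values:

import os
from litellm import completion

os.environ["NLP_CLOUD_API_KEY"] = "your-api-key"

messages = [{"role": "user", "content": "Hey! how's it going?"}]

# optional params are passed straight to completion, OpenAI-style
response = completion(
    model="nlp_cloud/finetuned-llama-2-70b",
    messages=messages,
    max_tokens=256,
    temperature=0.7,
)
print(response["choices"][0]["message"]["content"])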