ReplicateLLM

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:107

LLM implementation that uses Replicate.

Extends: BaseLLM
Constructors

new ReplicateLLM(init?): ReplicateLLM

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:115

Parameters

init?: Partial<ReplicateLLM> & object

Returns

ReplicateLLM

Overrides

BaseLLM.constructor
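A minimal construction sketch. It assumes the class is exported from the @llamaindex/replicate package (inferred from the source path above) and that Replicate credentials are available to the underlying ReplicateSession, typically via a REPLICATE_API_TOKEN environment variable; the field names mirror the properties documented below.

```typescript
import { ReplicateLLM } from "@llamaindex/replicate"; // assumed export location

// All init fields are optional (Partial<ReplicateLLM>); anything omitted falls
// back to the class defaults. The model name comes from the union documented below.
const llm = new ReplicateLLM({
  model: "llama-3-70b-instruct",
  temperature: 0.5,
  topP: 1,
  maxTokens: 512,
});
```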
Properties

model

model: "Llama-2-70b-chat-old" | "Llama-2-70b-chat-4bit" | "Llama-2-13b-chat-old" | "Llama-2-13b-chat-4bit" | "Llama-2-7b-chat-old" | "Llama-2-7b-chat-4bit" | "llama-3-70b-instruct" | "llama-3-8b-instruct"

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:108
chatStrategy

chatStrategy: ReplicateChatStrategy

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:109
temperature

temperature: number

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:110
topP

topP: number

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:111
maxTokens?

optional maxTokens: number

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:112
replicateSession

replicateSession: ReplicateSession

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:113
Accessors

metadata

get metadata(): object

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:140

Returns

object

model: "Llama-2-70b-chat-old" | "Llama-2-70b-chat-4bit" | "Llama-2-13b-chat-old" | "Llama-2-13b-chat-4bit" | "Llama-2-7b-chat-old" | "Llama-2-7b-chat-4bit" | "llama-3-70b-instruct" | "llama-3-8b-instruct"

temperature: number

topP: number

maxTokens: undefined | number

contextWindow: number

tokenizer: undefined = undefined

structuredOutput: boolean = false

Overrides

BaseLLM.metadata
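A short sketch of reading the accessor. The field names are taken from the return type above; the printed values depend on how the instance was configured.

```typescript
const { model, temperature, topP, maxTokens, contextWindow } = llm.metadata;
console.log(`${model}: contextWindow=${contextWindow}, maxTokens=${maxTokens ?? "default"}`);
```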
Methods

mapMessagesToPrompt()

mapMessagesToPrompt(messages): object

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:152

Parameters

messages: ChatMessage[]

Returns

object

prompt: string

systemPrompt: undefined | MessageContent
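A usage sketch for the prompt mapper. It assumes ChatMessage can be imported from the llamaindex core package; the exact prompt text produced will depend on the configured chatStrategy, while the return shape is the { prompt, systemPrompt } object documented above.

```typescript
import type { ChatMessage } from "llamaindex"; // assumed import location for the core chat types

const messages: ChatMessage[] = [
  { role: "system", content: "You are a terse assistant." },
  { role: "user", content: "What does Replicate host?" },
];

// Maps the chat history to a single Replicate prompt plus an optional system prompt.
const { prompt, systemPrompt } = llm.mapMessagesToPrompt(messages);
```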
mapMessagesToPromptLlama3()

mapMessagesToPromptLlama3(messages): object

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:178

Parameters

messages: ChatMessage[]

Returns

object

prompt: string

systemPrompt: undefined = undefined
mapMessagesToPromptA16Z()

mapMessagesToPromptA16Z(messages): object

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:204

Parameters

messages: ChatMessage[]

Returns

object

prompt: string

systemPrompt: undefined = undefined
mapMessageTypeA16Z()

mapMessageTypeA16Z(messageType): string

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:218

Parameters

messageType: MessageType

Returns

string
mapMessagesToPromptMeta()

mapMessagesToPromptMeta(messages, opts?): object

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:231

Parameters

messages: ChatMessage[]

opts?

withBos?: boolean

replicate4Bit?: boolean

withNewlines?: boolean

Returns

object

prompt: string

systemPrompt: undefined | MessageContent
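A sketch of the Meta-style mapper with its documented options. The option names are the ones listed under opts above; the comments describe their presumed effect only, since the documentation does not spell it out.

```typescript
const { prompt, systemPrompt } = llm.mapMessagesToPromptMeta(messages, {
  withBos: true,        // presumably prepends a BOS token to the prompt
  replicate4Bit: false, // presumably adjusts formatting for the *-4bit model variants
  withNewlines: true,   // presumably separates turns with newline characters
});
```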
chat()

chat(params): Promise<AsyncIterable<ChatResponseChunk, any, any>>

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:307

Parameters

params: LLMChatParamsStreaming

Returns

Promise<AsyncIterable<ChatResponseChunk, any, any>>

Overrides

BaseLLM.chat
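A streaming usage sketch for the LLMChatParamsStreaming overload. It assumes the standard LlamaIndex chat params shape (messages plus stream: true) and that each yielded ChatResponseChunk carries its incremental text in a delta field, as in the core LlamaIndex.TS types.

```typescript
const stream = await llm.chat({
  messages: [{ role: "user", content: "Write a haiku about llamas." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.delta); // incremental text from each chunk
}
```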
chat(params): Promise<ChatResponse<object>>

Defined in: .build/typescript/packages/providers/replicate/src/llm.ts:310

Parameters

params: LLMChatParamsNonStreaming

Returns

Promise<ChatResponse<object>>

Overrides

BaseLLM.chat
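The corresponding non-streaming sketch. Reading the reply from response.message is an assumption about the shared ChatResponse type in LlamaIndex.TS, not something specific to this class.

```typescript
const response = await llm.chat({
  messages: [{ role: "user", content: "Summarize what Replicate does in one sentence." }],
});

// ChatResponse exposes the assistant reply as a ChatMessage under `message`.
console.log(response.message.content);
```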