
Vector Memory

NOTE: This memory example is deprecated in favor of the newer and more flexible Memory class. See the latest documentation.
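
For orientation, here is a minimal sketch of the newer Memory class referenced above, assuming a recent llama-index version; the session_id and token_limit values are illustrative, so check the latest documentation for the exact parameters.

from llama_index.core.llms import ChatMessage
from llama_index.core.memory import Memory

# sketch of the replacement API; parameter values are illustrative
memory = Memory.from_defaults(session_id="my_session", token_limit=40000)
memory.put_messages([ChatMessage.from_str("Hello, world!", "user")])
chat_history = memory.get()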

The vector memory module uses vector search (backed by a vector store) to retrieve relevant conversation items given a user input.

This notebook shows you how to use the VectorMemory class. We demonstrate how to use its individual functions. A typical use case for vector memory is long-term storage of chat messages.

[Figure: VectorMemory illustration]

Here we initialize a raw memory module and demonstrate its functions: putting and retrieving ChatMessage objects.

  • Note that retriever_kwargs takes the same arguments as you would specify on VectorIndexRetriever or via index.as_retriever(..).
from llama_index.core.memory import VectorMemory
from llama_index.embeddings.openai import OpenAIEmbedding

vector_memory = VectorMemory.from_defaults(
    vector_store=None,  # leave as None to use default in-memory vector store
    embed_model=OpenAIEmbedding(),
    retriever_kwargs={"similarity_top_k": 1},
)
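
The vector_store argument also accepts a real vector store for persistence. Below is a sketch (not from the original notebook) that swaps in Chroma, assuming the chromadb and llama-index-vector-stores-chroma packages are installed; the path and collection name are hypothetical.

import chromadb
from llama_index.vector_stores.chroma import ChromaVectorStore

# hypothetical local path and collection name
chroma_client = chromadb.PersistentClient(path="./chroma_db")
chroma_collection = chroma_client.get_or_create_collection("chat_memory")

persistent_memory = VectorMemory.from_defaults(
    vector_store=ChromaVectorStore(chroma_collection=chroma_collection),
    embed_model=OpenAIEmbedding(),
    retriever_kwargs={"similarity_top_k": 1},
)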
from llama_index.core.llms import ChatMessage

msgs = [
    ChatMessage.from_str("Jerry likes juice.", "user"),
    ChatMessage.from_str("Bob likes burgers.", "user"),
    ChatMessage.from_str("Alice likes apples.", "user"),
]

# load the messages into memory
for m in msgs:
    vector_memory.put(m)

# retrieve the most relevant messages from memory
msgs = vector_memory.get("What does Jerry like?")
msgs
[ChatMessage(role=<MessageRole.USER: 'user'>, content='Jerry likes juice.', additional_kwargs={})]
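
Because retriever_kwargs mirrors the retriever arguments, raising similarity_top_k makes get() return more matching messages per query. A small sketch (this cell is not part of the original notebook, and the variable name is illustrative):

vector_memory_top2 = VectorMemory.from_defaults(
    vector_store=None,
    embed_model=OpenAIEmbedding(),
    retriever_kwargs={"similarity_top_k": 2},
)
for m in [
    ChatMessage.from_str("Jerry likes juice.", "user"),
    ChatMessage.from_str("Bob likes burgers.", "user"),
    ChatMessage.from_str("Alice likes apples.", "user"),
]:
    vector_memory_top2.put(m)

# with similarity_top_k=2, the two closest messages come back
vector_memory_top2.get("What do Jerry and Bob like?")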
vector_memory.reset()

Now let's try resetting and retrying. This time we'll also add an assistant message. Note that user/assistant messages are bundled together by default.

msgs = [
    ChatMessage.from_str("Jerry likes burgers.", "user"),
    ChatMessage.from_str("Bob likes apples.", "user"),
    ChatMessage.from_str("Indeed, Bob likes apples.", "assistant"),
    ChatMessage.from_str("Alice likes juice.", "user"),
]
vector_memory.set(msgs)

msgs = vector_memory.get("What does Bob like?")
msgs
[ChatMessage(role=<MessageRole.USER: 'user'>, content='Bob likes apples.', additional_kwargs={}),
 ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='Indeed, Bob likes apples.', additional_kwargs={})]
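
A common follow-on pattern is to pair this vector memory (long-term, semantic recall) with a short-term chat buffer. The sketch below assumes the SimpleComposableMemory and ChatMemoryBuffer classes from llama_index.core.memory; the token limit and message are illustrative.

from llama_index.core.memory import ChatMemoryBuffer, SimpleComposableMemory

composable_memory = SimpleComposableMemory.from_defaults(
    primary_memory=ChatMemoryBuffer.from_defaults(token_limit=1500),
    secondary_memory_sources=[vector_memory],
)

# new messages go into the primary buffer and the secondary sources;
# get() merges recent history with semantically relevant older messages
composable_memory.put(ChatMessage.from_str("Charlie likes cherries.", "user"))
composable_memory.get("What does Charlie like?")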