06. ConversationSummaryMemory

ConversationSummaryMemory

Now let's look at how to use a slightly more complex type of memory: ConversationSummaryMemory.

This type of memory generates a summary of the conversation over time, which is useful for compressing conversational information as it accumulates.

ConversationSummaryMemory summarizes the conversation as it progresses and saves the current summary to memory.

You can then use this memory to inject the summary of the conversation so far into a prompt or chain.

This memory is most useful for long conversations, where keeping the full record of past messages in the prompt would consume too many tokens.

Create a ConversationSummaryMemory.


# Configuration file for managing environment variables such as the API KEY
from dotenv import load_dotenv

# Load the API KEY information
load_dotenv()


True


from langchain.memory import ConversationSummaryMemory
from langchain_openai import ChatOpenAI

# Create a summary memory that uses the LLM to summarize the conversation
memory = ConversationSummaryMemory(
    llm=ChatOpenAI(temperature=0), return_messages=True)

Save several conversation turns.

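A minimal sketch of this step using save_context; the travel-themed conversation content below is illustrative, not taken from the original example.

# Save a few conversation turns; the memory summarizes them as they are added.
# The conversation content is hypothetical.
memory.save_context(
    inputs={"human": "What is the price of the Europe travel package?"},
    outputs={"ai": "The 14-night, 15-day Europe package is 3,500 USD per person."},
)
memory.save_context(
    inputs={"human": "Which cities are included in the itinerary?"},
    outputs={"ai": "The itinerary covers Paris, Rome, and Barcelona, among others."},
)
memory.save_context(
    inputs={"human": "Is travel insurance included?"},
    outputs={"ai": "Yes, basic travel insurance is included for all travelers."},
)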

Check the conversation history stored in memory.

You can see a compressed summary of all previous conversations.

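A minimal sketch of inspecting the memory; since return_messages=True was set, the summary is returned as a message object inside the history.

# Load the memory variables; the history holds a compressed summary
# of all previous conversations instead of the raw message records.
print(memory.load_memory_variables({})["history"])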


Let's save an additional conversation so that the 200-token limit is exceeded.

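A 200-token limit corresponds to ConversationSummaryBufferMemory with max_token_limit=200, rather than plain ConversationSummaryMemory, so the sketch below assumes that variant; the long answer that pushes past the limit is illustrative.

from langchain.memory import ConversationSummaryBufferMemory
from langchain_openai import ChatOpenAI

# Assumed configuration: keep recent messages verbatim up to 200 tokens,
# and summarize anything older than that.
memory = ConversationSummaryBufferMemory(
    llm=ChatOpenAI(temperature=0),
    max_token_limit=200,
    return_messages=True,
)

# Hypothetical long exchange that exceeds the 200-token budget.
memory.save_context(
    inputs={"human": "Can you describe the full itinerary of the Europe package in detail?"},
    outputs={
        "ai": "Certainly. The trip starts in Paris with three days of sightseeing, "
        "including the Louvre and the Eiffel Tower. You then take a high-speed train "
        "to Rome for visits to the Colosseum and the Vatican, followed by two days "
        "in Florence. The tour continues to Barcelona for Gaudí's architecture and "
        "ends with a free day for shopping before the return flight."
    },
)
memory.save_context(
    inputs={"human": "What should I prepare before departure?"},
    outputs={"ai": "A valid passport, travel insurance documents, and comfortable walking shoes."},
)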

Check the saved conversation. The most recent conversation turn is not summarized and is kept as-is, while the earlier conversation is stored as a summary.

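A minimal sketch of checking the stored conversation, assuming the buffered memory configured above.

# The history now contains a SystemMessage holding the summary of the
# older conversation, followed by the most recent messages kept verbatim.
for message in memory.load_memory_variables({})["history"]:
    print(type(message).__name__, ":", message.content)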

