
How to Implement LangChain Memory for Contextual AI Conversations


Want to know about LangChain memory? This blog is dedicated to exactly that!

Have you ever wondered how ChatGPT can answer you almost the way a human does? The LLM is one factor, but like a human being, it also has to remember what you were, and are, talking about.

Suppose you ask ChatGPT to help you fill out a form and tell it your name, but it keeps forgetting, and you have to repeat it with every prompt. That would be frustrating, right? Or suppose your chatbot, while chatting with a customer, forgets the details the customer just mentioned. Ah! The customer will be so disappointed they may never come back to you.

To solve these and a myriad of similar problems, we incorporate memory, and LangChain provides us exactly this facility.

This blog deals with LangChain memory and is going to help you master ConversationBufferMemory, ChatMessageHistory, and ConversationChain, leaving you well-equipped to build much better LLM products.

So, without any delay, let’s jump into the coding arena.

Installing The Required Packages

We need to install two packages for this blog: langchain and langchain_openai. Below are the commands that will be used.

%pip install langchain --quiet

%pip install langchain_openai --quiet

Importing Every Module Required

Below are all the modules we need to import for this blog. I will still mention what I am importing at each step, so you get an idea of what needs to be imported and when. This will also reduce your chances of hitting errors while learning from this blog on LangChain memory.

from langchain_openai import OpenAI
from langchain_core.prompts import PromptTemplate
from langchain.chains import LLMChain
from langchain.memory import ConversationBufferMemory
from langchain.memory import ChatMessageHistory
from langchain.chains import ConversationChain

import os

Establishing a Connection Between LangChain Memory and OpenAI

To establish a connection between your LangChain memory and OpenAI, you need an OpenAI API key, which you can obtain from the OpenAI platform. Then, simply set up the environment using the code below and initialize the OpenAI LLM as llm. Be careful about capitalization when you initialize it; I have used lowercase throughout my code.

import os
os.environ["OPENAI_API_KEY"] = "My API"  # replace "My API" with your actual API key

# Initialize the LLM
llm = OpenAI(temperature=0.7)  # temperature indicates how creative you want your LLM to be
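If you’d rather not hardcode the key in your notebook, an optional variation (a minimal sketch using only Python’s standard library getpass module; the prompt text is just illustrative) is to read it at runtime:

import getpass
import os

# Prompt for the key at runtime so it never appears in your notebook
os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")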

Conversation Buffer Memory: A Memory That Holds Your Whole Chat

It’s the simplest memory LangChain offers: it stores your conversation as-is and hands the whole thing to the model on every call, so it can hold a long stretch of context and is very easy to use. Let’s start exploring conversation buffer memory and begin our journey through LangChain memory.

from langchain.memory import ConversationBufferMemory  # you must have imported it above already

# Since we are building a Czech language instructor, we initialize our memory as czech; you can also call it memory
czech = ConversationBufferMemory()

czech.chat_memory.add_user_message("Let's learn Czech! I will speak English, your job is to translate it into Czech")

czech.chat_memory.add_ai_message("Okay, I will help you learn! Write in English, I will translate into Czech")

czech.chat_memory.add_user_message("Hello!")

czech.chat_memory.add_ai_message("Ahoj!")

After initializing the conversation buffer memory, we can craft our chat as we have done above. To actually run it, the power of LangChain gives us a myriad of ways to chain it with other components; we can use LangChain memory any way we want! You can even peek at what the memory holds at any point, as shown below.
For the sake of convenience, let’s try running it with the two simplest chains. One of them is the conversation chain, so let’s begin with it.
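Before chaining anything, a quick optional sanity check (this uses the standard load_memory_variables method of ConversationBufferMemory) shows exactly what the buffer currently holds:

# Show the conversation stored so far; by default it is returned under the "history" key
print(czech.load_memory_variables({})["history"])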

Running Our Buffer Memory Using The Conversation Chain

Running our chat and passing it from buffer memory to an LLM becomes much easier when we use this chain, even easier than with a plain LLM chain. The conversation chain is designed specifically to work with LangChain buffer memory, which makes managing LangChain memory faster and simpler.

This is how we will chain our chat using the conversation chain.

conversation = ConversationChain(llm=llm, memory=czech)

The llm argument is chained to the LLM instance that we created earlier using OpenAI, and the memory argument is chained to the buffer memory instance we created for our language lessons. This buffer memory instance contains every message we crafted to learn Czech.

Now it is time to load our chat in LLM and use it to generate a response. We can easily do it using the predict function.

response = conversation.predict(input="How?")  # input holds our query

print(response)  # printing our response so we can see the prediction

Output:

Jak?

So it was as simple as that. This is how the conversation chain makes it super easy to use LangChain buffer memory.
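Because the conversation chain saves each new exchange back into the memory automatically, you can simply keep talking and the model still sees the earlier context. A quick illustrative follow-up (the exact reply will vary from run to run):

# The previous "How?" -> "Jak?" exchange is now part of the memory too
response = conversation.predict(input="Thank you!")
print(response)  # expect something like "Děkuji!"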

Running Our Buffer Memory Using LLM Chain

The beauty of LangChain is that multiple options are available for chains that can perform various tasks while chaining different components.
Now we will do the same thing using LangChain’s LLM chain that we did before using the conversation chain.

Here we will create a prompt template, add a chat to buffer memory, and then run it. The benefit of using the LLM chain with LangChain memory is that we can also pass a prompt template that guides the chat.
Okay, so let’s start making the LLM chain.

# Importing everything required to run this LLM chain
from langchain_openai import OpenAI
from langchain_core.prompts import PromptTemplate
from langchain.chains import LLMChain
from langchain.memory import ConversationBufferMemory

We already used the OpenAI module and ConversationBufferMemory above with the conversation chain; the PromptTemplate module and LLMChain are the new ones here and will be used below.

# Define the prompt template
template = """You are a nice chatbot having a conversation with a human.

Previous conversation:
{chat_history}

New human question: {question}
Response:"""

prompt = PromptTemplate.from_template(template)  # the string variable template is passed to the prompt template module

We defined a string template, which is then used to craft a prompt template. The words in curly braces, like {chat_history} and {question}, are input variables: placeholders that will be replaced with our chat history and our question, respectively. The string is then passed to the PromptTemplate module to craft the prompt template.
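To see the substitution in action, you can format the template by hand with dummy values (the history and question here are purely illustrative):

# Fill the placeholders manually to preview the final prompt the LLM would receive
print(prompt.format(chat_history="Human: Hello!\nAI: Ahoj!", question="How?"))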

# Initialize the conversation buffer memory of LangChain memory
memory = ConversationBufferMemory(memory_key="chat_history")

# Add the initial messages to the memory
memory.chat_memory.add_user_message("Let's learn Czech! I will speak English, your job is to translate it into Czech")
memory.chat_memory.add_ai_message("Okay, I will help you learn! Write in English, I will translate into Czech")
memory.chat_memory.add_user_message("Hello!")
memory.chat_memory.add_ai_message("Ahoj!")

We initialized conversation buffer memory as memory and added our chat the same way we did before, but notice that here we passed a memory key, i.e., chat_history. While using the conversation chain, we didn’t give any memory key, so it was simply named history by default. You can find the same memory key written in our prompt template structure too; the two must match.
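You can confirm that the key matches what the prompt expects; with memory_key set, the stored chat comes back under that name:

# The conversation is now exposed under "chat_history" instead of the default "history"
print(memory.load_memory_variables({})["chat_history"])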

Now it’s time to chain everything up!

# Connecting everything we made using the LLMChain of LangChain
conversation = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    memory=memory
)

We had already initialized our LLM as llm using the OpenAI API, crafted a prompt template, and initialized conversation buffer memory as memory. Here in the LLM chain, we just tell the chain what our prompt is, what our memory is, and what our LLM is. The verbose=True flag is there to show detailed output.

Let’s use our LangChain memory now to get the answer. This works the same way as before, except that here the input variable is called question, matching our prompt template.

# Make a prediction
response = conversation.predict(question="How?")

print(response)

Output:


> Entering new LLMChain chain...
Prompt after formatting:
You are a nice chatbot having a conversation with a human.

Previous conversation:
Human: Let's learn Czech! I will speak English, your job is to translate it into Czech
AI: Okay, I will help you learn! Write in English, I will translate into Czech
Human: Hello!
AI: Ahoj!

New human question: How?
Response:

> Finished chain.
Jak?

ChatMessageHistory: Helps You Manage LangChain Memory

While talking about LangChain memory, we can’t ignore the ChatMessageHistory module.
It is not a buffer memory; it is a lightweight container for writing down your message history, which can then be passed to the buffer memory and used from there.

The benefit of ChatMessageHistory is that you can build and manage the history yourself, independently of any chain or memory class, and plug it into a buffer memory whenever you need it.

So first, we are going to write a chat:

from langchain.memory import ChatMessageHistory

history = ChatMessageHistory()

history.add_user_message("Let's learn Czech! I will speak English, your job is to translate it into Czech")

history.add_ai_message("Okay, I will help you learn! Write in English, I will translate into Czech")
history.add_user_message("Hello!")
history.add_ai_message("Ahoj!")
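Since ChatMessageHistory is just a container of messages, you can inspect the raw list directly through its messages attribute:

# Each entry is a HumanMessage or AIMessage object with a .content field
for message in history.messages:
    print(type(message).__name__, "->", message.content)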

Now let’s send it to buffer memory and use our LangChain memory using the conversation chain.

from langchain.chains import ConversationChain

# Initialize the ConversationBufferMemory with the history
memory = ConversationBufferMemory(chat_memory=history)

# Initialize the ConversationChain with the OpenAI model and the memory
conversation = ConversationChain(llm=llm, memory=memory)

# Now you can continue the conversation
response = conversation.predict(input="blue")

print(response)

Output:

Modrý
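As a final check, the buffer memory now contains both the history we seeded and the new exchange, since the conversation chain saved it back automatically. ConversationBufferMemory exposes this through its buffer attribute:

# Print the full conversation the chain will see on the next turn
print(memory.buffer)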

Conclusion:

This was our discussion on LangChain memory, but there is more to explore. LangChain Memory offers multiple ways to connect memories using various chains, making the possibilities endless. Furthermore, there are more types of LangChain buffer memory. If you want me to write about them as well, do let me know. And don’t forget to read our other blogs on LangChain.
