Machine Learning Spot

What is LangChain LCEL? Discover 3 Exciting Use Cases


So, what is LangChain LCEL? LangChain Expression Language (LCEL) is a new, simplified way of constructing LangChain chains. You can think of LCEL as a shorthand for writing the instructions your computer uses to interact with large language models (LLMs).

If you’re unfamiliar with chains, they are where LangChain gets its name: “Lang” refers to language, and “chain” reflects how LangChain lets you connect different components. To use an LLM with a prompt template or any other component, you chain them together. LangChain LCEL makes this process concise.

In this blog, we are studying LangChain LCEL. Although I will explain the difference between the traditional and LCEL approaches, if you don’t know anything about LangChain chains, I highly recommend reading the post on LangChain chains first.

LangChain LCEL: What’s New

Let’s see what we had in traditional chains and what we have with LCEL.

LangChain’s Traditional Chains

In the traditional approach, each type of chain construction required importing its own module individually. For example, we needed to import LLMChain, SimpleSequentialChain, and so on.

Similarly, in the traditional approach, we explicitly create each chain object using classes like LLMChain.

In the conventional approach, prompt templates define the template string and specify input variables individually.

Lastly, building complex workflows with conditional branching requires additional coding effort in the traditional approach.

LangChain LCEL

Simplification: With LCEL, you can write more concise and readable chains, and there is no need to import different modules.

Efficiency: LCEL offers a single, uniform way (the pipe operator) to create various chain types, reducing boilerplate code.

Clarity: LCEL simplifies defining prompts by combining template strings and input variables.

Flexibility: LCEL provides built-in support for data processing steps and conditional logic within chains.
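To build intuition for why writing `prompt | model | parser` works at all, here is a toy sketch in plain Python of how a pipe operator can compose steps. This is purely illustrative and is not LangChain’s real implementation (LangChain’s actual support comes from its Runnable interface); the `Step` class and the stand-in components are hypothetical.

```python
# Toy sketch of pipe-style composition (NOT LangChain's real code).
# Each step wraps a function; __or__ returns a new step that feeds
# the left step's output into the right step.

class Step:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # Compose: run self first, then pass the result to other.
        return Step(lambda value: other.invoke(self.invoke(value)))

# Stand-ins for a prompt template, a model, and an output parser.
prompt = Step(lambda d: f"tell me a fact about {d['topic']}")
model = Step(lambda text: {"content": text.upper()})  # fake "LLM"
parser = Step(lambda msg: msg["content"])

chain = prompt | model | parser
print(chain.invoke({"topic": "whales"}))  # TELL ME A FACT ABOUT WHALES
```

The key point is that `|` produces a new composed step, so a chain of any length still exposes the same single `invoke` method.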

Chain Creation with LCEL

Let’s create a simple LangChain LCEL chain. The LLM we will use is from Fireworks, so you should get an API key by signing up and visiting their API keys page.

Let’s begin with installations.

pip install -qU langchain-fireworks

Let’s use our API key to bring Fireworks into play and chain an LLM.

import getpass
import os

os.environ["FIREWORKS_API_KEY"] = getpass.getpass()

The above lines of code will ask you to provide an API key to establish a connection with Fireworks. But which model do we want to establish a connection with? To specify that, the lines of code below are used.

from langchain_fireworks import ChatFireworks

model = ChatFireworks(model="accounts/fireworks/models/mixtral-8x7b-instruct")

We have set up our LLM; next, we need a prompt template and an output parser so we can chain everything together.

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("tell me an interesting fact about {topic} in 30 words")
output_parser = StrOutputParser()

Lastly, let’s chain everything together: the prompt template (saved in the “prompt” variable), the Fireworks chat model (saved in “model”), and the output parser (saved in “output_parser”). In LCEL, chaining is done with the pipe operator (|), so we simply place a pipe between each component.

chain = prompt | model | output_parser
fact = input("type the topic's name about which you want to hear an interesting fact \n\n\n")
chain.invoke({"topic": fact})

Output:

The blue whale is the largest animal ever known to have existed, reaching lengths of up to 100 feet and weights of as much as 200 tons. Its heart alone can weigh as much as an automobile!
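A nice property of LCEL is that any composed chain exposes the same calling conventions: in LangChain, `.invoke()`, `.batch()`, and `.stream()` all work on any chain. As a rough plain-Python sketch of the batch idea (again a toy, not LangChain’s implementation; `make_chain` and `batch` here are hypothetical helpers):

```python
# Toy sketch: batch is conceptually invoke mapped over many inputs.

def make_chain():
    # A stand-in "chain": formats a prompt and fakes a model reply.
    return lambda inputs: f"fact about {inputs['topic']}"

def batch(chain, list_of_inputs):
    # LangChain's real .batch() can also run calls concurrently;
    # this sketch simply runs them one by one.
    return [chain(inputs) for inputs in list_of_inputs]

chain = make_chain()
results = batch(chain, [{"topic": "whales"}, {"topic": "ants"}])
print(results)  # ['fact about whales', 'fact about ants']
```

So once you have built a chain with pipes, you get single calls, batches, and streaming from the same object for free.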

The Three Exciting Use Cases of LangChain LCEL

Let’s look at three use cases that will familiarize us with LangChain LCEL (LangChain Expression Language) and open our minds to building more exciting things with it. We will keep everything else the same, replace only the prompt template, and then invoke the chain again.

Short Story Generator

This will generate a short story. It asks the user for a theme and quickly generates a short story. I am showing this use case first because it is the simplest of the LangChain LCEL examples.

prompt = ChatPromptTemplate.from_template("write a short story about {theme} in 100 words")


chain = prompt | model | output_parser

theme = input("Enter a theme for your short story: \n\n\n")
print(chain.invoke({"theme": theme}))

Output:

As the rainy season arrived, the earth seemed to come alive. The once parched soil drank in the life-giving water, and vibrant green shoots soon pushed their way through the dirt. Children played in the puddles, their laughter echoing through the streets. The air was filled with the scent of damp earth and blooming flowers. Farmers rejoiced as their crops grew plump and healthy. The rainy season brought not just life, but also joy and unity. Despite the occasional inconvenience, everyone appreciated its beauty and necessity. The rainy season was a reminder that even in the midst of challenges, there was always renewal and hope.

A Personal History Teacher

In this example, we create a personal history teacher. We give it a topic and a period, and it tells us what notable things happened to that specific place or thing during that period.

Let’s explore this second use case of LangChain LCEL.

prompt = ChatPromptTemplate.from_template("You are a helpful history teacher. When a topic is provided to you, talk about {topic} within the period {period}. Keep it limited to 50 words")
output_parser = StrOutputParser()

chain = prompt | model | output_parser

print(chain.invoke({"topic": "New York", "period":"1947"}))

Output:

In 1947, New York saw the introduction of the "I Love New York" campaign, aimed at boosting tourism. The city also experienced the beginning of the post-war boom, with rapid growth in population and the economy. The United Nations headquarters, completed in 1952, further solidified New York's role in international affairs.

So, how we use LangChain LCEL depends on our creativity. We just need to have the components, chain them using the pipe operator, and invoke the chain. Let’s look at another example to practice LangChain LCEL.
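The “conditional logic” mentioned earlier fits this pattern too: LangChain provides runnables for routing between sub-chains, but the underlying idea can be sketched in plain Python as a step that picks a sub-chain based on the input. The functions below are hypothetical stand-ins, not LangChain’s API.

```python
# Toy sketch of conditional routing between two stand-in "chains".

def story_chain(d):
    return f"story about {d['text']}"

def fact_chain(d):
    return f"fact about {d['text']}"

def branch(d):
    # Route to a sub-chain based on a field of the input,
    # defaulting to the fact chain.
    if d.get("mode") == "story":
        return story_chain(d)
    return fact_chain(d)

print(branch({"mode": "story", "text": "rain"}))  # story about rain
print(branch({"mode": "fact", "text": "rain"}))   # fact about rain
```

In real LCEL code, the branch itself is just another component you can drop into a pipe.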

LangChain LCEL: Personalised Inquiry Response Generator

LangChain LCEL can also create automated email responses tailored to specific customer inquiries. This can save time for customer service teams and ensure consistent and personalized communication.

prompt = ChatPromptTemplate.from_template("Write a polite and professional response to the following customer inquiry: {inquiry}. If the inquiry is in the form of an email, reply as an email; otherwise, reply in the same form as the inquiry. My name is Talal and my phone number is +923163090794. If the inquiry is an email, keep the subject and the response short and professional")
output_parser = StrOutputParser()

# Chain the components together
chain = prompt | model | output_parser

# Get user input and invoke the chain
inquiry = input("Enter the customer inquiry: \n\n\n")
print(chain.invoke({"inquiry": inquiry}))

The output this generates is beautiful, but to maintain the flow and keep this post easy to read, I will share the complete output, including the email I used as input, in a separate doc.

Conclusion:

Congratulations! Now you know what LCEL is and what to expect from it. There’s more to LCEL, and it makes many things that used to take a lot of coding effort much easier. I recommend reading the official LCEL documentation.

Furthermore, this blog is just one part of my LangChain blog series. You can check out my other blogs in the ML guide section. Also, feel free to reach out if you have any suggestions or feedback.
