Crafting a LangChain prompt template is a piece of cake, and once you understand how to build and use one, it opens up numerous possibilities for user-specific and industry-specific apps, and beyond.
A prompt template is an outline we hand to a language model so that it behaves the way we want in a given situation. These templates are basic structures that we fill in, giving the model enough structure that users reliably get the kind of response we intend.
In this blog, you will learn how to craft a simple prompt template and its building blocks. Later, you will learn how to craft a few-shot prompt template, with which you can do better prompt engineering by showing the model examples of what you expect from it.
So without any delay, let’s begin the show!
How to Craft a Simple LangChain Prompt Template
First off, to craft a LangChain prompt template, ensure you have the LangChain library installed. To install it, use the command below.
pip install langchain
Now you need to import the PromptTemplate class, so import it using the command below.
from langchain.prompts import PromptTemplate
A basic prompt template has two sections to fill in.
- 1. Input variables section
- 2. Template section
This is what the skeleton looks like.
PromptTemplate(
    input_variables=["input variables are listed here"],
    template="this is where our instructions are written {with input variables mentioned here again}"
)
Let’s see how it is used with an example.
personality_template = PromptTemplate(
    input_variables=["Personality"],
    template="Tell me a fun fact about {Personality}"
)
The above LangChain prompt template is designed to get a fun fact from the language model about whichever personality the user supplies, say Elon Musk. We assign it to the personality_template variable so we can reuse it, and next we'll see how the prompt appears to the language model when we pass in Elon Musk.
To check how the prompt will appear to the model, .format is used while passing an input of your own choice; in our case, we are going to pass Elon Musk.
prompt = personality_template.format(Personality="Elon Musk")
print(prompt)
Output:
Tell me a fun fact about Elon Musk
As you saw above, this only shows us how the prompt is going to appear to the model. Prompts don't work alone; they need to be chained to an LLM to produce a response.
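For completeness, here is a minimal sketch of what that chaining could look like, assuming an OpenAI API key is available in your environment (the model and temperature here are illustrative choices, not requirements):
from langchain.llms import OpenAI
from langchain.chains import LLMChain

# Assumes OPENAI_API_KEY is set in the environment
llm = OpenAI(temperature=0.7)
fact_chain = LLMChain(llm=llm, prompt=personality_template)
print(fact_chain.run(Personality="Elon Musk"))
The chain formats the prompt with our input, sends it to the model, and returns the model's reply as a string.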
Let's see another example where we have more than one input variable. This time, let's build the prompt template by defining the template as a separate text string.
story = "Write a story about a {animal} set in the {era}"
story_template = PromptTemplate(
    input_variables=["animal", "era"],
    template=story
)
You can use as many input variables as you want in a prompt template; just add them all to the list, separated by commas. Now let's see how this prompt will look to a language model by passing the input variables with the same .format method and printing the result.
print(story_template.format(animal="rabbit", era="Renaissance"))
Output:
Write a story about a rabbit set in the Renaissance
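As a side note, if you'd rather not list the input variables by hand, the PromptTemplate.from_template classmethod can infer them from the placeholders in the string; a small convenience sketch:
# from_template infers input_variables from the {placeholders} in the string
story_template = PromptTemplate.from_template("Write a story about a {animal} set in the {era}")
print(story_template.input_variables)  # ['animal', 'era']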
LangChain Prompt Template: Few-Shot Prompting
In LangChain, a few-shot prompt template is crafted differently from the simple templates above, so we need to import the FewShotPromptTemplate module along with the PromptTemplate module.
from langchain.prompts import PromptTemplate, FewShotPromptTemplate
A few-shot prompt works by showing the model examples, so let's provide some. We are going to make an English teacher's prompt template; this is how we can give examples.
Our_examples = [
    {"input": "What is the plural form of 'book'?", "output": "books"},
    {"input": "What is the past tense of 'eat'?", "output": "ate"},
    {"input": "What is the present participle of 'listen'?", "output": "listening"},
    {"input": "What is the third person singular form of 'run'?", "output": "runs"},
    {"input": "What is the comparative form of 'good'?", "output": "better"}
]
Note: Few-shot prompting is named as such because we teach the language model a specific task or behavior by showing it examples directly in the prompt, rather than by retraining it. Therefore, it's crucial to pay extra attention while crafting these examples.
We made a list of dictionaries with input and output key-value pairs. You don't need to use these exact keys; replace them to suit your own use cases, for example question and answer, synonyms and antonyms, greetings and goodbyes, or anything else, as in the sketch below.
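For instance, a hypothetical synonyms example set with our own key names could look like this (word and synonym are names we chose for illustration; any keys work as long as the example-formatting template uses the same names):
# Hypothetical alternative: examples keyed by "word" and "synonym"
synonym_examples = [
    {"word": "happy", "synonym": "joyful"},
    {"word": "fast", "synonym": "quick"},
]
# The matching example template would then use {word} and {synonym}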
After creating a list of examples, we define a text string as a template, as usual. This string then serves as a blueprint for a prompt template that tells LangChain how to format each of the examples we created above.
Our_template = """Answer the following questions: {input} ||| {output}"""
The ||| symbol is there purely to make the output easier to read; don't get confused by it. You can use anything you want in its place, such as "the output is {output}" or "the answer to the query is {output}". Just remember that this line repeats once per example, so write it the way you want each example displayed.
This template will later be used within the few-shot prompt template, i.e., a template within a template. Using the text string above, we first create an ordinary prompt template as shown below.
example_prompt_template = PromptTemplate(
    input_variables=["input", "output"],
    template=Our_template
)
In the above prompt template, notice that Our_template's variables, input and output, exactly match the keys in the examples list. This matters because both the examples list we created earlier and this example-formatting prompt template are going to be part of the FewShotPromptTemplate we create below.
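A quick way to sanity-check this mapping is to format a single example through the template (a small verification step of our own, not part of the original walkthrough):
# Format the first example to confirm the keys line up with the template
print(example_prompt_template.format(**Our_examples[0]))
# -> Answer the following questions: What is the plural form of 'book'? ||| books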
# Create the Few-Shot Prompt template
prompt_template = FewShotPromptTemplate(
    examples=Our_examples,
    example_prompt=example_prompt_template,
    prefix="I'm an English teacher and I'm here to help with your language learning.",
    suffix="Let's practice some more questions!\n{input}",
    input_variables=["input"]
)
In the above structure, we have first assigned our examples list to examples and then our entire simple prompt template to example_prompt.
The prefix is the text that comes before the examples in the prompt passed to the LLM, and the suffix is the text that comes after them. Note that the suffix also carries the {input} placeholder, so the user's question lands at the very end of the prompt; LangChain expects every input variable to appear somewhere in the prefix or suffix.
Since we are not invoking an LLM yet, printing the formatted prompt shows exactly the text that would later be sent to the model. Its structure is:
- Prefix
- Examples
- Suffix
First our prefix will appear, followed by our given examples, and then the suffix containing the user's question.
Let’s print it using .format to see how our prompt looks when the input is passed.
# Generate a prompt
prompt = prompt_template.format(input="What is the present tense of 'read'?")
print(prompt)
Output:
I'm an English teacher and I'm here to help with your language learning.
Answer the following questions: What is the plural form of 'book'? ||| books
Answer the following questions: What is the past tense of 'eat'? ||| ate
Answer the following questions: What is the present participle of 'listen'? ||| listening
Answer the following questions: What is the third person singular form of 'run'? ||| runs
Answer the following questions: What is the comparative form of 'good'? ||| better
Let's practice some more questions!
What is the present tense of 'read'?
Our input, "What is the present tense of 'read'?", appears at the end of the prompt because the suffix carries the {input} placeholder; the model will answer it once we run the template through one of LangChain's chains.
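As with the simple template earlier, the few-shot template only becomes useful once it's chained to a model. A minimal sketch, again assuming an OpenAI key in the environment and reusing the llm object from the earlier example:
# Reuses the llm defined in the earlier LLMChain sketch
qa_chain = LLMChain(llm=llm, prompt=prompt_template)
print(qa_chain.run(input="What is the present tense of 'read'?"))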
The FewShotPromptTemplate is another tool in LangChain's prompting toolkit that lets us craft prompt templates tailored to our needs.
Conclusion:
Congratulations! Now that you've learned how to craft a LangChain prompt template, I hope our blog has been helpful to you. To read more blogs on LangChain like this one, please visit our website. Additionally, explore our other blogs to learn how you can connect templates with a simple chain and create a history teacher using the LangChain prompt template. Don't forget to give us your valuable feedback so that we can continue to improve.