
Getting to Know Prompt Templates

00:00 In the previous lesson, you saw that LangChain uses custom classes to represent different types of messages. Now, in this lesson, you’ll take a step up in abstraction and look at prompt templates, which allow you to create dynamic prompts that you can reuse, and which bring some consistency to the strings that you’re sending off to an LLM to get a response from.

00:25 With prompt templates, you can create templates that you then dynamically fill with changing content. Okay, let’s head over to the REPL and try this out in practice.

00:36 So I’ll start off by defining a template string.

00:41 And in here I’m just going to write, “You’re an expert on” and then put in curly braces, which act as placeholders in this prompt template. You’ll be familiar with those if you’ve worked with string interpolation in Python before.

00:57 Python also uses these curly braces for interpolation in the .format() string method, and also in f-strings. So, “You’re an expert on” a certain topic, and so on.

01:08 There’s some space here. And then you will put in another placeholder that’s going to hold the context,

01:15 and then a question. So you’re basically defining a template that is currently just a string that shows a couple of placeholders. And in this case, that would be a topic, a context, and a question.
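The template string built up in these steps might look like the following. The exact wording is assumed from context; the lesson dictates only the three placeholder names:

```python
# A plain string with three placeholders: topic, context, and question.
# The curly braces mark the spots that get filled in later.
template_string = (
    "You're an expert on {topic}.\n\n"
    "{context}\n\n"
    "{question}"
)
```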

01:31 And now you can create a prompt template, but first you’ll need to import that. So from langchain.prompts, import PromptTemplate.

01:43 And now I’m just going to say prompt_template equals PromptTemplate.

01:52 So instantiate a PromptTemplate object by calling its class method from_template() and then passing in the template string.

02:03 So these are convenience methods, class methods that allow for alternative instantiation. And when you pass in the string, it creates a prompt template for you from the template. Oh, template is not defined.

02:15 Yes, because I called it template_string. And I also mistyped prompt_template, let me fix that. Okay, prompt_template.

02:27 What does it look like? Similar to before, you can see that the __repr__ returns the class name. And then you’ll see this interesting input_variables attribute that clearly collected ‘context’, ‘question’, and ‘topic’.

02:42 So those are all the words that you put into these curly-brace placeholders, and then it’s got a couple of other things. And it also gives you back the template text exactly.
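Collecting the placeholder names is something you can also do with Python’s standard library, which hints at what from_template() is doing when it builds input_variables. A rough stdlib-only sketch:

```python
from string import Formatter

template_string = "You're an expert on {topic}.\n\n{context}\n\n{question}"

# Formatter.parse() yields (literal_text, field_name, format_spec,
# conversion) tuples; the non-None field names are the placeholders.
input_variables = sorted(
    field
    for _, field, _, _ in Formatter().parse(template_string)
    if field is not None
)
print(input_variables)  # ['context', 'question', 'topic']
```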

02:52 So again, you have a custom class here that allows you to do a lot of things later on. For example, just to show how LangChain then eventually uses this, you can fill this template.

03:05 Going to call it the filled_prompt. We’ll be using the prompt_template and then calling .format() on it. And then I pass in topic there.

03:18 So this is going to be an expert on user feedback and context will be, “I love it here!”

03:32 And the question, “Any positive reviews?”

03:44 And now LangChain fills this prompt using string interpolation, right? filled_prompt looks like this: “You’re an expert on User Feedback. I love it here! Any positive reviews?” And this is the constructed text.

03:57 This is now just a string.

04:01 So once you call .format() on it, then it converts it to a string where it interpolates the values in there and then yeah, constructs the string object.
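Under the hood, calling .format() on the prompt template behaves just like Python’s own string formatting. A plain-string equivalent of the fill step, with the values taken from the lesson:

```python
# The template string from earlier (wording assumed).
template_string = (
    "You're an expert on {topic}.\n\n"
    "{context}\n\n"
    "{question}"
)

# Filling the placeholders yields an ordinary str object.
filled_prompt = template_string.format(
    topic="User Feedback",
    context="I love it here!",
    question="Any positive reviews?",
)
print(filled_prompt)
```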

04:09 And now you can pass this again to the chat model. You can say chat_model.invoke(), and then pass in the filled_prompt. And as a response,

04:26 well again, an AI message that I guess reflects that I didn’t construct this prompt very well.

04:33 Should we try that again? Okay. Let’s look at the prompt template once more. And I’m going to say, Here is a user review:

04:47 Please answer the following question

04:53 concisely in one word: Okay, so there’s a different template string. And now let me again, construct the filled_prompt.

05:08 Oh no, that’s still the content of the old filled_prompt. Ah yes. It’s because I’m still formatting the old prompt template. So I first need to recreate the PromptTemplate object with the updated template string.

05:23 Okay, there it is. Fill the prompt again. And now filled_prompt has a bit more context. So “Here is a user review: I love it here”. And now let’s try to send that again to the chat model.
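The improved template adds a label around the review and an instruction to answer in one word. Note that the fix required recreating the template from the new string, since the old object still held the old wording. A plain-string sketch, with the exact phrasing assumed from the transcript:

```python
# The improved template string: same placeholders, more instructions.
improved_template = (
    "You're an expert on {topic}.\n\n"
    "Here is a user review:\n{context}\n\n"
    "Please answer the following question concisely in one word:\n"
    "{question}"
)

# Refill with the *new* template -- reusing the old one would
# silently keep the old wording.
filled_prompt = improved_template.format(
    topic="User Feedback",
    context="I love it here!",
    question="Any positive reviews?",
)
```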

05:40 Let’s see what we get as a response. ‘Yes.’ Well that’s a beautiful response, right? Okay, so of course, as you can see, I had to redo a bunch of things to improve my template, but this is actually a pretty good process, right?

05:56 If this was a script, I’d just have to go to where I’m defining the template string and add these two instructions to it, and the rest of the code could stay the same, right?

06:06 I just rerun everything there. The inputs were still the same. Let’s assume they’re coming from somewhere else. Like maybe the question comes from a user or the context comes from a database, right?

06:16 So this is a bunch of user reviews, and you’re throwing in, I don’t know, maybe a hundred, or maybe just one at a time. But you’re throwing in different context, and there may be different questions.

06:27 And with this setup, you have a template that you can edit in one spot like I just did. And that can otherwise handle a lot of different requests and a lot of different questions.

06:38 So that’s some of the power that you get from using prompt templates. And there are specific types of prompt templates as well that go together with the types of messages that you’ve seen before.

06:48 And let’s look at those in the next lesson.

Odysseas Kouvelis on May 24, 2025

I have a feeling that langchain.prompts is deprecated:

from langchain.prompts import PromptTemplate

Currently (as indicated by the requirements.txt specifying LangChain > 0.3), the documentation suggests:

from langchain_core.prompts import ChatPromptTemplate, PromptTemplate

Additionally, I was under the impression that using a chat model calls for the use of ChatPromptTemplate, whereas PromptTemplate is intended for completion models.

What is the intuition behind choosing PromptTemplate over ChatPromptTemplate?