Caching in Python With lru_cache

There are many ways to achieve fast and responsive applications. Caching is one approach that, when used correctly, can make your applications significantly faster while reducing the load on computing resources.

Python’s functools module comes with the @lru_cache decorator, which lets you cache the results of your functions using the Least Recently Used (LRU) strategy. It’s a simple yet powerful way to take advantage of caching in your code.
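
As a quick preview, here’s a minimal sketch of the decorator in action on the classic recursive Fibonacci example. The function name and the maxsize value are only illustrative choices, not code from the course:

    from functools import lru_cache

    @lru_cache(maxsize=128)
    def fibonacci(n):
        # Repeated subproblems are answered from the cache
        # instead of being recomputed.
        if n < 2:
            return n
        return fibonacci(n - 1) + fibonacci(n - 2)

    print(fibonacci(35))           # Returns almost instantly thanks to the cache
    print(fibonacci.cache_info())  # Reports hits, misses, maxsize, and current size

Without the cache, the number of recursive calls grows exponentially with n; with it, each distinct argument is computed only once.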

In this video course, you’ll learn:

  • What caching strategies are available and how to implement them using Python decorators
  • What the LRU strategy is and how it works
  • How to improve performance by caching with the @lru_cache decorator
  • How to expand the functionality of the @lru_cache decorator and make it expire after a specific time (see the sketch below this list)
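
That last topic goes beyond what @lru_cache offers out of the box. As a rough preview, here’s a minimal sketch of one possible approach that clears the whole cache once a lifetime has passed. The timed_lru_cache name, the get_article function, and the 10-second lifetime are illustrative assumptions, not necessarily the implementation the course builds:

    from functools import lru_cache, wraps
    from datetime import datetime, timedelta

    def timed_lru_cache(seconds, maxsize=128):
        """Like @lru_cache, but the whole cache expires after `seconds`."""
        def decorator(func):
            cached = lru_cache(maxsize=maxsize)(func)
            lifetime = timedelta(seconds=seconds)
            expiration = datetime.now() + lifetime

            @wraps(func)
            def wrapper(*args, **kwargs):
                nonlocal expiration
                # Once the lifetime has passed, drop every cached entry
                # and start a new expiration window.
                if datetime.now() >= expiration:
                    cached.cache_clear()
                    expiration = datetime.now() + lifetime
                return cached(*args, **kwargs)

            return wrapper
        return decorator

    @timed_lru_cache(seconds=10)
    def get_article(url):
        # Stand-in for an expensive operation, such as a network request
        return f"Fetched {url}"

Clearing the entire cache on expiry is coarse but keeps the sketch short; per-entry expiration would mean tracking a timestamp for each cached value.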

By the end of this video course, you’ll have a deeper understanding of how caching works and how to take advantage of it in Python.

About Christopher Trudeau

Christopher has a passion for the Python language and writes for Real Python. He is a consultant who advises organizations on how to improve their technical teams.


