
Caching in Python With lru_cache (Summary)

Caching is an essential optimization technique for improving the performance of any software system. Understanding how caching works is a fundamental step toward incorporating it effectively in your applications.

In this video course, you learned:

  • What the different caching strategies are and how they work
  • How to use Python’s @lru_cache decorator (see the sketch after this list)
  • How to create a new decorator to extend the functionality of @lru_cache
  • How to measure your code’s runtime using the time module
  • What recursion is and how to solve a problem using it
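
As a quick refresher tying several of these ideas together, here’s a minimal, self-contained sketch (not the course’s exact code) of @lru_cache applied to a recursive Fibonacci function:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # maxsize=None lets the cache grow without bound
def fibonacci(n):
    # Naive recursive definition; the cache turns exponential work into linear work.
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(36))           # 14930352
print(fibonacci.cache_info())  # hit/miss statistics that the decorator tracks
```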

The next step to implementing different caching strategies in your applications is looking at the cachetools module. This library provides several collections and decorators covering some of the most popular caching strategies that you can start using right away.
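
For example, cachetools ships a TTL (“time to live”) cache that you can apply with its cached decorator. A minimal sketch, assuming the package is installed (pip install cachetools); get_weather is a hypothetical function used only for illustration:

```python
from cachetools import TTLCache, cached

# Keep up to 128 results; each entry expires 600 seconds after insertion.
@cached(cache=TTLCache(maxsize=128, ttl=600))
def get_weather(city):
    # Stand-in for an expensive lookup, such as a network call.
    return f"forecast for {city}"

print(get_weather("Oslo"))  # computed once, then served from the cache
```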



00:00 In the previous lesson, I showed you how to augment the LRU cache to add time-based expiration. In this lesson, I’ll summarize the course and point you at some areas of further investigation.

00:13 You’ve seen how caching can make a big difference in performance. In fact, you saw a speedup of about six orders of magnitude when calculating the first thirty-six values of the Fibonacci sequence. This speedup comes at the cost of memory.
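
To reproduce a comparison like that yourself, here’s a minimal sketch using perf_counter from the time module; the names fib_plain and fib_cached are illustrative, and exact timings will vary by machine:

```python
from functools import lru_cache
from time import perf_counter

def fib_plain(n):
    # Naive recursion: exponential time, no reuse of subproblems.
    return n if n < 2 else fib_plain(n - 1) + fib_plain(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    # Same recursion, but each subproblem is computed only once.
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

for fib in (fib_plain, fib_cached):
    start = perf_counter()
    fib(36)
    print(f"{fib.__name__}: {perf_counter() - start:.6f}s")
```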

00:26 Life is all about trade-offs. Early on in the course, I showed you several different caching policies but focused on LRU, the Least Recently Used policy.

00:36 This policy keeps items that were recently accessed, attempting to maximize temporal locality while keeping the bookkeeping cost low. Python provides an LRU cache through a decorator in the functools module.

00:48 As long as your function’s arguments are hashable, its results can be cached with the addition of an import and a single line of code. And since decorators are just functions, you can compose and extend them by calling them inside other functions.
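
For instance, the hashability requirement means a tuple argument works while a list raises a TypeError, because the decorator builds its cache key by hashing the arguments:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def total(numbers):
    return sum(numbers)

print(total((1, 2, 3)))  # tuples are hashable, so this call is cached
total([1, 2, 3])         # TypeError: unhashable type: 'list'
```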

01:01 You saw how to implement an LRU cache that also had a time expiration feature by calling the LRU decorator inside your own decorator.
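
The course’s version isn’t reproduced here, but a minimal sketch of that pattern might look like the following; the name timed_lru_cache and its seconds parameter are illustrative:

```python
from functools import lru_cache, wraps
from time import monotonic

def timed_lru_cache(seconds, maxsize=128):
    # Illustrative sketch: wrap @lru_cache and clear the whole cache
    # once `seconds` have elapsed since the last reset.
    def decorator(func):
        cached_func = lru_cache(maxsize=maxsize)(func)
        expires_at = monotonic() + seconds

        @wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal expires_at
            if monotonic() >= expires_at:
                cached_func.cache_clear()
                expires_at = monotonic() + seconds
            return cached_func(*args, **kwargs)

        return wrapper

    return decorator

@timed_lru_cache(seconds=10)
def get_quote():
    return "recomputed at most once every 10 seconds"
```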

01:12 For more information on cache policies, you can see this Wikipedia article. If you’d like to learn more about functools, there’s a lesson inside of this longer course, Python Coding Interviews: Tips & Best Practices, that goes into some commonly used parts of the library.

01:29 And if you’d like to learn more about decorators, this is a very detailed course with better explanations than you got in my little tangent. That’s all for me.

01:40 I hope you found this course useful.
