Measuring Execution Time in the multiprocessing Testbed

In this lesson, you’ll extend your testbed to add some logging so you can trace how long it takes to calculate your result. You’ll measure the execution time with the time.time() function, which you’ll use to compare the sequential and parallel implementations of the same algorithm.

In the next lesson, you’ll take a look at the multiprocessing.Pool class and its parallel map implementation, which make it a lot easier to parallelize most Python code that is written in a functional style.
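For context, here is a minimal sketch of what that can look like with multiprocessing.Pool. The double() function and the input list are illustrative stand-ins, not the testbed’s actual code:

```python
from multiprocessing import Pool

def double(record):
    # Stand-in for the real per-record work.
    return record * 2

if __name__ == "__main__":
    with Pool() as pool:
        # Pool.map() mirrors the built-in map() but spreads the
        # calls across multiple worker processes.
        result = pool.map(double, [1, 2, 3, 4, 5, 6, 7])
    print(result)  # [2, 4, 6, 8, 10, 12, 14]
```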

00:00 Before we move on, I want to extend this testbed a little bit more because I want to add some more logging so that we can actually trace how long it took to calculate this result.

00:09 So, what I’m going to do here—I’m going to take the start time before we apply the map operation,

00:18 and then I’m also going to measure the end time. What time.time() does—it just gives you a seconds-based timestamp as a float, right? So, what we’re going to do here is we’re going to print the time to completion

00:33 and we’re just going to calculate that as end - start. That’s going to give me the float in seconds that it took to run this piece of code here. Now when I save this, we can run this again. It’s going to very slowly process all of these records, one by one.

00:55 We’re simulating here that each record takes up to 1 second to process, and then it’s printing out, “Okay, this took seven seconds, and a little bit more,” which makes sense because we have seven records in here.
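Here is a rough sketch of the timing approach described above. The process_record() function and records list are hypothetical stand-ins for the testbed’s own processing step and input data:

```python
import time

def process_record(record):
    # Simulate slow, record-by-record processing (up to one second each).
    time.sleep(1)
    return record * 2

records = [1, 2, 3, 4, 5, 6, 7]

start = time.time()  # seconds since the epoch, as a float
result = list(map(process_record, records))
end = time.time()

print("Time to complete:", end - start)  # roughly 7 seconds for 7 records
```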

01:07 I’m just going to polish that a little bit more…

01:12 just to make sure we have this nicely formatted. So, we can use a format string here, and then run this again, do our timing. Now, we get the input data,

01:24 we get the processing as it happens—this is logging some stuff—and it tells us, “Hey, it took this long to calculate the result and here it is in seconds.” We can make this a little nicer, just limit it to two decimals, and have a really nice output.
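For example, a format string can limit the elapsed time to two decimals (using the same hypothetical start and end variables as in the sketch above):

```python
print(f"Time to complete: {end - start:.2f} seconds")
```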

01:44 I often like to build stuff like that if I’m experimenting with something, and it really helps me get what I want from these experiments and from these analyses. All right.

01:53 So now, we’ve got everything we need here. We’ve got the strictly sequential implementation of this program, and we’ve also already imported the multiprocessing library.

dvorobej on April 13, 2020

multiprocessing doesn’t seem to be working in Jupyter Notebooks. Dan, any advice on how to overcome it? Thanks

Dan Bader RP Team on April 13, 2020

Hmm, I’ve never tried running multiprocessing tasks inside a Jupyter Notebook; it’s possible that that’s simply unsupported. You may need to run your multiprocessing code from the command line or in a different REPL environment.

As an alternative, I found this third-party library called nbmultitask that provides an interactive Jupyter Notebook widget for controlling parallel execution. Hope that helps you out :)

Muhammad Tayyab Asghar on June 7, 2020

@Dan Bader this course is great, but one thing I hate is the sound of “Mark as Complete”; it really gives me a headache. The video is complete, you’re still thinking about the lesson, and “Boom”, there is that sound. Can you please tell me how to turn it off?

Arif Zuhairi on Oct. 9, 2020

So, is the imported multiprocessing module used in this video part?

Dan Bader RP Team on Oct. 9, 2020

Nope, but we’ll start using multiprocessing in the next video :)
