multiprocessing
The Python multiprocessing package allows you to run code in parallel by leveraging multiple processors on your machine, effectively sidestepping Python's Global Interpreter Lock (GIL) to achieve true parallelism. The package provides an interface similar to the threading module but uses processes instead of threads.
Here’s a quick example:
>>> import multiprocessing
>>> def worker(num):
...     print(f"Worker: {num}")
...
>>> if __name__ == "__main__":
...     process = multiprocessing.Process(target=worker, args=(1,))
...     process.start()
...     process.join()
...
Worker: 1
Key Features
- Spawns processes using an API similar to the threading module
- Supports local concurrency and remote concurrency with multiprocessing.managers and explicit setups
- Offers process pools for simple parallel task management
- Provides shared data structures such as queues and pipes
Frequently Used Classes and Functions
Object | Type | Description |
---|---|---|
multiprocessing.Process | Class | Spawns a new process |
multiprocessing.Pool | Class | Manages a pool of worker processes |
multiprocessing.Queue | Class | Provides a queue implementation for process-safe data sharing |
multiprocessing.Pipe | Function | Returns a pair of connection objects |
multiprocessing.Lock | Class | Provides a synchronization primitive for process-safe locking |
Examples
Creating a process:
>>> from multiprocessing import Process
>>> def worker():
...     print("Worker process")
...
>>> process = Process(target=worker)
>>> process.start()
>>> process.join()
Worker process
Using a pool of worker processes to compute squares:
>>> from multiprocessing import Pool
>>> def square(x):
...     return x * x
...
>>> with Pool(4) as pool:
...     print(pool.map(square, [1, 2, 3, 4]))
...
[1, 4, 9, 16]
Common Use Cases
- Running CPU-bound tasks in parallel
- Speeding up tasks by utilizing multiple processors
- Managing parallel execution of independent tasks
Real-World Example
Suppose you have a directory of images and you want to apply a computationally expensive transformation to each image in parallel. You can use the multiprocessing package to speed up this task:
>>> import multiprocessing, pathlib
>>> def process_image(image_path):
...     print(f"Processing {image_path}")
...
>>> image_dir = pathlib.Path("images")
>>> with multiprocessing.Pool() as pool:
...     pool.map(process_image, list(image_dir.glob("*.jpg")))
...
In this example, the multiprocessing package helps you distribute the workload across multiple processes, significantly reducing the time needed to process all images in the directory.
Related Resources
Tutorial
Speed Up Your Python Program With Concurrency
In this tutorial, you'll explore concurrency in Python, including multithreaded and asynchronous solutions for I/O-bound tasks and multiprocessing for CPU-bound tasks. By the end, you'll know how to choose the appropriate concurrency model for your program's needs.
For additional information on related topics, take a look at the following resources:
- Bypassing the GIL for Parallel Processing in Python (Tutorial)
- Speed Up Python With Concurrency (Course)
- Python Concurrency (Quiz)
By Leodanis Pozo Ramos • Updated July 15, 2025