Concurrency
Concurrency enables your program to handle multiple tasks that overlap in time, such as making several API requests simultaneously or processing items from a queue.
Used well, tools like threads, processes, and asyncio can improve throughput and responsiveness, especially for I/O-bound workloads. However, concurrency also adds complexity. Shared state, race conditions, and subtle timing bugs can make concurrent code harder to reason about, test, and debug.
If you’re already using concurrency, then you can benefit from applying some of the following best practices:
- Reach for concurrency only when necessary. Start by writing clear synchronous code that works, and then measure its performance. Introduce concurrency only to solve concrete problems, such as slow network calls or CPU-bound work that truly requires parallel execution. This practice helps avoid unnecessary complexity.
- Choose the right concurrency model for the job. Use threading for I/O-bound tasks that rely on blocking libraries. Use multiprocessing for CPU-bound work, since Python’s Global Interpreter Lock (GIL) prevents CPU-bound threads from running in parallel. Use asyncio for concurrent I/O-bound operations when async-friendly libraries are available.
- Don’t block the event loop in async code. When using asyncio, avoid blocking calls inside async functions. Use async-compatible libraries or offload blocking work to threads or processes. Otherwise, you lose the benefits of asynchronous execution while keeping its complexity.
- Limit shared mutable state. Prefer message passing, queues, and immutable data over multiple threads or tasks mutating the same objects. When shared mutable state is unavoidable, use synchronization primitives, such as locks, queues, and events, deliberately and document the design. This practice reduces race conditions and subtle timing bugs.
- Use high-level primitives instead of rolling your own. Rely on built-in tools like asyncio.gather(), concurrent.futures, queue.Queue, and asyncio.Queue instead of hand-rolled worker loops and shared lists. High-level abstractions are easier to reason about and maintain, as the sketch after this list shows.
- Plan for cancelation, errors, and shutdown. Make sure tasks can be canceled cleanly, handle exceptions in worker threads or async tasks, and ensure that any pools, sessions, or executors are shut down properly when your program exits. This practice makes your code more robust and predictable.
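To make the last two points concrete, here’s a minimal sketch of relying on high-level primitives and getting a clean shutdown for free. It uses concurrent.futures.ThreadPoolExecutor instead of a hand-rolled worker loop; the URLS list and the check_url() helper are made-up names for illustration:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URLS = ["https://example.com", "https://python.org"]

def check_url(url):
    # Blocking I/O is fine here because each call runs in a worker thread.
    with urlopen(url) as response:
        return url, response.status

# The context manager waits for pending work and shuts the pool down,
# even if an exception is raised inside the block.
with ThreadPoolExecutor(max_workers=5) as executor:
    for url, status in executor.map(check_url, URLS):
        print(url, status)
```

For CPU-bound work, you could swap ThreadPoolExecutor for ProcessPoolExecutor and keep essentially the same structure, which moves the work into separate processes and around the GIL.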
To see one of these issues in practice, compare the following two snippets that fetch HTTP status codes concurrently:
🔴 Avoid this:
blocking_io.py
```python
import asyncio
import requests

async def main():
    await asyncio.gather(
        fetch_status("https://example.com"),
        fetch_status("https://python.org"),
    )

async def fetch_status(url):
    response = requests.get(url)  # Blocking I/O call
    return response.status_code

asyncio.run(main())
```
This code looks asynchronous, but it isn’t. The call to requests.get() blocks the async event loop, so other tasks can’t run while each request is in progress. As you add more concurrent work, the program becomes less responsive rather than more.
✅ Favor this:
```python
import asyncio
import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        statuses = await asyncio.gather(
            fetch_status(session, "https://example.com"),
            fetch_status(session, "https://realpython.com"),
        )
        print(statuses)

async def fetch_status(session, url):
    async with session.get(url) as response:  # Non-blocking I/O call
        return response.status

asyncio.run(main())
```
In this version, you use the aiohttp library, which integrates seamlessly with asyncio. In main(), you create a single ClientSession and pass it into each task, allowing the event loop to interleave requests efficiently. As a result, the program stays responsive and scales much better as you add more concurrent I/O work.
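If you’re stuck with a blocking library like requests and can’t switch to an async-friendly one, the earlier advice still applies: offload the blocking work to threads. Here’s a minimal sketch of that fallback, assuming Python 3.9 or later for asyncio.to_thread():

```python
import asyncio
import requests

def fetch_status(url):
    response = requests.get(url)  # Blocking call, but it runs in a worker thread
    return response.status_code

async def main():
    statuses = await asyncio.gather(
        asyncio.to_thread(fetch_status, "https://example.com"),
        asyncio.to_thread(fetch_status, "https://python.org"),
    )
    print(statuses)

asyncio.run(main())
```

This keeps the familiar requests API while leaving the event loop free, but each in-flight request ties up a thread, so it doesn’t scale as far as a truly asynchronous client like aiohttp.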
Related Resources
Tutorial
Speed Up Your Python Program With Concurrency
In this tutorial, you'll explore concurrency in Python, including multi-threaded and asynchronous solutions for I/O-bound tasks, and multiprocessing for CPU-bound tasks. By the end of this tutorial, you'll know how to choose the appropriate concurrency model for your program's needs.
For additional information on related topics, take a look at the following resources:
- Python 3.14 Release Candidate Lands: Faster Code, Smarter Concurrency (Tutorial)
- An Intro to Threading in Python (Tutorial)
- Python Thread Safety: Using a Lock and Other Techniques (Tutorial)
- What Is the Python Global Interpreter Lock (GIL)? (Tutorial)
- Bypassing the GIL for Parallel Processing in Python (Tutorial)
- Python's asyncio: A Hands-On Walkthrough (Tutorial)
- Asynchronous Iterators and Iterables in Python (Tutorial)
- Getting Started With Async Features in Python (Tutorial)
- Speed Up Python With Concurrency (Course)
- Python Concurrency (Quiz)
- Threading in Python (Course)
- Python Threading (Quiz)
- Thread Safety in Python: Locks and Other Techniques (Course)
- Python Thread Safety: Using a Lock and Other Techniques (Quiz)
- Understanding Python's Global Interpreter Lock (GIL) (Course)
- What Is the Python Global Interpreter Lock (GIL)? (Quiz)
- Hands-On Python 3 Concurrency With the asyncio Module (Course)
- Python's asyncio: A Hands-On Walkthrough (Quiz)
- Exploring Asynchronous Iterators and Iterables (Course)
- Asynchronous Iterators and Iterables in Python (Quiz)
- Getting Started With Async Features in Python (Quiz)