Async I/O
Concurrency without threads — async/await, the event loop, and when (and when not) to use them.
The problem async solves
Most programs spend the majority of their time waiting — for a database to respond, an HTTP request to come back, a file to load. While one request is waiting, the CPU is idle. With synchronous code, your program just blocks.
Async I/O is a way to keep the CPU busy by interleaving many waiting tasks on a single thread. While one task is waiting on the network, another runs. When the network responds, the first task resumes. This isn't magic — it works because most tasks ARE waiting most of the time.
Async vs threads vs processes
- Async (asyncio) — single thread, cooperative. Best for I/O-bound work with thousands of concurrent operations.
- Threads — multiple threads, preemptive. Easy to share state but limited by Python's Global Interpreter Lock (GIL).
- Processes (multiprocessing) — multiple processes, no GIL. Best for CPU-bound work that can be parallelized.
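In practice the boundaries blur: a mostly-async program often has one blocking call it can't avoid, such as a sync database driver or a legacy library. A minimal sketch of pushing such a call onto a thread with `asyncio.to_thread` (Python 3.9+) so the event loop stays free — `blocking_io` here is a made-up stand-in for the real blocking call:

```python
import asyncio
import time

def blocking_io():
    # Stand-in for a blocking call (e.g. a sync DB driver).
    time.sleep(0.2)
    return "io done"

async def main():
    # Run the blocking call in the default thread pool so the
    # event loop can keep scheduling other coroutines meanwhile.
    result = await asyncio.to_thread(blocking_io)
    return result

print(asyncio.run(main()))
```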
async / await
Mark a function `async def` to make it a coroutine. Inside, you can use `await` to pause until another coroutine finishes. The event loop schedules everything.
```python
import asyncio
import time

async def fetch(name, delay):
    print(f"{name} starting")
    await asyncio.sleep(delay)  # pretends to be a slow network call
    print(f"{name} done")
    return f"{name} result"

async def main():
    start = time.perf_counter()
    # Run three tasks CONCURRENTLY — total time ≈ max(delays), not the sum
    results = await asyncio.gather(
        fetch("a", 1.0),
        fetch("b", 2.0),
        fetch("c", 0.5),
    )
    print(results)
    print(f"Total: {time.perf_counter() - start:.2f}s")

asyncio.run(main())
```

If you ran those three fetches sequentially, it would take 3.5 seconds. Concurrently, it takes about 2.0 seconds — the slowest one sets the total. That's the entire payoff.
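For contrast, here is a sketch of the sequential version of the same three fetches: awaiting each one before starting the next serializes the waits, which is where the 3.5 seconds comes from.

```python
import asyncio
import time

async def fetch(name, delay):
    await asyncio.sleep(delay)  # pretends to be a slow network call
    return f"{name} result"

async def main():
    start = time.perf_counter()
    # Each await finishes before the next begins:
    # total ≈ 1.0 + 2.0 + 0.5 = 3.5 seconds
    a = await fetch("a", 1.0)
    b = await fetch("b", 2.0)
    c = await fetch("c", 0.5)
    print([a, b, c])
    print(f"Total: {time.perf_counter() - start:.2f}s")

asyncio.run(main())
```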
The golden rule
You can `await` only inside an `async def` function. And once a function is async, its callers must either be async themselves (and `await` it) or start an event loop with `asyncio.run(...)` at the top level. This means async tends to spread through your codebase — a phenomenon known as "function coloring".
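A small sketch of what that boundary looks like in practice — `get_data` and `handler` are hypothetical names; the point is that `await` chains upward through the async functions until a single `asyncio.run` call at the sync/async border:

```python
import asyncio

async def get_data():
    await asyncio.sleep(0.1)  # placeholder for real I/O
    return {"ok": True}

async def handler():
    # Async code calls async code with await...
    return await get_data()

def main():
    # ...but plain sync code needs an entry point:
    # asyncio.run starts the event loop and drives the coroutine.
    return asyncio.run(handler())

print(main())
```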
A real-world example: many HTTP requests
```python
import asyncio
import aiohttp  # pip install aiohttp

async def fetch_one(session, url):
    async with session.get(url) as resp:
        return url, resp.status

async def main(urls):
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_one(session, u) for u in urls]
        return await asyncio.gather(*tasks)

urls = ["https://example.com"] * 50
results = asyncio.run(main(urls))
print(results[:3])
```

Fifty HTTP requests, fired off concurrently, in essentially the time of one — because they all sit in `await` together while the network does its thing.
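One caveat: firing hundreds of requests at once can exhaust sockets or trip a server's rate limits. A common refinement (not part of the example above) is to cap the number of in-flight tasks with `asyncio.Semaphore`. This sketch uses `asyncio.sleep` as a stand-in for the real `session.get` so it runs without aiohttp:

```python
import asyncio

async def fetch_bounded(sem, url):
    # Only `limit` coroutines can be inside this block at once;
    # the rest wait at the semaphore.
    async with sem:
        await asyncio.sleep(0.01)  # stand-in for session.get(url)
        return url, 200

async def main(urls, limit=10):
    sem = asyncio.Semaphore(limit)
    return await asyncio.gather(*(fetch_bounded(sem, u) for u in urls))

results = asyncio.run(main([f"https://example.com/{i}" for i in range(50)]))
print(len(results))  # prints 50
```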