In FastAPI, how route handlers (endpoints) behave in terms of parallelism and concurrency depends on whether they are defined using async def or def, and whether the work inside them is I/O-bound or CPU-bound.
Here are the four combinations of route handlers and how they affect parallel or concurrent handling of requests:
1. async def with async I/O-bound work (e.g., await asyncio.sleep, database calls)

All four examples assume the usual setup: `import asyncio`, `import time`, and a `router = APIRouter()` from FastAPI.

```python
@router.get("/async-io")
async def async_io_route():
    await asyncio.sleep(2)  # yields control; the event loop serves other requests meanwhile
    return {"status": "async io"}
```
2. async def with CPU-bound work (e.g., heavy computation, no await)

```python
@router.get("/async-cpu")
async def async_cpu_route():
    result = sum(i * i for i in range(10**7))  # never awaits: blocks the event loop
    return {"result": result}
```
3. def with CPU-bound work

```python
@router.get("/sync-cpu")
def sync_cpu_route():
    result = sum(i * i for i in range(10**7))  # runs in FastAPI's thread pool
    return {"result": result}
```
4. def with I/O-bound work (e.g., time.sleep)

```python
@router.get("/sync-io")
def sync_io_route():
    time.sleep(2)  # blocks its worker thread for the full 2 seconds
    return {"status": "sync io"}
```
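The difference between cases 1 and 4 is easy to verify outside FastAPI. This minimal sketch (plain asyncio, not the routes above) shows three awaited one-second sleeps completing in about one second total, because they wait concurrently on a single event-loop thread:

```python
import asyncio
import time

async def fake_io(seconds: float) -> str:
    # Stands in for an awaited DB query or HTTP call
    await asyncio.sleep(seconds)
    return "done"

async def main() -> float:
    start = time.perf_counter()
    # All three coroutines wait concurrently on one event-loop thread
    await asyncio.gather(fake_io(1), fake_io(1), fake_io(1))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"{elapsed:.1f}s elapsed")  # ~1.0s, not 3.0s
```

Replacing `await asyncio.sleep(seconds)` with `time.sleep(seconds)` makes the same program take ~3 seconds, which is exactly the case-4 failure mode.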
FastAPI Route Behavior Comparison

1. async def + async I/O (/async-io)
- Handled concurrently: multiple such requests can be in flight at the same time on a single event-loop thread.
- Non-blocking while awaiting.
- Best performance for I/O tasks like DB queries, network calls, and file access.

2. async def + CPU-bound work (/async-cpu)
- Not truly concurrent: the computation never yields to the event loop.
- Blocks the event loop and stalls every other async endpoint.
- Bad practice: offload CPU-bound work to a thread or process pool instead.

3. def + CPU-bound work (/sync-cpu)
- Runs in parallel via a thread pool executor (Starlette/FastAPI does this automatically for def routes).
- Slower than async I/O but doesn't block the event loop.
- Suitable for CPU-bound work when the thread pool size is properly limited.

4. def + blocking I/O (/sync-io)
- Blocks a worker thread for the full duration and wastes resources.
- Neither concurrent nor parallel in a performant way.
- Worst option: avoid blocking I/O in sync routes when an async alternative exists.
| Route Type | I/O Type | Concurrent? | Notes |
|---|---|---|---|
| async def | Async I/O | Yes | Best option for scalable I/O-bound endpoints |
| async def | CPU-bound | No | Blocks the event loop: bad |
| def | CPU-bound | Parallel (thread pool) | Runs in thread pool: acceptable for CPU tasks |
| def | Blocking I/O | No | Blocks threads: worst case, avoid |
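When a blocking call is unavoidable inside async code, the standard-library asyncio.to_thread (Python 3.9+) hands it to a worker thread, which is essentially what FastAPI does for plain def routes. A minimal sketch; blocking_io here is a made-up stand-in for a blocking driver or library call:

```python
import asyncio
import time

def blocking_io() -> str:
    time.sleep(0.5)  # stands in for a blocking driver or library call
    return "ok"

async def main() -> str:
    # The blocking call runs on a thread-pool thread; the event loop
    # stays free to serve other coroutines in the meantime.
    return await asyncio.to_thread(blocking_io)

print(asyncio.run(main()))  # prints "ok" after ~0.5s
```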
Best Practices
- Use async def + await for I/O-bound operations.
- Offload CPU-heavy operations to a thread/process pool (e.g., run_in_executor()).
- Avoid blocking calls like time.sleep() in async def routes; in plain def routes the blocking at least happens off the event loop, but it still occupies a worker thread.
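The offload recommendation can be sketched with loop.run_in_executor; the cpu_task function and the input size are illustrative, not prescribed:

```python
import asyncio

def cpu_task(n: int) -> int:
    # CPU-bound work that would block the event loop if run inline
    return sum(i * i for i in range(n))

async def main() -> int:
    loop = asyncio.get_running_loop()
    # None selects the default ThreadPoolExecutor; passing a
    # concurrent.futures.ProcessPoolExecutor instead gives true
    # multi-core parallelism for heavy computations.
    return await loop.run_in_executor(None, cpu_task, 10**6)

result = asyncio.run(main())
print(result)
```

Note that a thread pool only keeps the event loop responsive; because of the GIL, genuine speedups for pure-Python computation require a process pool.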
Here’s a clear and concise table showing different FastAPI route types, the kind of operation they perform, and whether the request handling is parallel or concurrent:
FastAPI Route Behavior Comparison
| Route Type | Operation Type | Example Code Snippet | Behavior | Notes |
|---|---|---|---|---|
| async def | Async I/O-bound | await asyncio.sleep(1) | Concurrent, non-blocking | Best for DB queries, API calls, file I/O |
| async def | CPU-bound | sum(i * i for i in range(10**7)) | Blocks the event loop | Bad pattern |
| async def | CPU-bound (offload) | await loop.run_in_executor(None, cpu_task) | Runs in thread pool | Does not block the event loop |
| async def | CPU-bound (multi-core) | await loop.run_in_executor(ProcessPoolExecutor(), cpu_task) | Parallel across processes | Uses multiple CPU cores: best for heavy computations |
| def | CPU-bound | sum(i * i for i in range(10**7)) | Runs in thread pool | Doesn't block the event loop |
| def | Blocking I/O | time.sleep(2) | Blocks a worker thread | Wastes threads: avoid blocking I/O in sync functions |
Legend
- Concurrent: multiple tasks share the same thread, interleaved by the event loop (async I/O).
- Parallel: tasks run in separate threads or processes simultaneously.
- Blocking: one task prevents others from proceeding.
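The "Blocking" entry can be demonstrated in a few lines: a coroutine that calls time.sleep never yields, so an unrelated task that should take ~0.1 seconds ends up waiting about a full second for it (a plain-asyncio sketch, not FastAPI-specific):

```python
import asyncio
import time

async def blocker() -> None:
    time.sleep(1)  # blocks the entire event-loop thread

async def main() -> float:
    start = time.perf_counter()

    async def quick() -> float:
        await asyncio.sleep(0.1)  # would take ~0.1s on a free loop
        return time.perf_counter() - start

    # blocker never yields, so quick finishes after ~1.1s, not ~0.1s
    _, elapsed = await asyncio.gather(blocker(), quick())
    return elapsed

elapsed = asyncio.run(main())
print(f"quick task finished after {elapsed:.1f}s")
```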
