How to build a background task manager with FastAPI
The real-world scenario
Imagine you are a Data Analyst or a DevOps Engineer responsible for processing massive 1GB log files or converting complex CSV reports into JSON. If you trigger this via a standard web request, the connection will likely time out before the work finishes. It is like standing at a coffee shop counter and waiting for the beans to be roasted, ground, and brewed while a line forms behind you. Instead, you want a system where you hand over the file, receive a **Task ID**, and walk away. This script creates an automated **Background Worker** that handles heavy lifting while keeping the API responsive.
The solution
We leverage the **FastAPI** framework and its **BackgroundTasks** utility. This lets the server return an immediate success response and then execute the processing logic after the response has been sent; because the task is a plain `def` function, FastAPI runs it in a worker thread so it never blocks the event loop. We use **pathlib** for robust, cross-platform file handling and **uuid** to give every task a unique identity.
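Before wiring these pieces into FastAPI, here is a minimal standalone sketch of the **uuid** + **pathlib** combination: a collision-proof task ID plus an OS-independent path for the uploaded file (the `uploads` directory and `.csv` suffix are just illustrative choices).

```python
from pathlib import Path
from uuid import uuid4

# Each task gets a collision-proof identity; pathlib's "/" operator
# builds OS-appropriate paths without manual string joining.
task_id = str(uuid4())
upload_dir = Path("uploads")
local_path = upload_dir / f"{task_id}.csv"

print(len(task_id))       # 36 (canonical UUID string length)
print(local_path.suffix)  # .csv
```

Because the UUID, not the user-supplied filename, names the file on disk, two simultaneous uploads of `report.csv` can never overwrite each other.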
Prerequisites
Ensure you have Python 3.9 or higher installed. Install the required dependencies using the terminal:
- pip install fastapi
- pip install uvicorn
- pip install python-multipart
The code
"""
-----------------------------------------------------------------------
Authors: Sharanam & Vaishali Shah
Recipe: FastAPI Background File Processor
Intent: Automate long-running file tasks with non-blocking status tracking.
-----------------------------------------------------------------------
"""
import time
import shutil
from pathlib import Path
from uuid import uuid4
from typing import Dict
from fastapi import FastAPI, BackgroundTasks, UploadFile, File
app = FastAPI()
# Define local directories using pathlib
BASE_DIR = Path(__file__).resolve().parent
UPLOAD_DIR = BASE_DIR / **uploads**
RESULT_DIR = BASE_DIR / **results**
# Ensure directories exist
UPLOAD_DIR.mkdir(exist_ok=True)
RESULT_DIR.mkdir(exist_ok=True)
# In-memory database to track task status
tasks_db: Dict[str, str] = {}
def process_file_task(task_id: str, file_path: Path):
"""
Simulates a heavy processing task like data cleaning or conversion.
"""
tasks_db[task_id] = **Processing**
try:
# Simulate a long-running operation (e.g., 10 seconds)
time.sleep(10)
# Define the output path
output_file = RESULT_DIR / f"processed_{task_id}.txt"
# Read the uploaded file and write 'processed' content
content = file_path.read_text()
output_file.write_text(f"PROCESSED CONTENT:n{content.upper()}")
# Update status
tasks_db[task_id] = **Completed**
except Exception as e:
tasks_db[task_id] = f"Failed: {str(e)}"
@app.post(**/upload**)
async def upload_file(background_tasks: BackgroundTasks, file: UploadFile = File(...)):
# Generate a unique identity for the task
task_id = str(uuid4())
file_extension = Path(file.filename).suffix
local_path = UPLOAD_DIR / f"{task_id}{file_extension}"
# Save the uploaded file to disk
with local_path.open(**wb**) as buffer:
shutil.copyfileobj(file.file, buffer)
# Register the task to run in the background
tasks_db[task_id] = **Pending**
background_tasks.add_task(process_file_task, task_id, local_path)
return {
**message**: **File uploaded successfully**,
**task_id**: task_id,
**status_endpoint**: f"/status/{task_id}"
}
@app.get(**/status/{task_id}**)
async def get_status(task_id: str):
# Retrieve status from the in-memory dictionary
status = tasks_db.get(task_id, **Not Found**)
return {**task_id**: task_id, **status**: status}
if __name__ == **__main__**:
import uvicorn
uvicorn.run(app, host=**127.0.0.1**, port=8000)
Code walkthrough
The script begins by setting up two directories: **uploads** for incoming files and **results** for processed output. We use **pathlib** to ensure this works seamlessly across Windows, Mac, and Linux. The **tasks_db** dictionary acts as a volatile database to keep track of whether a job is **Pending**, **Processing**, or **Completed**.
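Stripped of FastAPI, the fire-and-forget pattern behind **tasks_db** can be sketched with nothing but the standard library: a worker thread flips the status entries while the caller stays free. The names below (`tasks_db`, `process`, `demo-task`) mirror the article's code but this is an analogy, not how FastAPI schedules tasks internally.

```python
import threading
import time
from typing import Dict

# Stand-in for the article's in-memory status dictionary.
tasks_db: Dict[str, str] = {}

def process(task_id: str) -> None:
    # The worker updates the shared status while the caller stays free.
    tasks_db[task_id] = "Processing"
    time.sleep(0.1)  # stand-in for the heavy file processing
    tasks_db[task_id] = "Completed"

task_id = "demo-task"
tasks_db[task_id] = "Pending"
worker = threading.Thread(target=process, args=(task_id,))
worker.start()  # returns immediately, like background_tasks.add_task
# ...a request handler could return its JSON response at this point...
worker.join()   # only for the demo, so the script waits for the result
print(tasks_db[task_id])  # → Completed
```

The key observation is that `worker.start()` returns at once, which is exactly why the `/upload` handler can answer the client before the ten-second job finishes.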
When you hit the **/upload** endpoint, the script saves your file and generates a **UUID**. Instead of processing the file immediately, it calls **background_tasks.add_task**, which tells FastAPI to send the JSON response to the user right away while **process_file_task** runs in the background. The **/status/{task_id}** endpoint lets the user poll for updates until the work is done.
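On the client side, polling boils down to a short loop that stops on a terminal status. In this sketch the HTTP call to `/status/{task_id}` is stubbed out with a plain callable so the example stays self-contained; `poll_status` and `fetch` are illustrative names, not part of the article's API.

```python
import time
from typing import Callable, Iterator

def poll_status(fetch: Callable[[], str], interval: float = 0.05,
                timeout: float = 5.0) -> str:
    """Call fetch() until it reports a terminal state or the timeout hits."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch()
        # "Completed" and "Failed: ..." are the terminal states used
        # by the recipe's process_file_task.
        if status == "Completed" or status.startswith("Failed"):
            return status
        time.sleep(interval)
    return "Timed out"

# Stub standing in for GET /status/{task_id}: reports "Processing"
# twice, then "Completed".
responses: Iterator[str] = iter(["Processing", "Processing", "Completed"])
print(poll_status(lambda: next(responses)))  # → Completed
```

In a real client you would replace the stub with an HTTP GET to the status endpoint; the loop's structure stays the same.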
Sample output
After running the script with **uvicorn**, you can test it using **curl** or Postman. When you upload a file, the terminal logs the request, and you receive this response:
{
  "message": "File uploaded successfully",
  "task_id": "550e8400-e29b-41d4-a716-446655440000",
  "status_endpoint": "/status/550e8400-e29b-41d4-a716-446655440000"
}
Checking the status immediately after gives:
{
  "task_id": "550e8400-e29b-41d4-a716-446655440000",
  "status": "Processing"
}
Conclusion
You have built a professional-grade asynchronous file processor. This pattern is essential for any modern application that handles data engineering or system automation. By separating the upload from the processing logic, you ensure your application remains fast, stable, and user-friendly even under heavy computational loads.
🚀 Don’t Just Learn FastAPI — Master It.
This tutorial was just the tip of the iceberg. To truly advance your career and build professional-grade systems, you need the full architectural blueprint.
My book, FastAPI Crash Course, takes you from “making it work” to “making it scale.” I cover advanced patterns, real-world case studies, and the industry best practices that senior engineers use daily.