Uploading large files is often painful, especially with unstable networks. But what if your app could gracefully resume interrupted uploads and retry failed chunks automatically?
In this post, we’ll build a complete FastAPI project with a frontend uploader that:
- Splits files into chunks
- Retries failed uploads
- Resumes uploads from where they stopped
- Merges chunks into a single file
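To make the first bullet concrete before we build anything, here is the chunk arithmetic in miniature, using the 1 MB chunk size the frontend adopts later (the 10.5 MB file size is just an illustrative example):

```python
# Chunk arithmetic for a hypothetical 10.5 MB file with 1 MB chunks.
CHUNK_SIZE = 1024 * 1024  # 1 MB, matching the frontend below

file_size = int(10.5 * 1024 * 1024)                        # 11,010,048 bytes
total_chunks = (file_size + CHUNK_SIZE - 1) // CHUNK_SIZE  # ceiling division -> 11
last_chunk = file_size - (total_chunks - 1) * CHUNK_SIZE   # 524,288 bytes (0.5 MB)
print(total_chunks, last_chunk)  # 11 524288
```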
Stack Overview
| Component | Description |
|---|---|
| FastAPI | Backend API handling |
| aiofiles | Async file writing |
| Uvicorn | ASGI server |
| JavaScript | Frontend chunk uploader |
| LocalStorage | Optional client-side progress cache (this demo queries the server for resume state instead) |
Project Structure
```text
chunked_uploader/
├── backend/
│   ├── main.py
│   └── uploads/
│       ├── temp/
│       └── completed/
├── frontend/
│   └── index.html
└── requirements.txt
```
Step 1: Backend Implementation (FastAPI)
Install the dependencies:

```bash
pip install fastapi uvicorn aiofiles python-multipart
```
Or, using requirements.txt:

```text
fastapi
uvicorn
aiofiles
python-multipart
```
Backend Logic (main.py)
```python
from fastapi import FastAPI, UploadFile, Form
from fastapi.responses import JSONResponse
from fastapi.middleware.cors import CORSMiddleware
import aiofiles
import os

app = FastAPI()

# Allow the static frontend (any origin) to call the API during development.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

TEMP_DIR = "backend/uploads/temp"
FINAL_DIR = "backend/uploads/completed"
os.makedirs(TEMP_DIR, exist_ok=True)
os.makedirs(FINAL_DIR, exist_ok=True)


@app.post("/upload-chunk/")
async def upload_chunk(
    file: UploadFile,
    filename: str = Form(...),
    chunk_number: int = Form(...),
):
    # Each chunk gets its own .partN file, which is what makes resuming possible.
    temp_path = os.path.join(TEMP_DIR, f"{filename}.part{chunk_number}")
    async with aiofiles.open(temp_path, "wb") as out_file:
        content = await file.read()
        await out_file.write(content)
    return JSONResponse({"message": f"Chunk {chunk_number} uploaded."})


@app.post("/merge-chunks/")
async def merge_chunks(filename: str = Form(...), total_chunks: int = Form(...)):
    output_path = os.path.join(FINAL_DIR, filename)
    async with aiofiles.open(output_path, "wb") as merged:
        for i in range(total_chunks):
            chunk_path = os.path.join(TEMP_DIR, f"{filename}.part{i}")
            if not os.path.exists(chunk_path):
                return JSONResponse(status_code=400, content={"error": f"Missing chunk {i}"})
            async with aiofiles.open(chunk_path, "rb") as chunk:
                await merged.write(await chunk.read())
    # Clean up the temporary chunk files once the merge succeeds.
    for i in range(total_chunks):
        os.remove(os.path.join(TEMP_DIR, f"{filename}.part{i}"))
    return {"message": f"{filename} successfully merged."}


@app.get("/uploaded-chunks/")
async def uploaded_chunks(filename: str):
    """Return a sorted list of chunk indexes that have already been uploaded."""
    prefix = f"{filename}.part"  # match the full "<name>.part" prefix, not just the filename
    chunks = [
        int(f[len(prefix):])
        for f in os.listdir(TEMP_DIR)
        if f.startswith(prefix)
    ]
    return {"uploaded_chunks": sorted(chunks)}
```
Step 2: Frontend Implementation (index.html)
```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Resumable Chunked Uploader</title>
  <style>
    body { font-family: Arial; padding: 40px; }
    .progress-bar { width: 100%; height: 20px; background: #eee; border-radius: 5px; margin-top: 10px; }
    .progress-fill { height: 100%; width: 0%; background: #4caf50; border-radius: 5px; transition: width 0.3s; }
  </style>
</head>
<body>
  <h2>Upload File in Chunks with Resume Support</h2>
  <input type="file" id="fileInput">
  <button onclick="startUpload()">Start Upload</button>
  <div class="progress-bar"><div class="progress-fill" id="progressFill"></div></div>
  <p id="status"></p>

  <script>
    const CHUNK_SIZE = 1024 * 1024; // 1MB

    async function startUpload() {
      const fileInput = document.getElementById("fileInput");
      const file = fileInput.files[0];
      const progressFill = document.getElementById("progressFill");
      const status = document.getElementById("status");

      if (!file) {
        alert("Please select a file.");
        return;
      }

      const totalChunks = Math.ceil(file.size / CHUNK_SIZE);

      // Ask the server which chunks it already has, so we only upload the rest.
      const uploaded = await fetch(`http://localhost:8000/uploaded-chunks/?filename=${encodeURIComponent(file.name)}`)
        .then(res => res.json())
        .then(data => new Set(data.uploaded_chunks));

      for (let i = 0; i < totalChunks; i++) {
        if (uploaded.has(i)) continue; // already on the server: skip (resume)

        let retries = 3;
        while (retries > 0) {
          const chunk = file.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE);
          const formData = new FormData();
          formData.append("file", chunk);
          formData.append("filename", file.name);
          formData.append("chunk_number", i);

          try {
            const res = await fetch("http://localhost:8000/upload-chunk/", {
              method: "POST",
              body: formData
            });
            if (res.ok) break; // success
            else throw new Error("Chunk failed");
          } catch (err) {
            retries--;
            if (retries === 0) {
              status.innerText = `❌ Failed to upload chunk ${i}`;
              return;
            }
          }
        }

        const percent = Math.floor(((i + 1) / totalChunks) * 100);
        progressFill.style.width = `${percent}%`;
        status.innerText = `Uploading... ${percent}%`;
      }

      // All chunks are up: ask the server to merge them.
      const mergeData = new URLSearchParams();
      mergeData.append("filename", file.name);
      mergeData.append("total_chunks", totalChunks);

      const res = await fetch("http://localhost:8000/merge-chunks/", {
        method: "POST",
        body: mergeData
      });
      const result = await res.json();
      if (res.ok) {
        status.innerText = `✅ ${result.message}`;
        progressFill.style.width = "100%";
      } else {
        status.innerText = `❌ Merge failed: ${result.error || "Unknown error"}`;
      }
    }
  </script>
</body>
</html>
```
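A note on the resume logic: the state lives on the server, not in the browser. The /uploaded-chunks/ endpoint simply lists the .part files sitting in the temp directory, which makes that directory the single source of truth; an upload can resume after a page reload or even from a different machine. localStorage (mentioned in the stack table) would only add a cosmetic client-side progress cache on top of this.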
Step 3: Run It All
Start the backend from the project root (the upload paths in main.py are relative to the working directory):

```bash
uvicorn backend.main:app --reload
```
Then open frontend/index.html in your browser. No separate web server is needed; it's a static file.
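If your browser restricts fetch calls from a file:// page, serve the folder instead with something like `python -m http.server 3000` (run from frontend/) and browse to http://localhost:3000; the CORS middleware above already permits any origin.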
Security & Production Notes
- ✅ Add JWT auth or OAuth2 for protected uploads
- ✅ Sanitize the client-supplied filename (e.g., with os.path.basename) to prevent path traversal
- ✅ Store final files in S3, MinIO, or GCS
- ✅ Add checksum validation for chunk integrity (see the sketch below)
- ✅ Use database tracking for robust session management
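For the checksum item, one possible shape is sketched below. It builds on main.py above and is not a drop-in replacement: the frontend would also have to compute and send the extra checksum field for every chunk (for example with crypto.subtle.digest).

```python
import hashlib

@app.post("/upload-chunk/")
async def upload_chunk(
    file: UploadFile,
    filename: str = Form(...),
    chunk_number: int = Form(...),
    checksum: str = Form(...),  # hex-encoded SHA-256 of the chunk, computed client-side
):
    content = await file.read()
    # Reject corrupted chunks; the client's retry loop will resend them.
    if hashlib.sha256(content).hexdigest() != checksum:
        return JSONResponse(status_code=400, content={"error": f"Checksum mismatch on chunk {chunk_number}"})
    temp_path = os.path.join(TEMP_DIR, f"{filename}.part{chunk_number}")
    async with aiofiles.open(temp_path, "wb") as out_file:
        await out_file.write(content)
    return JSONResponse({"message": f"Chunk {chunk_number} uploaded."})
```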
TL;DR Recap
- ✅ Upload large files in chunks
- ✅ Resume interrupted uploads
- ✅ Retry failed chunks
- ✅ Fully async FastAPI backend
- ✅ Simple vanilla JavaScript frontend