# Optimizing online file uploads with chunking and parallel uploads

Handling file uploads efficiently is a critical aspect of modern web applications. Large files can lead to slow upload times, network interruptions, and poor user experience. In this post, we explore advanced techniques—such as chunking and parallel uploading—for optimizing file uploads, ensuring faster and more reliable performance.

## Introduction to file upload challenges

Uploading large files over the internet poses several challenges. Users may experience slow upload speeds due to bandwidth limitations or network instability, and interruptions often force restarts that lead to frustration. Modern web applications require robust upload systems that can handle these challenges while maintaining a seamless user experience.

## Explaining chunking and why it matters

Chunking involves splitting a large file into smaller pieces, or chunks. This approach offers several advantages:
- Resumable uploads for improved reliability
- Better browser memory management
- Easier progress tracking
- Reduced impact from network interruptions
- More efficient error recovery
The optimal chunk size depends on network conditions and browser limitations. A dynamic approach, as shown below, often works best:

```javascript
const calculateChunkSize = (fileSize) => {
  // Cap chunks at 5 MB; parts in this range work well with most upload
  // backends (S3 multipart, for example, requires parts of at least 5 MB)
  const MAXIMUM_CHUNK_SIZE = 1024 * 1024 * 5
  // Aim for roughly ten chunks per file, but never exceed the cap
  return Math.min(MAXIMUM_CHUNK_SIZE, Math.ceil(fileSize / 10))
}
```

## Setting up a basic file upload with JavaScript

Create a modern file upload interface with progress tracking and cancellation support:
```html
<div id="upload-container">
  <input type="file" id="file-input" multiple />
  <button id="upload-btn">Upload</button>
  <button id="cancel-btn">Cancel</button>
  <div id="progress"></div>
</div>
```

```javascript
class FileUploader {
  constructor() {
    this.abortController = null
    this.setupEventListeners()
  }

  setupEventListeners() {
    const uploadBtn = document.getElementById('upload-btn')
    const cancelBtn = document.getElementById('cancel-btn')
    uploadBtn.addEventListener('click', () => this.handleUpload())
    cancelBtn.addEventListener('click', () => this.cancelUpload())
  }

  async handleUpload() {
    const fileInput = document.getElementById('file-input')
    const files = fileInput.files
    if (files.length === 0) {
      alert('Please select a file.')
      return
    }
    this.abortController = new AbortController()
    for (const file of files) {
      try {
        await this.uploadFile(file)
      } catch (error) {
        if (error.name === 'AbortError') {
          console.log('Upload cancelled')
          break
        }
        console.error(`Error uploading ${file.name}:`, error)
      }
    }
  }

  async uploadFile(file) {
    // Delegates to the ChunkedUploader implemented in the next section
    const uploader = new ChunkedUploader(file)
    await uploader.upload(this.abortController.signal, (progress) =>
      this.updateProgress(file, progress, 100),
    )
  }

  cancelUpload() {
    if (this.abortController) {
      this.abortController.abort()
    }
  }

  updateProgress(file, loaded, total) {
    const progress = document.getElementById('progress')
    const percentage = Math.round((loaded / total) * 100)
    progress.textContent = `${file.name}: ${percentage}%`
  }
}

const uploader = new FileUploader()
```

## Implementing chunked uploads

Below is an implementation of chunked uploads with proper error handling and progress tracking. It uses a configurable concurrency limit and automatic retries to ensure reliability.

```javascript
class ChunkedUploader {
  constructor(file, options = {}) {
    this.file = file
    this.chunkSize = calculateChunkSize(file.size)
    this.totalChunks = Math.ceil(file.size / this.chunkSize)
    this.retryLimit = options.retryLimit || 3
    this.retryDelay = options.retryDelay || 1000
    this.concurrency = options.concurrency || 3
  }

  async uploadChunk(chunk, chunkNumber, signal) {
    const formData = new FormData()
    formData.append('chunk', chunk)
    formData.append('fileName', this.file.name)
    formData.append('chunkNumber', chunkNumber)
    formData.append('totalChunks', this.totalChunks)

    let attempts = 0
    while (attempts < this.retryLimit) {
      try {
        const response = await fetch('/upload-chunk', {
          method: 'POST',
          body: formData,
          signal,
        })
        if (!response.ok) {
          throw new Error(`HTTP error! status: ${response.status}`)
        }
        return await response.json()
      } catch (error) {
        // A cancelled upload should not be retried
        if (error.name === 'AbortError') throw error
        attempts++
        if (attempts === this.retryLimit) throw error
        // Exponential backoff: 1s, 2s, 4s, ... between attempts
        await new Promise((resolve) =>
          setTimeout(resolve, this.retryDelay * 2 ** (attempts - 1)),
        )
      }
    }
  }

  async upload(signal, onProgress) {
    const chunks = []
    for (let i = 0; i < this.totalChunks; i++) {
      const start = i * this.chunkSize
      const end = Math.min(this.file.size, start + this.chunkSize)
      chunks.push({
        data: this.file.slice(start, end),
        number: i,
      })
    }

    let uploadedChunks = 0
    const updateProgress = () => {
      uploadedChunks++
      const progress = (uploadedChunks / this.totalChunks) * 100
      onProgress?.(progress)
    }

    // Upload chunks in batches, with at most `concurrency` requests in flight
    for (let i = 0; i < chunks.length; i += this.concurrency) {
      const batch = chunks.slice(i, i + this.concurrency)
      await Promise.all(
        batch.map((chunk) =>
          this.uploadChunk(chunk.data, chunk.number, signal).then(updateProgress),
        ),
      )
    }

    // Notify the server that all chunks have been uploaded
    await fetch('/complete-upload', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        fileName: this.file.name,
        totalChunks: this.totalChunks,
      }),
      signal,
    })
  }
}
```
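
The client above assumes `/upload-chunk` and `/complete-upload` endpoints, which this post does not otherwise specify. The following is a minimal sketch of what such a server could look like, with Express and multer as illustrative choices rather than requirements:

```javascript
import express from 'express'
import multer from 'multer'
import fs from 'node:fs/promises'
import path from 'node:path'

const app = express()
const upload = multer({ dest: 'tmp/' })
const CHUNK_DIR = 'chunks'
const UPLOAD_DIR = 'uploads'

app.post('/upload-chunk', upload.single('chunk'), async (req, res) => {
  // NOTE: a real server must sanitize fileName; it is used in paths here for brevity
  const { fileName, chunkNumber } = req.body
  const dir = path.join(CHUNK_DIR, fileName)
  await fs.mkdir(dir, { recursive: true })
  // Store each chunk under its index so ordering survives parallel arrival
  await fs.rename(req.file.path, path.join(dir, String(chunkNumber)))
  res.json({ received: Number(chunkNumber) })
})

app.post('/complete-upload', express.json(), async (req, res) => {
  const { fileName, totalChunks } = req.body
  const dir = path.join(CHUNK_DIR, fileName)
  await fs.mkdir(UPLOAD_DIR, { recursive: true })
  // Stitch the chunks back together in order
  const out = await fs.open(path.join(UPLOAD_DIR, fileName), 'w')
  for (let i = 0; i < totalChunks; i++) {
    await out.write(await fs.readFile(path.join(dir, String(i))))
  }
  await out.close()
  await fs.rm(dir, { recursive: true })
  res.json({ done: true })
})

app.listen(3000)
```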

## Parallel uploading: enhancing speed

The batched `Promise.all` loop above uploads several chunks concurrently. Running multiple requests in parallel makes better use of the available bandwidth and shortens the total upload time, while the concurrency cap keeps the server and the browser from being overwhelmed. One limitation of batching is that each batch waits for its slowest chunk before the next batch starts; a sliding-window pool, sketched below, keeps `concurrency` requests in flight continuously instead.
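
The helper below is a minimal sketch of such a pool; `uploadWithPool` and its parameters are illustrative names, not an API from any library:

```javascript
// As soon as one chunk finishes, the next one starts, so up to `concurrency`
// requests stay in flight at all times.
async function uploadWithPool(chunks, concurrency, uploadChunk, onChunkDone) {
  let next = 0
  const worker = async () => {
    while (next < chunks.length) {
      const chunk = chunks[next++] // claim the next chunk index synchronously
      await uploadChunk(chunk.data, chunk.number)
      onChunkDone?.()
    }
  }
  // Start `concurrency` workers that drain the shared queue
  await Promise.all(
    Array.from({ length: Math.min(concurrency, chunks.length) }, worker),
  )
}
```

Inside `ChunkedUploader.upload()`, this would replace the batched loop: `await uploadWithPool(chunks, this.concurrency, (data, n) => this.uploadChunk(data, n, signal), updateProgress)`.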

## Handling errors and retries

Our implementation includes robust error handling through automatic retries, exponential backoff, and graceful cancellation using AbortController. This strategy ensures that transient network issues or server errors do not disrupt the entire upload process. It is advisable to provide clear error messages and differentiate between network failures and application errors, allowing users to retry uploads when necessary.
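
As a concrete illustration, the helper below sketches one way to classify failures before retrying. It keys off the `HTTP error! status: ...` message thrown by `uploadChunk` above, and the policy (retry network failures and 5xx responses, surface everything else) is a common convention, not a rule:

```javascript
const isRetryable = (error) => {
  if (error.name === 'AbortError') return false // the user cancelled
  if (error.name === 'TypeError') return true // fetch signals network failure this way
  const match = /status: (\d+)/.exec(error.message) // thrown by uploadChunk above
  if (match) {
    const status = Number(match[1])
    return status >= 500 // server errors are often transient; client errors are not
  }
  return false
}
```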

## Ensuring security during uploads

Security in file uploads is paramount. The following example extends the chunked upload mechanism with file validation steps. It validates file signatures, ensures allowed MIME types, and enforces size restrictions. Keep in mind that client-side validation improves the user experience but is trivially bypassed, so the server must enforce the same checks. Additionally, consider integrating advanced security measures, such as Content Disarm & Reconstruction (CDR) and antivirus scanning, for production systems.

```javascript
class SecureUploader extends ChunkedUploader {
  async upload(signal, onProgress) {
    // Reject invalid files before any bytes leave the browser
    await this.validateFile()
    return super.upload(signal, onProgress)
  }

  async validateFile() {
    const allowedTypes = ['image/jpeg', 'image/png', 'application/pdf']

    // MIME type validation
    if (!allowedTypes.includes(this.file.type)) {
      throw new Error('Unsupported file type')
    }

    // Size validation
    const maxSize = 100 * 1024 * 1024 // 100MB
    if (this.file.size > maxSize) {
      throw new Error('File too large')
    }

    // File signature (magic bytes) validation
    const header = await this.readFileHeader()
    if (!this.validateFileSignature(header)) {
      throw new Error('Invalid file signature')
    }
  }

  async readFileHeader() {
    // Read the first four bytes of the file
    return new Promise((resolve, reject) => {
      const reader = new FileReader()
      reader.onload = (e) => {
        const arr = new Uint8Array(e.target.result)
        resolve(arr.slice(0, 4))
      }
      reader.onerror = reject
      reader.readAsArrayBuffer(this.file.slice(0, 4))
    })
  }

  validateFileSignature(header) {
    const signatures = {
      'image/jpeg': [0xff, 0xd8, 0xff],
      'image/png': [0x89, 0x50, 0x4e, 0x47],
      'application/pdf': [0x25, 0x50, 0x44, 0x46],
    }
    // Compare the header against the signature for the declared MIME type,
    // so a file cannot claim one type while carrying another type's bytes
    const signature = signatures[this.file.type]
    return Boolean(signature) && signature.every((byte, i) => header[i] === byte)
  }
}
```
Additional security measures include:
- Implementing Content Security Policy (CSP) headers (see the sketch after this list).
- Utilizing Content Disarm & Reconstruction (CDR) and antivirus scanning.
- Enforcing strict validation of file type and size.
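
For the CSP point, the middleware below sets a restrictive policy plus a MIME-sniffing guard. The Express app is an assumption for the sketch; any server framework can set the same headers, and a real policy would be tuned to the application's actual asset origins:

```javascript
import express from 'express'

const app = express()
app.use((req, res, next) => {
  // Restrict where scripts, styles, and other resources may load from
  res.setHeader('Content-Security-Policy', "default-src 'self'")
  // Prevent browsers from MIME-sniffing uploaded content served back to users
  res.setHeader('X-Content-Type-Options', 'nosniff')
  next()
})
```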

## Best practices and optimization tips

- Use Web Workers for intensive file processing tasks.
- Implement client-side file compression where appropriate.
- Cache upload progress (e.g., with localStorage) to facilitate resume capabilities (see the sketch after this list).
- Monitor memory usage during large uploads.
- Provide clear visual feedback on upload status.
- Ensure proper cleanup of failed uploads.
- Leverage modern browser features, such as Service Workers for background uploads and ReadableStream for efficient data handling.
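
For the localStorage point, here is a minimal sketch of persisting per-file chunk progress so an interrupted upload can be resumed. The key scheme and function names are illustrative assumptions, not part of the uploader above:

```javascript
// Key on name, size, and modification time so a changed file starts fresh
const progressKey = (file) => `upload:${file.name}:${file.size}:${file.lastModified}`

// `uploadedChunks` is assumed to be a Set of completed chunk numbers
const saveProgress = (file, uploadedChunks) => {
  localStorage.setItem(progressKey(file), JSON.stringify([...uploadedChunks]))
}

const loadProgress = (file) => {
  const saved = localStorage.getItem(progressKey(file))
  return new Set(saved ? JSON.parse(saved) : [])
}

const clearProgress = (file) => localStorage.removeItem(progressKey(file))
```

On resume, chunks whose numbers appear in the saved set can be skipped, and `clearProgress` would run once `/complete-upload` succeeds.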

## Conclusion: building efficient upload systems

Building a robust file upload system requires careful attention to performance, security, and user experience. The techniques discussed provide a sound foundation for implementing reliable file uploads in modern web applications. By leveraging chunking, parallel uploads, and modern browser APIs, you can build efficient and resilient upload systems. For a production-ready solution that implements these best practices, consider using Uppy 4.0:

```javascript
import { Uppy } from '@uppy/core'
import { Dashboard } from '@uppy/dashboard'
import { XHRUpload } from '@uppy/xhr-upload'
// Remember to also load the Uppy stylesheets, for example
// '@uppy/core/dist/style.min.css' and '@uppy/dashboard/dist/style.min.css'

const uppy = new Uppy()
  .use(Dashboard, {
    inline: true,
    target: '#drag-drop-area',
  })
  .use(XHRUpload, {
    endpoint: '/upload',
    formData: true,
    fieldName: 'file',
  })
```
These modern approaches, along with careful error handling and security validations, will help you create a seamless file upload experience.