Uploading Large Videos with Node.js: A Comprehensive Guide

Suneel Kumar
11 min read · Jul 7, 2024

In today’s digital age, video content has become an integral part of our online experience. From social media platforms to educational websites, videos are everywhere. However, handling large video files can be challenging, especially when it comes to uploading them to a server. In this article, we’ll explore how to implement a robust system for uploading large video files using Node.js, Express, and modern web technologies.

Introduction

Uploading large files, particularly videos, presents unique challenges in web development. Traditional methods of file uploads often fail when dealing with files that exceed certain size limits. This can lead to timeouts, memory issues, and a poor user experience. In this article, we’ll dive deep into a solution that allows for efficient and reliable uploads of large video files using Node.js on the server-side and modern JavaScript on the client-side.

The Challenge of Large File Uploads

Before we delve into the solution, it’s important to understand why uploading large files is challenging:

  1. Server Limitations: Many servers have limits on the maximum size of incoming requests, which can be easily exceeded by large video files.
  2. Timeout Issues: Large uploads take time, and both client and server connections may time out during extended upload processes.
  3. Memory Constraints: Loading an entire large file into memory on the server can quickly exhaust available resources, especially on servers handling multiple concurrent uploads.
  4. Network Instability: Longer upload times increase the risk of network interruptions, potentially causing failed uploads and frustration for users.
  5. User Experience: Without proper feedback, users may be left wondering about the status of their upload, leading to a poor user experience.

To address these challenges, we’ll implement a chunked upload system, which breaks large files into smaller, manageable pieces. For example, a 95MB video split into 10MB chunks becomes ten separate requests, each comfortably below typical request-size limits.

Our Approach: Chunked File Uploads

The core idea behind chunked file uploads is to split a large file into smaller chunks on the client-side, send these chunks to the server individually, and then reassemble them on the server. This approach offers several advantages:

  1. Bypass Size Limitations: By sending smaller chunks, we can work around server limitations on request sizes.
  2. Improved Reliability: If an upload fails, only the current chunk needs to be resent, not the entire file.
  3. Better User Experience: We can provide more accurate progress information to the user.
  4. Reduced Memory Usage: The server only needs to handle one chunk at a time, significantly reducing memory requirements.
  5. Resume Capability: Although not implemented in our current solution, chunked uploads make it easier to implement pause and resume functionality; a sketch of one possible approach follows this list.
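
To illustrate point 5, here is a minimal sketch of how resume support could be bolted on. It assumes a hypothetical GET /api/upload/status endpoint that reports which chunk numbers the server already holds; that endpoint is not part of the implementation in this article:

// Hypothetical resume check. The /api/upload/status endpoint is an
// assumption; the server built in this article does not implement it.
async function getUploadedChunks(fileName) {
  const response = await fetch(
    `/api/upload/status?file=${encodeURIComponent(fileName)}`
  );
  if (!response.ok) return new Set(); // No record yet: start from scratch
  const { chunks } = await response.json(); // e.g. { chunks: [0, 1, 2] }
  return new Set(chunks);
}

// The client's upload loop could then skip chunks the server already has:
// if (uploadedChunks.has(chunkNumber)) continue;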

Now, let’s dive into the implementation details, starting with the backend.

Backend Implementation with Node.js and Express

Our backend solution uses Node.js with the Express framework, along with several key libraries to handle file uploads efficiently.

Setting Up the Server

First, let’s look at the core setup of our Express server:

import express from 'express';
import morgan from 'morgan';
import cors from 'cors';
import multer from 'multer';
import path from 'path';
import { fileURLToPath } from 'url';
import videoRouter from './routes/video.js';

// __dirname is not defined in ES modules, so derive it from import.meta.url
const __dirname = path.dirname(fileURLToPath(import.meta.url));

const app = express();

app.use(express.json({ limit: '500MB' }));
app.use(cors()); // Enable CORS for all routes (cors() sets the Access-Control headers)
app.use(morgan('dev'));
app.use(express.static('public'));

// Route for the home page
app.get('/', (req, res) => {
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

app.use('/api', videoRouter);

// Error-handling middleware must be registered after the routes,
// or Express will never reach it
app.use((err, req, res, next) => {
  if (err instanceof multer.MulterError) {
    if (err.code === 'LIMIT_FILE_SIZE') {
      return res.status(400).send('File size limit exceeded (max 500MB).');
    }
  }
  res.status(500).send(err.message);
});

const port = process.env.PORT || 3000;
app.listen(port, () => {
  console.log(`app running on port: ${port}`);
});

video.js file:

import express from 'express';
import multer from 'multer';
import fs from 'fs-extra';
import path from 'path';

const router = express.Router();
const uploadPath = path.join(process.cwd(), 'uploads');
const uploadPathChunks = path.join(process.cwd(), 'chunks');

// Ensure the upload directories exist
await fs.mkdir(uploadPath, { recursive: true });
await fs.mkdir(uploadPathChunks, { recursive: true });

Here, we’re importing necessary modules and setting up our file paths. We use fs-extra, an enhanced file system module, to create our upload directories if they don't exist. We define two paths:

  • uploadPath: The final destination for our merged video files.
  • uploadPathChunks: A temporary storage location for individual chunks.

Configuring Multer for File Uploads

Next, we configure Multer, a middleware for handling multipart/form-data, which is primarily used for file uploads:

video.js file:

const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, uploadPathChunks);
  },
  filename: (req, file, cb) => {
    const baseFileName = file.originalname.replace(/\s+/g, '');

    fs.readdir(uploadPathChunks, (err, files) => {
      if (err) {
        return cb(err);
      }

      // Filter files that match the base filename
      const matchingFiles = files.filter((f) => f.startsWith(baseFileName));

      let chunkNumber = 0;
      if (matchingFiles.length > 0) {
        // Extract the highest chunk number
        const highestChunk = Math.max(
          ...matchingFiles.map((f) => {
            const match = f.match(/\.part_(\d+)$/);
            return match ? parseInt(match[1], 10) : -1;
          })
        );
        chunkNumber = highestChunk + 1;
      }

      const fileName = `${baseFileName}.part_${chunkNumber}`;
      cb(null, fileName);
    });
  },
});

const upload = multer({
  storage: storage,
  limits: { fileSize: 500 * 1024 * 1024 }, // 500MB limit
  fileFilter: (req, file, cb) => {
    if (
      file.mimetype.startsWith('video/') ||
      file.mimetype === 'application/octet-stream'
    ) {
      cb(null, true);
    } else {
      cb(new Error('Not a video file. Please upload only videos.'));
    }
  },
});

This configuration does several important things:

  1. It sets the destination for uploaded chunks to our uploadPathChunks directory.
  2. It generates unique filenames for each chunk, appending a part number to avoid conflicts.
  3. It sets a file size limit of 500MB per chunk.
  4. It includes a file filter so that only video files are accepted (along with application/octet-stream, the generic MIME type browsers assign to sliced Blob chunks).

Handling File Chunks

Now, let’s look at how we handle the upload of individual chunks:

router.post('/upload', upload.single('video'), async (req, res) => {
  if (!req.file) {
    return res.status(400).json({ error: 'No video file uploaded.' });
  }

  try {
    const chunkNumber = Number(req.body.chunk);
    const totalChunks = Number(req.body.totalChunks);
    const fileName = req.body.originalname.replace(/\s+/g, '');

    // The client sends chunks sequentially, so receiving the last
    // chunk means all previous chunks are already on disk
    if (chunkNumber === totalChunks - 1) {
      await mergeChunks(fileName, totalChunks);
    }

    const fileInfo = {
      filename: fileName,
      originalName: req.body.originalname,
      size: req.file.size,
      mimetype: req.file.mimetype,
      baseURL: 'https://xyz.com/dist/video/',
      videoUrl: `https://xyz.com/dist/video/${fileName}`,
    };

    res.status(200).json({
      message: 'Chunk uploaded successfully',
      file: fileInfo,
    });
  } catch (error) {
    console.error('Error during file upload:', error);
    res
      .status(500)
      .json({ error: 'An error occurred while uploading the video.' });
  }
});

This route handler does the following:

  1. It uses Multer to process the uploaded chunk.
  2. It checks if we’ve received the last chunk (when chunkNumber === totalChunks - 1).
  3. If it’s the last chunk, it triggers the mergeChunks function to combine all chunks into the final video file.
  4. It sends a response with information about the uploaded file.

Merging File Chunks

The mergeChunks function reassembles the uploaded chunks into the final video file:

const MAX_RETRIES = 5;
const RETRY_DELAY = 1000; // 1 second
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function mergeChunks(fileName, totalChunks) {
  const writeStream = fs.createWriteStream(path.join(uploadPath, fileName));

  for (let i = 0; i < totalChunks; i++) {
    const chunkPath = path.join(uploadPathChunks, `${fileName}.part_${i}`);
    let retries = 0;

    while (retries < MAX_RETRIES) {
      try {
        const chunkStream = fs.createReadStream(chunkPath);
        await new Promise((resolve, reject) => {
          // end: false keeps the write stream open for the next chunk
          chunkStream.pipe(writeStream, { end: false });
          chunkStream.on('end', resolve);
          chunkStream.on('error', reject);
        });
        console.log(`Chunk ${i} merged successfully`);
        // fs-extra methods return promises when no callback is passed
        await fs.unlink(chunkPath);
        console.log(`Chunk ${i} deleted successfully`);
        break; // Success, move to next chunk
      } catch (error) {
        if (error.code === 'EBUSY') {
          console.log(
            `Chunk ${i} is busy, retrying... (${retries + 1}/${MAX_RETRIES})`
          );
          await delay(RETRY_DELAY);
          retries++;
        } else {
          throw error; // Unexpected error, rethrow
        }
      }
    }

    if (retries === MAX_RETRIES) {
      console.error(`Failed to merge chunk ${i} after ${MAX_RETRIES} retries`);
      writeStream.end();
      throw new Error(`Failed to merge chunk ${i}`);
    }
  }

  writeStream.end();
  console.log('Chunks merged successfully');
}

This function does several important things:

  1. It creates a write stream for the final video file.
  2. It iterates through all chunks, reading each one and appending it to the final file.
  3. It implements a retry mechanism to handle potential file system busy errors.
  4. After successfully merging a chunk, it deletes the chunk file to free up space.
  5. If all chunks are merged successfully, it closes the write stream, completing the file.

Error Handling and Cleanup

To ensure our server remains stable and doesn’t accumulate unnecessary files, we implement error handling and cleanup:

router.use((err, req, res, next) => {
  if (err instanceof multer.MulterError) {
    console.log('Multer error:', err.message);
    return res.status(400).json({ error: err.message });
  }
  if (err) {
    // Clean up any partial chunks so failed uploads don't accumulate
    fs.readdir(uploadPathChunks, (readErr, files) => {
      if (readErr) {
        return console.error('Unable to scan directory: ' + readErr);
      }

      // Iterate over the files and delete each one
      files.forEach((file) => {
        const filePath = path.join(uploadPathChunks, file);

        // fs.unlink takes a callback; fs.promises.unlink does not
        fs.unlink(filePath, (unlinkErr) => {
          if (unlinkErr) {
            console.error('Error deleting file:', filePath, unlinkErr);
          } else {
            console.log('Successfully deleted file:', filePath);
          }
        });
      });
    });
    console.log('General error:', err.message);
    return res.status(500).json({ error: err.message });
  }
  next();
});

This error handler does a few key things:

  1. It differentiates between Multer-specific errors and other types of errors.
  2. In case of a general error, it attempts to clean up any partial uploads by deleting files in the chunks directory.
  3. It sends appropriate error responses to the client.
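
One gap worth noting: this cleanup only runs when an error actually reaches the router, so chunks from uploads that simply stall are never removed. A minimal sketch of a complementary periodic sweep, reusing video.js’s fs-extra and path imports (the one-hour interval and 24-hour threshold are arbitrary choices, not from the original code):

// Hypothetical periodic sweep for stale chunks; not part of the
// original implementation. fs-extra methods return promises here.
const STALE_AGE_MS = 24 * 60 * 60 * 1000; // Treat chunks older than 24h as stale

setInterval(async () => {
  try {
    const files = await fs.readdir(uploadPathChunks);
    for (const file of files) {
      const filePath = path.join(uploadPathChunks, file);
      const stats = await fs.stat(filePath);
      if (Date.now() - stats.mtimeMs > STALE_AGE_MS) {
        await fs.unlink(filePath);
        console.log('Removed stale chunk:', filePath);
      }
    }
  } catch (error) {
    console.error('Stale chunk cleanup failed:', error);
  }
}, 60 * 60 * 1000); // Run once an hour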

With our backend implementation complete, let’s move on to the frontend.

Frontend Implementation

The frontend of our large video upload system consists of HTML for structure, CSS for styling, and JavaScript for handling the chunked upload process.

HTML Structure

Here’s the basic HTML structure for our upload form:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Large Video Upload</title>
    <!-- CSS styles will go here -->
  </head>
  <body>
    <form id="upload-form">
      <input type="file" name="video" accept="video/*" required />
      <button type="submit">Upload Video</button>
      <div id="progress-container">
        <div id="progress-bar">
          <div id="progress"></div>
        </div>
        <div id="status"></div>
      </div>
    </form>

    <!-- JavaScript will go here -->
  </body>
</html>

This structure provides a simple form with a file input, a submit button, and elements for displaying upload progress.

Styling the Upload Form

To make our upload form more visually appealing and user-friendly, we add some CSS:

<style>
  body {
    font-family: Arial, sans-serif;
    display: flex;
    justify-content: center;
    align-items: center;
    height: 100vh;
    margin: 0;
    background-color: #f0f0f0;
  }
  form {
    background-color: white;
    padding: 2rem;
    border-radius: 8px;
    box-shadow: 0 0 10px rgba(0, 0, 0, 0.1);
    width: 300px;
  }
  input[type='file'] {
    margin-bottom: 1rem;
    width: 100%;
  }
  button {
    background-color: #4caf50;
    color: white;
    padding: 0.5rem 1rem;
    border: none;
    border-radius: 4px;
    cursor: pointer;
    width: 100%;
  }
  button:hover {
    background-color: #45a049;
  }
  #progress-container {
    margin-top: 1rem;
    display: none;
  }
  #progress-bar {
    width: 100%;
    height: 20px;
    background-color: #f0f0f0;
    border-radius: 10px;
    overflow: hidden;
  }
  #progress {
    width: 0;
    height: 100%;
    background-color: #4caf50;
    transition: width 0.3s ease;
  }
  #status {
    margin-top: 0.5rem;
    text-align: center;
  }
</style>

This CSS creates a centered, card-like form with a styled file input, submit button, and progress bar.

JavaScript for Chunked Uploads

Now, let’s implement the JavaScript that handles the chunked upload process:

<script>
  const form = document.getElementById('upload-form');
  const progressContainer = document.getElementById('progress-container');
  const progressBar = document.getElementById('progress');
  const statusElement = document.getElementById('status');

  form.addEventListener('submit', async (e) => {
    e.preventDefault();
    const formData = new FormData(form);
    const file = formData.get('video');

    if (!file) {
      alert('Please select a video file');
      return;
    }

    if (file.size > 500 * 1024 * 1024) { // 500MB in bytes
      alert('The file size exceeds 500MB. Please upload a smaller file.');
      return;
    }

    progressContainer.style.display = 'block';
    statusElement.textContent = 'Uploading...';

    try {
      await uploadFileInChunks(file);
      statusElement.textContent = 'Upload successful!';
    } catch (error) {
      console.error('Error:', error);
      statusElement.textContent = 'Upload failed. Please try again.';
    }
  });

  async function uploadFileInChunks(file) {
    const chunkSize = 10 * 1024 * 1024; // 10MB chunks
    const chunks = Math.ceil(file.size / chunkSize);

    for (let start = 0; start < file.size; start += chunkSize) {
      const chunk = file.slice(start, start + chunkSize);
      const formData = new FormData();
      formData.append('video', chunk, file.name);
      formData.append('chunk', Math.floor(start / chunkSize));
      formData.append('totalChunks', chunks);
      formData.append('originalname', file.name);

      const response = await fetch('http://localhost:3000/api/upload', {
        method: 'POST',
        body: formData,
      });

      if (!response.ok) {
        throw new Error('Chunk upload failed');
      }

      const progress = ((start + chunk.size) / file.size) * 100;
      progressBar.style.width = `${progress}%`;
      statusElement.textContent = `Uploading... ${Math.round(progress)}%`;
    }
  }
</script>

Let’s break down this JavaScript implementation:

  1. Form Submission Handler:
  • We prevent the default form submission.
  • We check if a file is selected and if it’s within the size limit (500MB).
  • We display the progress container and initiate the upload process.

  2. Chunked Upload Function (uploadFileInChunks):
  • We define a chunk size of 10MB.
  • We calculate the total number of chunks based on the file size.
  • We iterate through the file, slicing it into chunks.
  • For each chunk, we create a new FormData object and append necessary information:
  • The chunk itself
  • The chunk number
  • The total number of chunks
  • The original filename
  • We send each chunk to the server using a fetch request.
  • We update the progress bar and status message after each successful chunk upload.

  3. Error Handling:
  • If any chunk fails to upload, we throw an error.
  • The error is caught in the submission handler, which updates the status message accordingly.

  4. Progress Visualization:
  • We use a simple progress bar to show the upload progress.
  • The status message is updated with the current percentage of the upload completed.

This implementation provides a robust client-side solution for handling large video uploads. It breaks the file into manageable chunks, sends them sequentially to the server, and provides real-time feedback to the user about the upload progress.
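
One hardening step worth considering (not part of the original code): because only the current chunk is lost on failure, each chunk can be retried a few times before the whole upload is abandoned. A minimal sketch that wraps the fetch call:

// Hypothetical per-chunk retry wrapper; an extension of the client code
// above, not part of the original implementation.
async function uploadChunkWithRetry(formData, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      const response = await fetch('http://localhost:3000/api/upload', {
        method: 'POST',
        body: formData,
      });
      if (response.ok) return response;
      console.warn(`Upload failed with ${response.status} (attempt ${attempt}/${maxRetries})`);
    } catch (networkError) {
      console.warn(`Network error (attempt ${attempt}/${maxRetries})`, networkError);
    }
    // Simple linear backoff before the next attempt
    await new Promise((resolve) => setTimeout(resolve, 1000 * attempt));
  }
  throw new Error('Chunk upload failed after all retries');
}

Inside uploadFileInChunks, the fetch call would then become const response = await uploadChunkWithRetry(formData);, and the !response.ok check can be dropped, since the wrapper throws on failure.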

Putting It All Together

Now that we’ve gone through both the backend and frontend implementations, let’s recap how the entire system works together:

  1. The user selects a video file using the HTML form.
  2. When the form is submitted, the JavaScript code checks the file size and initiates the chunked upload process.
  3. The file is divided into 10MB chunks on the client-side.
  4. Each chunk is sent to the server as a separate HTTP request, along with metadata about the chunk and the overall file.
  5. The server (Node.js with Express) receives each chunk and saves it to a temporary directory.
  6. When all chunks are received, the server merges them into a single file.
  7. Throughout the process, the client-side JavaScript updates the progress bar and status message.
  8. If the upload is successful, the user is notified. If there’s an error, an error message is displayed.

This system allows for efficient uploading of large video files, overcoming many of the limitations associated with traditional file uploads.
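
To try the system end to end, start the server and open the page in a browser. The chunk endpoint can also be exercised directly with a small Node script (Node 18+, which ships global fetch, FormData, and Blob; sample.mp4 is a placeholder file name). Sending a small file as chunk 0 of 1 both uploads it and triggers the merge:

// test-upload.js; a hypothetical smoke test, not part of the article's code.
// Requires Node 18+ for the built-in fetch, FormData, and Blob globals.
import { readFile } from 'fs/promises';

const fileName = 'sample.mp4'; // Placeholder: any small video file
const data = await readFile(fileName);

const formData = new FormData();
formData.append('video', new Blob([data], { type: 'video/mp4' }), fileName);
formData.append('chunk', '0'); // First and only chunk...
formData.append('totalChunks', '1'); // ...so the server merges immediately
formData.append('originalname', fileName);

const response = await fetch('http://localhost:3000/api/upload', {
  method: 'POST',
  body: formData,
});

console.log(response.status, await response.json());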

Best Practices and Considerations

While our implementation provides a solid foundation for handling large video uploads, there are several best practices and additional considerations to keep in mind:

  1. Security:
  • Implement user authentication to ensure only authorized users can upload files.
  • Use HTTPS to encrypt data in transit.
  • Validate and sanitize all incoming data on the server-side.

  2. File Validation:
  • Implement more robust file type checking on the server-side, possibly using libraries like file-type (a sketch follows this list).
  • Consider scanning uploaded files for viruses or malware.

  3. Error Handling and Recovery:
  • Implement a more sophisticated error handling system that can recover from network interruptions.
  • Consider adding the ability to resume interrupted uploads.

  4. Scalability:
  • For high-traffic applications, consider using a cloud storage solution like Amazon S3 for file storage.
  • Implement a queue system for processing uploads to handle high concurrent upload scenarios.

  5. User Experience:
  • Add the ability to cancel ongoing uploads.
  • Implement a drag-and-drop interface for file selection.
  • Provide more detailed feedback about upload speed and estimated time remaining.

  6. Backend Processing:
  • Implement a system for processing uploaded videos (e.g., transcoding, thumbnail generation) using a background job queue.

  7. Cleanup:
  • Implement a robust system for cleaning up temporary files and failed uploads.

  8. Testing:
  • Thoroughly test the system with various file sizes, network conditions, and concurrent uploads.

  9. Monitoring and Logging:
  • Implement comprehensive logging to track uploads and help diagnose issues.
  • Set up monitoring to alert on failed uploads or system issues.
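
Several of these points lend themselves to small, concrete additions. For instance, the MIME type supplied by the client is trivially spoofed, so the magic-byte checking suggested in point 2 matters. Here is a sketch using the file-type package, assuming video.js’s fs-extra and path imports; the package is real, but this integration is an assumption rather than part of the article’s server:

// Hypothetical post-merge validation with the file-type package.
// It inspects the merged file's magic bytes, not the client's MIME type.
import { fileTypeFromFile } from 'file-type';

async function validateMergedVideo(filePath) {
  const type = await fileTypeFromFile(filePath); // undefined if unrecognized
  if (!type || !type.mime.startsWith('video/')) {
    await fs.unlink(filePath); // Discard anything that isn't really a video
    throw new Error('Uploaded file is not a valid video.');
  }
  return type; // e.g. { ext: 'mp4', mime: 'video/mp4' }
}

// In the upload route, after await mergeChunks(fileName, totalChunks):
// await validateMergedVideo(path.join(uploadPath, fileName));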

Conclusion

Handling large video uploads is a complex but essential task in many modern web applications. By implementing a chunked upload system using Node.js and modern JavaScript, we can overcome many of the traditional challenges associated with large file uploads.

Our solution provides a robust framework for handling large video files efficiently, with real-time progress feedback for users. It demonstrates how to split files on the client-side, handle individual chunks on the server, and reassemble them into complete files.

While this implementation serves as a solid starting point, remember that each application may have unique requirements. Always consider factors such as security, scalability, and user experience when implementing file upload systems in production environments.

By following the approach outlined in this article and considering the best practices we’ve discussed, you’ll be well-equipped to handle large video uploads in your Node.js applications. Happy coding!
