
How Did We Accomplish Video Trimming in Vue.js? A Step-by-Step Guide.

Alp Gökçek · Trendyol Tech · Mar 14, 2023 · 11 min read


Video is one of the most engaging and effective ways to communicate and deliver information to an audience. However, sometimes you may need to trim or cut a video to remove unwanted parts, adjust its length, or fit it to a specific format or platform. In this post, I’ll show you how to create a custom video trimmer in Vue.js, a popular JavaScript framework, that allows you to select a portion of a video and export it as a new file. This project is a great example of how to combine canvas rendering, event handling, and reactive data binding to create a complex yet flexible UI component.

The video trimmer we’ll build is a Vue.js component that accepts a list of frames, a video duration, and some configuration options, and renders a slider with two handles that you can drag to select the start and end of the trim range. The slider also includes a time handle that indicates the current playback position of the video, a time ruler with second indicators, and a preview canvas that shows a cropped view of the video within the selected range. Here’s a screenshot of the final result:

Video trim component final result

To get started, we need to create a new Vue.js project and add the necessary dependencies. I assume that you have Node.js and npm installed on your machine. Open your terminal or command prompt and run the following commands:

$ npm install -g @vue/cli
$ vue create video-trimmer

The first command installs the Vue CLI, a command-line interface for creating and managing Vue.js projects. The second command scaffolds a new Vue.js project called video-trimmer; choose the default preset when prompted.

Next, navigate to the project directory and install the following packages:

$ npm install --save video.js videojs-abloop

From here, we have three problems to solve:
1. Building the VideoPlayer component and extracting frames from a video
2. Creating the video trimming component
3. Trimming the video with FFmpeg

1. Building the VideoPlayer component and extracting frames from a video

Extracting frames from a video can be a useful feature for various use cases, such as creating a thumbnail of a video, generating a preview of the video, or simply creating a slide show of the video. In this section, we will be discussing how to extract frames from a video in Vue.js with Video.js.

We will create a new component called VideoPlayer.vue in the src/components directory. This component will handle playing the video and extracting frames from the video.

<template>
  <div>
    <video ref="videoPlayer"></video>
  </div>
</template>

<script>
import videojs from 'video.js';

export default {
  name: 'VideoPlayer',
  props: {
    sources: {
      type: Array,
      default: () => [],
    },
  },
  data () {
    return {
      player: null,
      canvas: null,
      context: null,
      numFrames: 12,
    };
  },
  methods: {
    extractFrames () {
      // Create a detached video element so frame extraction
      // does not interfere with playback in the visible player
      const video = document.createElement('video');
      video.crossOrigin = 'anonymous';
      video.volume = 0;
      video.src = this.sources[0]?.src;
      video.play();

      // Handle the 'durationchange' event
      const handleDurationChange = () => {
        const duration = video.duration;
        const interval = duration / this.numFrames;
        let currentTime = 0;
        const frames = [];

        // Size the canvas so thumbnails keep the video's aspect ratio
        const actualVideoHeight = video.videoHeight;
        const actualVideoWidth = video.videoWidth;
        const thumbnailWidth = 200;
        const thumbnailHeight = (thumbnailWidth / actualVideoWidth) * actualVideoHeight;
        this.canvas.width = thumbnailWidth;
        this.canvas.height = thumbnailHeight;

        // Extract frames from the video
        const extractFrame = () => {
          if (currentTime > duration || frames.length === this.numFrames) {
            // All frames collected; hand them off (our app commits
            // them to the Vuex store, here we simply log them)
            console.log(frames);
            video.currentTime = 0;
            return;
          }
          currentTime += interval;
          video.currentTime = currentTime;
          this.context.drawImage(video, 0, 0, thumbnailWidth, thumbnailHeight);
          const dataUrl = this.canvas.toDataURL();
          frames.push(dataUrl);
          video.requestVideoFrameCallback(extractFrame);
        };
        video.requestVideoFrameCallback(extractFrame);
        video.currentTime = 0;
      };

      video.addEventListener('durationchange', handleDurationChange);
    },
  },
  mounted () {
    this.canvas = document.createElement('canvas');
    this.context = this.canvas.getContext('2d');

    this.player = videojs(this.$refs.videoPlayer, {
      sources: this.sources,
    });
    this.extractFrames();
  },
  beforeDestroy () {
    if (this.player) {
      this.player.dispose();
    }
  },
};
</script>

The extractFrames method is responsible for extracting frames from the video and collecting them as data URLs. This is achieved by creating a new, detached video element, setting its src attribute to the first source in the sources array, and playing it.

Once the durationchange event is fired, the handleDurationChange callback is called. This callback reads the video's duration and calculates the interval between frames. It then sets up a loop that repeatedly advances the video's currentTime property by the interval and extracts a frame by drawing the video onto a canvas and converting it to a data URL.

The loop continues until either the end of the video is reached or the desired number of frames has been extracted. Once the loop ends, the frames are handed off; in our application they are stored in the Vuex store via a setVideoFrames action, while the sample above simply logs them.
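The sampling schedule itself is a simple calculation. Here is a standalone sketch (frameTimestamps is a hypothetical helper, not part of the component) that mirrors how the loop advances currentTime:

```javascript
// Compute the timestamps (in seconds) at which frames are captured,
// mirroring the loop in extractFrames: advance by duration / numFrames
// until numFrames samples have been taken or the end is passed.
function frameTimestamps(duration, numFrames) {
  const interval = duration / numFrames;
  const timestamps = [];
  let currentTime = 0;
  while (timestamps.length < numFrames && currentTime <= duration) {
    currentTime += interval;
    timestamps.push(currentTime);
  }
  return timestamps;
}
```

For a 60-second video and 12 frames, this yields capture points at 5, 10, …, 60 seconds.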

Note that the requestVideoFrameCallback method is used instead of requestAnimationFrame to schedule the next frame extraction. requestAnimationFrame fires on every display refresh whether or not a new video frame has been presented, so it can capture duplicate frames or draw before a seek has completed. requestVideoFrameCallback fires only when a new video frame is available for compositing, so each seek is captured exactly once.
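One caveat: requestVideoFrameCallback is not available in every browser (Firefox, at the time of writing, does not support it). A hedged sketch of a feature check with a fallback to the seeked event, which fires once each currentTime change has completed:

```javascript
// Returns true if the browser implements requestVideoFrameCallback.
function supportsVideoFrameCallback() {
  return typeof HTMLVideoElement !== 'undefined' &&
    'requestVideoFrameCallback' in HTMLVideoElement.prototype;
}

// Schedule `callback` to run once the next video frame (or seek) is ready,
// using the 'seeked' event as a fallback where rVFC is unsupported.
function scheduleFrameExtraction(video, callback) {
  if (supportsVideoFrameCallback()) {
    video.requestVideoFrameCallback(callback);
  } else {
    video.addEventListener('seeked', callback, { once: true });
  }
}
```

The component above assumes requestVideoFrameCallback exists; swapping in scheduleFrameExtraction would let the extraction loop degrade gracefully.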

Finally, the video’s currentTime property is reset to zero to ensure that the video is in its initial state the next time it is played.

Overall, the extractFrames method is a key part of the VideoPlayer component and is responsible for generating the thumbnail images that are displayed in the video scrubber.

2. Creating the video trimming component

Before we dive into the code, let’s first discuss the requirements of our video trimmer component. We’ll need to display a canvas with thumbnails of each frame, and provide handles that users can drag to select a portion of the video. We’ll also need to display a time handle that indicates the current time of the video, and allow users to drag it to change the current time. Finally, we’ll need to validate the selected portion of the video, ensuring that it meets certain requirements such as a minimum and maximum duration.
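The duration validation mentioned above reduces to a small pure check. A sketch (isValidTrim is a hypothetical helper for illustration; the component performs the equivalent check inline):

```javascript
// Returns true when the selected range [start, end] (in seconds)
// satisfies the minimum and maximum trim-duration constraints.
function isValidTrim(start, end, minTrimDuration, maxTrimDuration) {
  const trimDuration = end - start;
  return trimDuration >= minTrimDuration && trimDuration <= maxTrimDuration;
}
```

With the defaults we use later (a minimum of 8 seconds and a maximum of 60), a 5-second selection would be rejected while a 10-second one passes.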

To get started, we’ll create a new Vue.js component called VideoTrimmer. We'll define the template of our component to include a canvas element with ref="sliderCanvas", which we'll use to draw the frames and handles. We'll also attach event listeners to the canvas for mouse down and mouse up events, which we'll use to handle the dragging of the handles.

<template>
  <div style="width: 100%">
    <canvas
      ref="sliderCanvas"
      @mousedown="handleMouseDown"
      @mouseup="handleMouseUp"
      style="user-select: none;"
    />
  </div>
</template>

Next, we’ll define the script and style part of our component. We’ll need to store the positions of the handles and time handle, as well as the canvas width and height. We’ll also need to store the start and end time of the selected portion of the video, which we’ll use for validation. We’ll define computed properties to convert between video duration and canvas width, which will allow us to position the handles and time handle correctly.

<script>

const Handles = Object.freeze({
  START_HANDLE: 'START_HANDLE',
  END_HANDLE: 'END_HANDLE',
  TIME_HANDLE: 'TIME_HANDLE',
});

export default {
  name: 'VideoTrimmer',
  props: {
    frames: {
      type: Array,
      required: true,
    },
    currentTime: {
      type: Number,
      default: 0,
    },
    videoDuration: {
      type: Number,
      required: true,
    },
    defaultTrim: {
      type: Object,
      default: null,
    },
    minTrimDuration: {
      type: Number,
      default: 8,
    },
    maxTrimDuration: {
      type: Number,
      default: 60,
    },
  },
  data () {
    return {
      startHandlePos: 0,
      endHandlePos: 800,
      timeHandlePos: 0,
      canvasWidth: 800,
      canvasHeight: 104,
      frameWidth: 80,
      handleWidth: 14,
      canvasContext: null,
      framesCanvas: null,
      trimStart: 0,
      trimEnd: 0,
      selectedElement: null,
    };
  },
  computed: {
    // Seconds per pixel
    durationPositionRatio () {
      return this.videoDuration / this.canvasWidth;
    },
    // Pixels per second
    positionDurationRatio () {
      return this.canvasWidth / this.videoDuration;
    },
  },
  watch: {
    frames () {
      this.setCanvasWidth();
    },
    currentTime (newValue) {
      this.timeHandlePos = newValue * this.positionDurationRatio;
      this.drawSlider();
    },
    defaultTrim (newValue) {
      const { start, end } = newValue;

      this.trimStart = start;
      this.startHandlePos = start * this.positionDurationRatio;

      this.trimEnd = end;
      this.endHandlePos = end * this.positionDurationRatio;
    },
  },
  async mounted () {
    window.addEventListener('resize', this.onWindowResize);
    window.addEventListener('mousemove', this.handleMouseMove);

    this.timeHandlePos = this.currentTime * this.positionDurationRatio;
    this.setCanvasWidth();
    this.handleDurationChange();

    // Get the canvas element and its context
    this.canvasContext = this.$refs.sliderCanvas.getContext('2d');

    // Set the canvas width and height
    this.$refs.sliderCanvas.width = this.$refs.sliderCanvas?.offsetWidth;
    this.$refs.sliderCanvas.height = this.canvasHeight;

    // Draw the frames
    this.drawFrames().then(() => {
      // Draw the offscreen canvas onto the main canvas
      this.canvasContext.drawImage(this.framesCanvas, 0, 0);

      // Draw the initial slider
      this.drawSlider();
    });
  },

  beforeUnmount () {
    // Remove the window resize and mousemove event listeners
    window.removeEventListener('resize', this.onWindowResize);
    window.removeEventListener('mousemove', this.handleMouseMove);
  },
  methods: {
    async onWindowResize () {
      this.setCanvasWidth();
      this.handleDurationChange();

      // Get the canvas element and its context
      this.canvasContext = this.$refs.sliderCanvas ? this.$refs.sliderCanvas.getContext('2d') : null;

      if (!this.canvasContext) {
        return;
      }

      // Set the canvas width and height
      this.$refs.sliderCanvas.width = this.canvasWidth;
      this.$refs.sliderCanvas.height = this.canvasHeight;

      await this.$nextTick();

      // Calculate the new handle positions based on the current trim duration
      const durationPositionRatio = this.videoDuration / this.canvasWidth;
      this.startHandlePos = this.trimStart / durationPositionRatio;
      this.endHandlePos = this.trimEnd / durationPositionRatio;
      this.timeHandlePos = this.currentTime / durationPositionRatio;

      // Draw the frames
      this.drawFrames().then(() => {
        // Draw the offscreen canvas onto the main canvas
        this.canvasContext.drawImage(this.framesCanvas, 0, 0);

        // Draw the initial slider
        this.drawSlider();
      });
    },
    async drawFrames () {
      // Create an offscreen canvas
      this.framesCanvas = document.createElement('canvas');
      this.framesCanvas.width = this.canvasWidth;
      this.framesCanvas.height = this.canvasHeight;
      const offscreenContext = this.framesCanvas.getContext('2d');

      offscreenContext.fillStyle = '#000000';

      // Load all frames and draw them on the offscreen canvas
      await Promise.all(
        this.frames.map((src) => {
          return new Promise((resolve) => {
            const img = new Image();
            img.src = src;
            img.onload = () => resolve(img);
          });
        })
      ).then((frames) => {
        let x = this.handleWidth;
        frames.forEach((img) => {
          // Scale each frame to the computed frame width,
          // preserving its aspect ratio
          const newWidth = this.frameWidth;
          const newHeight = (this.frameWidth * img.height) / img.width;

          // Get the top-left position of the image
          // in order to center it vertically within framesCanvas
          const y = this.canvasHeight / 2 - newHeight / 2;

          offscreenContext.drawImage(img, x, y, newWidth, newHeight);
          x += this.frameWidth;
        });
      });
    },
    async drawSlider () {
      // Clear the canvas
      this.canvasContext.clearRect(0, 0, this.canvasWidth, this.canvasHeight);

      // Draw the offscreen canvas onto the main canvas
      this.canvasContext.drawImage(this.framesCanvas, 0, 0);

      // Add a white filter over the non-intersecting areas
      this.updateGlobalCompositeOperation();

      // Draw the current-time handle
      this.canvasContext.fillStyle = '#ffffff';
      this.canvasContext.fillRect(
        this.timeHandlePos,
        0,
        this.handleWidth / 2,
        this.framesCanvas.height
      );
      // Shadow offsets are numbers (pixels), not CSS strings
      this.canvasContext.shadowOffsetX = 0;
      this.canvasContext.shadowOffsetY = 4;
      this.canvasContext.shadowBlur = 4;
      this.canvasContext.shadowColor = 'rgba(0, 0, 0, 0.25)';

      // Draw the start handle
      this.canvasContext.fillStyle = '#ffffff';
      this.canvasContext.fillRect(
        this.startHandlePos,
        0,
        this.handleWidth,
        this.framesCanvas.height
      );
      // Set the font and alignment for the handle symbols
      this.canvasContext.font = 'bold 12px Rubik';
      this.canvasContext.textAlign = 'center';

      // Draw the '<' symbol in the middle of the rectangle
      this.canvasContext.fillStyle = '#273142';
      this.canvasContext.fillText(
        '<',
        this.startHandlePos + this.handleWidth / 2,
        this.framesCanvas.height / 2 + 8
      );

      // Draw the end handle
      this.canvasContext.fillStyle = '#ffffff';
      this.canvasContext.fillRect(
        this.endHandlePos - this.handleWidth,
        0,
        this.handleWidth,
        this.framesCanvas.height
      );

      // Draw the '>' symbol in the middle of the rectangle
      this.canvasContext.fillStyle = '#273142';
      this.canvasContext.fillText(
        '>',
        this.endHandlePos - this.handleWidth / 2,
        this.framesCanvas.height / 2 + 8
      );
    },
    updateHandles (mouseX) {
      // Move the selected handle to the mouse position
      if (this.selectedElement === Handles.START_HANDLE) {
        const currentVideoDuration =
          this.durationPositionRatio * (this.endHandlePos - mouseX);
        if (
          currentVideoDuration < this.minTrimDuration ||
          currentVideoDuration > this.maxTrimDuration
        ) {
          return;
        }
        this.startHandlePos = mouseX;
        this.trimStart = Math.max(this.durationPositionRatio * mouseX, 0);
        this.$emit('trim-start', this.trimStart);
      } else if (this.selectedElement === Handles.END_HANDLE) {
        const currentVideoDuration =
          this.durationPositionRatio * (mouseX - this.startHandlePos);
        if (
          currentVideoDuration < this.minTrimDuration ||
          currentVideoDuration > this.maxTrimDuration
        ) {
          return;
        }
        this.endHandlePos = mouseX;
        this.trimEnd = Math.min(this.durationPositionRatio * mouseX, this.videoDuration);
        this.$emit('trim-end', this.trimEnd);
      } else if (this.selectedElement === Handles.TIME_HANDLE) {
        this.timeHandlePos = mouseX;
        const currentTime = this.durationPositionRatio * mouseX;
        this.$emit('current-time', currentTime);
      }

      // Make sure the handles stay within the slider track
      if (this.startHandlePos < 0) {
        this.startHandlePos = 0;
      }
      if (this.endHandlePos > this.canvasWidth) {
        this.endHandlePos = this.canvasWidth;
      }
      if (this.startHandlePos > this.endHandlePos) {
        this.startHandlePos = this.endHandlePos;
      }

      // Redraw the slider on the next animation frame
      requestAnimationFrame(() => {
        this.drawSlider();
      });
    },
    handleMouseDown (event) {
      const mouseX = event.offsetX;

      // Calculate the distance between the mouse and each handle
      const distStartHandlePos = Math.abs(mouseX - this.startHandlePos);
      const distEndHandlePos = Math.abs(mouseX - this.endHandlePos);
      const distTimeHandlePos = Math.abs(mouseX - this.timeHandlePos);

      // Select the handle closest to the mouse
      if (distStartHandlePos < this.handleWidth * 2) {
        this.selectedElement = Handles.START_HANDLE;
      } else if (distEndHandlePos < this.handleWidth * 2) {
        this.selectedElement = Handles.END_HANDLE;
      } else if (distTimeHandlePos < this.handleWidth * 2) {
        this.selectedElement = Handles.TIME_HANDLE;
      }
      this.updateHandles(mouseX);
    },
    handleMouseMove (event) {
      // Only drag while the left mouse button is held down
      if (this.selectedElement && event.buttons === 1) {
        this.updateHandles(event.offsetX);
      }
    },
    handleMouseUp () {
      this.selectedElement = null;
    },
    setCanvasWidth () {
      this.canvasWidth = this.$refs.sliderCanvas?.offsetWidth;
      this.frameWidth = (this.canvasWidth - this.handleWidth * 2) / this.frames.length;
    },
    updateGlobalCompositeOperation () {
      // Draw a translucent white filter over the areas outside the trim range
      this.canvasContext.globalCompositeOperation = 'source-over';
      this.canvasContext.fillStyle = 'rgba(255, 255, 255, 0.8)';
      this.canvasContext.fillRect(
        0,
        0,
        this.startHandlePos + this.handleWidth,
        this.framesCanvas.height
      );
      this.canvasContext.fillRect(
        this.endHandlePos,
        0,
        this.canvasWidth - this.endHandlePos,
        this.framesCanvas.height
      );
    },
    handleDurationChange () {
      if (this.videoDuration > this.maxTrimDuration) {
        this.endHandlePos = this.maxTrimDuration * this.positionDurationRatio;
        this.trimEnd = this.maxTrimDuration;
        this.$emit('trim-end', this.trimEnd);
      } else {
        this.endHandlePos = this.canvasWidth;
        this.trimEnd = this.defaultTrim?.end || this.videoDuration;
        this.$emit('trim-end', this.trimEnd);
      }
    },
  },
};
</script>

<style scoped lang="scss">
canvas {
  border: 4px solid #ffffff;
  border-left: 1px solid #fff;
  border-right: 1px solid #fff;
  border-radius: 8px;
  filter: drop-shadow(0px 4px 4px rgba(0, 0, 0, 0.25));
  background: black;
  width: 100%;
  margin-bottom: 4px;
}
.second-indicator {
  width: 4px;
  height: 4px;
  border-radius: 50%;
  &--bold {
    background: #6E7787;
  }
  &--light {
    background: #AFBBCA;
  }
}
</style>

The handleMouseDown and handleMouseUp methods are responsible for handling the user's interactions with the slider handles. When the user presses the mouse button down on a handle, handleMouseDown is called and records the grabbed handle in the selectedElement variable. When the user releases the mouse button, handleMouseUp is called and resets selectedElement to null.

The handleMouseMove method is responsible for updating the position of the slider handles when the user is dragging them. If the selectedElement variable is set to a handle, we calculate the new position of the handle based on the user's mouse movement and update the corresponding *HandlePos variable. We then redraw the slider to reflect the new handle positions.

Finally, the onWindowResize method is called whenever the browser window is resized. This method recalculates the size of the canvas and updates the handle positions accordingly.

All of these methods work together to create a dynamic and interactive video trimmer component.
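Underneath all of these handlers is one linear mapping between pixels on the canvas and seconds of video. A minimal standalone sketch of that math (hypothetical helpers, mirroring the component's durationPositionRatio and clamping logic):

```javascript
// Convert a handle's x position (pixels) to video time (seconds).
// Equivalent to multiplying by durationPositionRatio = videoDuration / canvasWidth.
function positionToTime(x, videoDuration, canvasWidth) {
  return (x * videoDuration) / canvasWidth;
}

// Convert a video time (seconds) to an x position (pixels).
// Equivalent to multiplying by positionDurationRatio = canvasWidth / videoDuration.
function timeToPosition(t, videoDuration, canvasWidth) {
  return (t * canvasWidth) / videoDuration;
}

// Clamp a handle position to the slider track, as updateHandles does.
function clampPosition(x, canvasWidth) {
  return Math.min(Math.max(x, 0), canvasWidth);
}
```

On an 800-pixel canvas showing a 60-second video, a handle at x = 400 corresponds to the 30-second mark, and vice versa.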

3. Trimming the video with FFmpeg

Although it is possible to run FFmpeg on the client side (for example, as a WebAssembly build) to perform video trimming, we decided to handle it on the backend for several reasons. One of the main reasons is cost: trimming in the browser is CPU- and memory-intensive, performance varies widely across users' devices, and large videos make the problem worse. An in-browser FFmpeg build is also a sizeable download, which is painful on slow connections. By handling video trimming on the backend, we were able to simplify the process for users and ensure a faster, more reliable trimming experience.

Furthermore, we created an AWS Lambda function that uses FFmpeg to optimize the video's size and dimensions and to apply the edit parameters, including the trim range. The trimming itself boils down to passing -ss (start) and -to (end) options to FFmpeg:

// FFMPEG_BIN, rawVideoFilePath, optimizedVideoFilePath, workdir, msToHMS and
// childProcessPromise are defined elsewhere in the Lambda function.
const optimizeVideoWithVideoEditParameters = async function (videoEdit) {
  const videoEditParams = [];

  if (videoEdit?.trim && typeof videoEdit?.trim?.startTimeMillis === 'number' && typeof videoEdit?.trim?.endTimeMillis === 'number') {
    videoEditParams.push('-ss', msToHMS(videoEdit.trim.startTimeMillis), '-to', msToHMS(videoEdit.trim.endTimeMillis));
  }

  // -ss and -to are seeking options, not filters, so they are passed
  // directly rather than through -vf
  await childProcessPromise.spawn(
    FFMPEG_BIN,
    ['-loglevel', 'error', '-y', '-i', rawVideoFilePath, ...videoEditParams, '-vcodec', 'libx264', optimizedVideoFilePath],
    {
      env: process.env,
      cwd: workdir,
    }
  );
};
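The msToHMS helper is referenced but not shown; here is a minimal sketch of one possible implementation, converting milliseconds to FFmpeg's HH:MM:SS.mmm time syntax:

```javascript
// Convert milliseconds to an FFmpeg-compatible HH:MM:SS.mmm timestamp.
function msToHMS(millis) {
  const hours = Math.floor(millis / 3600000);
  const minutes = Math.floor((millis % 3600000) / 60000);
  const seconds = Math.floor((millis % 60000) / 1000);
  const ms = Math.floor(millis % 1000);
  const pad = (n, width = 2) => String(n).padStart(width, '0');
  return `${pad(hours)}:${pad(minutes)}:${pad(seconds)}.${pad(ms, 3)}`;
}
```

For example, msToHMS(5000) yields "00:00:05.000", which FFmpeg accepts for both -ss and -to.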

In conclusion, the VideoTrimmer component is a powerful and versatile tool for editing videos. By leveraging the power of HTML5 canvas and Vue.js, we can create a dynamic and intuitive user interface that allows users to easily select and trim video clips. Whether you're building a video editing app, a video-sharing platform, or anything in between, the VideoTrimmer component is a must-have feature.
