Create an Agora Video Track using a Canvas Element

Hermes · Published in Agora.io · Jun 10, 2024

In today’s multimedia-driven web, integrating live audio and video streaming into web applications is increasingly commonplace. However, developers often have to go beyond standard video integrations and use the <canvas> to augment the existing video stream.

Using the Agora Video SDK’s convenient APIs, developers can easily initialize video tracks using the device’s camera and publish these tracks into a live video stream, but there are use cases where developers need to modify the video before it is published.

In this guide, we’ll dive into how to use a capture stream from a <canvas> element to create a video track supported by the Agora Web SDK.

Prerequisites

Before starting, ensure you have the following:

  • A basic understanding of HTML, CSS, and JavaScript.
  • A code editor — I like to use VSCode.
  • A developer account with Agora.io.

Project Setup

Before we dive into the code, this guide assumes you have a project set up with an HTML file linked to a JavaScript file. This guide focuses solely on integrating the Agora Web SDK and setting up the custom video track from the <canvas> element.

To add the Agora Web SDK to your project, open your terminal, navigate to your project directory, and run the following command:

npm install agora-rtc-sdk-ng

Initializing the Canvas + Video Element

The Agora Web SDK manages audio and video tracks separately, making it simple to tailor a custom video element to your application’s specific needs. In a previous guide, we explored the initial steps to set up an Agora video stream with a <video/> element. We’ll build on this by rendering the <video/> onto a <canvas> element.

Use the Agora SDK to initialize the camera track and wrap it in a MediaStream object, the browser’s container for audio/video data streams. Set the MediaStream as the srcObject of a <video/> element. This gives us a <video/> element we can work with in code, but since it’s never added to the DOM, it doesn’t appear on-screen.

To draw the video stream onto the canvas, we need a <canvas/> element and access to its drawing context. Create a <canvas/> element and call canvas.getContext('2d') to get the 2D drawing context. Before we can draw any frames, we need to make sure the video is playing, so add an event listener to the video element for the 'play' event.

As soon as the video begins to play, we’ll use the canvas’s 2D rendering context to draw the current video frame onto the canvas. We’ll create a recursive loop with requestAnimationFrame to update the canvas with the latest video frame.

import AgoraRTC from 'agora-rtc-sdk-ng'

document.addEventListener('DOMContentLoaded', async () => {
  // Init the local mic and camera
  const [audioTrack, videoTrack] = await AgoraRTC.createMicrophoneAndCameraTracks({
    audioConfig: 'music_standard',
    videoConfig: '360p_7'
  });

  // Create a video element (never added to the DOM, so it stays off-screen)
  const video = document.createElement('video')
  video.setAttribute('webkit-playsinline', 'webkit-playsinline');
  video.setAttribute('playsinline', 'playsinline');
  // Create a new MediaStream using the camera track and set it as the video's source object
  video.srcObject = new MediaStream([videoTrack.getMediaStreamTrack()])
  // Wait for the source to finish loading
  video.addEventListener("loadeddata", () => {
    video.play() // start video playback
  })

  // Create a canvas and add it to the DOM
  const canvas = document.createElement('canvas')
  document.body.appendChild(canvas)

  // Draw the video to the canvas when video playback starts
  video.addEventListener("play", () => {
    // Match the canvas size to the video so frames aren't squashed
    // into the default 300x150 canvas
    canvas.width = video.videoWidth
    canvas.height = video.videoHeight
    const ctx = canvas.getContext("2d")
    // Create a loop to render the video to the canvas
    function drawVideoToCanvas() {
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      requestAnimationFrame(drawVideoToCanvas);
    }
    requestAnimationFrame(drawVideoToCanvas);
  });
})

With the live video stream rendering on the <canvas>, we can draw on and manipulate it as needed — for example, stamping an overlay onto each frame, as sketched below.
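Here’s a minimal sketch of that idea, drawing a semi-transparent text watermark on every frame. The watermark text, font, and position are illustrative choices, not part of the Agora SDK; the function is a drop-in replacement for the drawVideoToCanvas loop above.

// Inside the 'play' handler: draw the current frame, then
// stamp a watermark on top of it before scheduling the next frame
function drawVideoToCanvas() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  ctx.font = '16px sans-serif';
  ctx.fillStyle = 'rgba(255, 255, 255, 0.6)'; // 60%-opacity white
  ctx.fillText('Powered by Agora', 10, canvas.height - 10); // bottom-left corner
  requestAnimationFrame(drawVideoToCanvas);
}

Because the overlay is drawn into the canvas itself, it becomes part of the captured stream and gets published along with the rest of the frame.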

Create a custom Agora Video Track

To use the <canvas> as a custom Agora Video Track, we need to create a media stream from the <canvas>, and wrap its video track for use with the Agora Web SDK.

Call canvas.captureStream() to set up a MediaStream object containing a real-time video capture of the canvas’s content. The captureStream function takes an integer frame rate as an argument, which controls how often the canvas content is sampled. We used the 360p_7 video preset to initialize our video stream, which set the initial video track’s frame rate to 30 fps, so set the capture stream to match.

// Define the stream from the canvas
const canvasStream = canvas.captureStream(30);

With the canvas now captured as a MediaStream, extract the video track using canvasStream.getVideoTracks()[0]. This returns the first (and typically only) video track from the canvas stream. Use this video track as part of the config when initializing a custom Agora Video Track. This wraps the video track, making it compatible with the Agora Web SDK's requirements for publishing streams.

// Define the stream from the canvas
const fps = 30
const canvasStream = canvas.captureStream(fps)
// Use the video track from canvasStream to create a custom Agora video track
const canvasVideoTrack = AgoraRTC.createCustomVideoTrack({
  mediaStreamTrack: canvasStream.getVideoTracks()[0],
  frameRate: fps
})
// Publish the audio and canvas tracks ('client' is the AgoraRTCClient created in the full example below)
await client.publish([audioTrack, canvasVideoTrack])

Putting it All Together

We’ve covered each step individually; now let’s put it all together.

It starts with initializing an Agora video track using the camera as its source. That track is wrapped in a MediaStream object, which is set as the source of an off-screen <video/> element, where playback begins.

From this <video/> element, frames are continually drawn onto a <canvas> element in a requestAnimationFrame loop, which allows for real-time frame processing. The <canvas> element generates a new video stream (canvasStream) using its captureStream() method, and the stream’s first video track is extracted.

The extracted track is used to create a custom Agora video track (canvasVideoTrack). Finally, this custom track is published using the Agora client, making it available for broadcasting or further use in Agora’s ecosystem.

import AgoraRTC from 'agora-rtc-sdk-ng'

// Create the Agora client
const client = AgoraRTC.createClient({
  codec: 'vp9',
  mode: 'live',
  role: 'host'
})

document.addEventListener('DOMContentLoaded', async () => {
  // Join a channel before publishing (replace the placeholders with your
  // Agora App ID, channel name, and token - pass null if tokens are disabled)
  await client.join('<YOUR_APP_ID>', '<CHANNEL_NAME>', '<TOKEN_OR_NULL>')

  // Init the local mic and camera
  const [audioTrack, videoTrack] = await AgoraRTC.createMicrophoneAndCameraTracks({
    audioConfig: 'music_standard',
    videoConfig: '360p_7'
  });

  // Create a video element (never added to the DOM, so it stays off-screen)
  const video = document.createElement('video')
  video.setAttribute('webkit-playsinline', 'webkit-playsinline');
  video.setAttribute('playsinline', 'playsinline');
  // Create a new MediaStream using the camera track and set it as the video's source object
  video.srcObject = new MediaStream([videoTrack.getMediaStreamTrack()])
  // Wait for the source to finish loading
  video.addEventListener("loadeddata", () => {
    video.play() // start video playback
  })

  // Create a canvas and add it to the DOM
  const canvas = document.createElement('canvas')
  document.body.appendChild(canvas)

  // Draw the video to the canvas when video playback starts
  video.addEventListener("play", () => {
    // Match the canvas size to the video so frames aren't squashed
    canvas.width = video.videoWidth
    canvas.height = video.videoHeight
    const ctx = canvas.getContext("2d")
    // Create a loop to render the video to the canvas
    function drawVideoToCanvas() {
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      requestAnimationFrame(drawVideoToCanvas);
    }
    requestAnimationFrame(drawVideoToCanvas);
  });

  // Define the stream from the canvas
  const fps = 30
  const canvasStream = canvas.captureStream(fps)
  // Use the video track from canvasStream to create a custom Agora video track
  const canvasVideoTrack = AgoraRTC.createCustomVideoTrack({
    mediaStreamTrack: canvasStream.getVideoTracks()[0],
    frameRate: fps
  })
  // Publish the audio and canvas tracks
  await client.publish([audioTrack, canvasVideoTrack])
})

Next Steps

And there you have it: how to use the canvas as a video track with Agora’s Video SDK for Web. This example is a great base for building more complex features and interactions, whether for engaging webinars, interactive education platforms, or any other application where live video plays a key role. Feel free to tweak, transform, and take this code to new heights — one simple tweak is sketched below.
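For example, here’s a minimal sketch of a per-frame effect: a grayscale filter applied with the standard getImageData/putImageData canvas APIs. It assumes the ctx, canvas, and video variables from the draw loop above, and the channel-averaging formula is just one simple way to desaturate.

// Drop-in replacement for drawVideoToCanvas: draw the frame,
// then convert it to grayscale before it is captured
function drawVideoToCanvas() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
  const px = frame.data; // RGBA bytes, 4 per pixel
  for (let i = 0; i < px.length; i += 4) {
    const avg = (px[i] + px[i + 1] + px[i + 2]) / 3;
    px[i] = px[i + 1] = px[i + 2] = avg; // leave alpha untouched
  }
  ctx.putImageData(frame, 0, 0);
  requestAnimationFrame(drawVideoToCanvas);
}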

Other Resources

  • Dive into the Agora Documentation to better understand the features and capabilities of the Agora SDK. Explore the API reference, sample code, and best practices.
  • Be part of the Agora developer community: join the conversation on X (Twitter) or LinkedIn to share experiences and stay updated on the latest developments.
  • Need support? Reach out via StackOverflow for advice on your implementation.
