Making a User Media Preview Component
--
I recently had the need to build a User Media Preview component for a client’s application.
Essentially, this component displays a list of audio and video sources and allows the user to choose from the lists and preview the selection instantly. Then, a property on the component allows a consumer to get a media constraints object that can be passed to getUserMedia or a WebRTC framework, like Janus, for example.
In this article, I will run through the steps to create this component using DoneJS and CanJS, but the fundamental concepts I will explain in this article can be applied to just about any framework you want to use, or even plain JavaScript if that’s your jam. Alternatively, you can simply use the DoneJS plugin I created for this article.
Introduction to the MediaDevices APIs
Before we get started, we should talk about a few browser APIs: enumerateDevices, getUserMedia, and the devicechange event.
First off: enumerateDevices returns a Promise that resolves with a list of the available media devices connected to the computer, such as built-in or USB-connected cameras, microphones, and speakers.
Note that the label property will always be an empty string if the user hasn’t granted access to their media yet; this is for security purposes.
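A minimal sketch of listing the devices might look like this:

```js
navigator.mediaDevices.enumerateDevices().then(devices => {
  devices.forEach(device => {
    // Each device has a kind, a label, and a deviceId
    console.log(`${device.kind}: ${device.label}`);
  });
});
```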
Running the above code in a browser will output something like:
audioinput: Default - Internal Microphone (Built-in)
audioinput: Internal Microphone (Built-in)
videoinput: FaceTime HD Camera
audiooutput: Default - Internal Speakers (Built-in)
audiooutput: Internal Speakers (Built-in)
We can use this to populate the UI with a list of available input options.
Next up: devicechange is an event we can listen for on the navigator.mediaDevices object. It will be fired when the available user media changes, like when the user connects a new webcam or disconnects an existing one.
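Listening for it is straightforward; a minimal sketch:

```js
navigator.mediaDevices.addEventListener('devicechange', () => {
  // Fires whenever a media device is plugged in or removed
  console.log('Changed!');
});
```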
The above code will log Changed! when a device is added or removed. If you plug in a device that has two sources, like a webcam with a built-in mic, this event will fire twice. We can use this to know when our device list should be updated without polling.
Finally: getUserMedia takes a media constraints object and returns a Promise that resolves with a MediaStream instance representing whatever user media was specified in the constraints.
Note that once the user gives permission to use their media devices, the webcam light will turn on indicating it is in use.
We can attach the resolved stream to a <video> tag to display the selected device’s media content to the user.
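For example, assuming a <video> element with an id of preview already on the page (the id is just for illustration):

```js
const constraints = {
  audio: true,
  video: true
};

navigator.mediaDevices.getUserMedia(constraints).then(stream => {
  // Attach the stream to the (hypothetical) #preview element so it plays in the page
  document.querySelector('#preview').srcObject = stream;
});
```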
Putting It All Together
We’re going to start out by making a new CanJS component; you can use the DoneJS CLI to scaffold a basic component for you.
To start out, my JavaScript file is just the scaffolded shell of the component.
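It looks roughly like this (the user-media-preview tag name and the import paths are simply what I’m using for this example):

```js
import Component from 'can-component';
import DefineMap from 'can-define/map/map';
import view from './user-media-preview.stache';
import './user-media-preview.less';

export const ViewModel = DefineMap.extend({
  // view model properties and methods will go here
});

export default Component.extend({
  tag: 'user-media-preview',
  ViewModel,
  view
});
```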
We’ll take a look at the .stache and .less files later; we’re going to build the view model first. Let’s start by adding a connectedCallback method. This is a lifecycle hook built into can-component. It’s called when the component is inserted into the DOM and may return a teardown function that will be called when the component is removed from the DOM.
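Here’s a sketch of my connectedCallback (it follows the walkthrough below; the published plugin may differ in small details):

```js
export const ViewModel = DefineMap.extend({
  // (devices, previewStream, constraints, etc. are defined in the next section)

  connectedCallback() {
    // Fetch the device list and attach it to the view model
    const deviceChangeHandler = () => {
      navigator.mediaDevices.enumerateDevices().then(devices => {
        this.devices = devices;
      });
    };

    // Initialize the devices property right away
    deviceChangeHandler();

    // Keep the list up to date when devices are added or removed
    navigator.mediaDevices.addEventListener('devicechange', deviceChangeHandler);

    // Whenever constraints changes, get a fresh preview stream
    this.listenTo('constraints', (event, constraints) => {
      navigator.mediaDevices.getUserMedia(constraints).then(stream => {
        this.previewStream = stream;
      });
    });

    // Teardown: unbind our listeners when the component is removed
    return () => {
      navigator.mediaDevices.removeEventListener('devicechange', deviceChangeHandler);
      this.stopListening();
    };
  }
});
```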
First, we create a function called deviceChangeHandler that gets a list of devices and attaches them to the devices property on the view model; we immediately call it to initialize this property.
Next, we set up an event listener on devicechange and provide our deviceChangeHandler function as a handler.
We also set up an event listener for constraints on the view model instance with the listenTo method. This will fire any time the constraints property changes; its handler uses the getUserMedia API to get a MediaStream with the provided constraints and sets it to the previewStream property on the view model. Note we haven’t added a constraints property just yet.
Finally, we return a function that, when called, unbinds our event listeners.
Now we’re going to need to add some properties and getters.
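Here’s a rough sketch of those (the exact typing details may differ from the published plugin, but the property names match what’s described below):

```js
export const ViewModel = DefineMap.extend({
  devices: 'any',
  selectedVideoDevice: 'string',
  selectedAudioDevice: 'string',
  previewStream: 'any',

  // Build a media constraints object from the selected devices
  get constraints() {
    return {
      video: { deviceId: this.selectedVideoDevice },
      audio: { deviceId: this.selectedAudioDevice }
    };
  },

  // Video inputs only; default the selection to the first one found
  get videoDevices() {
    const devices = (this.devices || []).filter(device => device.kind === 'videoinput');
    if (devices.length && !this.selectedVideoDevice) {
      this.selectedVideoDevice = devices[0].deviceId;
    }
    return devices;
  },

  // Audio inputs only; default the selection to the first one found
  get audioDevices() {
    const devices = (this.devices || []).filter(device => device.kind === 'audioinput');
    if (devices.length && !this.selectedAudioDevice) {
      this.selectedAudioDevice = devices[0].deviceId;
    }
    return devices;
  },

  connectedCallback() {
    // ...same as shown above
  }
});
```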
Most of these properties and getters are pretty simple and self-explanatory, so I won’t go into too much detail, but there are a couple of things worth mentioning.
The constraints getter will build a constraints object using the selectedVideoDevice and selectedAudioDevice properties and return it. The cool thing about this is that we can listen for changes to constraints, and if any of its dependencies change, the event will fire with the new computed value of constraints. This means that when we update selectedVideoDevice or selectedAudioDevice, the constraints listener we set up in connectedCallback will fire. This magic is built into DefineMap; plain objects do not work like this.
You’ll also see getters for the audio and video devices; they filter the devices property and return a list of applicable devices. Notice they also take the first item in the list and set it to selectedVideoDevice or selectedAudioDevice accordingly, which makes our component default to the first available device.
Now that we’ve created our view model, we’re going to move on to the view, which we’ll write in our .stache file.
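Here’s a sketch of the template (I’m writing the bindings from memory, so double-check the exact can-stache-bindings syntax, especially for srcObject, against the docs):

```html
<video autoplay muted srcObject:from="previewStream"></video>

<select value:bind="selectedVideoDevice">
  {{#each(videoDevices)}}
    <option value="{{deviceId}}">{{label}}</option>
  {{/each}}
</select>

<select value:bind="selectedAudioDevice">
  {{#each(audioDevices)}}
    <option value="{{deviceId}}">{{label}}</option>
  {{/each}}
</select>
```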
First, we set up a binding from the previewStream property on the view model to the srcObject property on the <video> element. This means that when previewStream is set, the srcObject property on the element will also be set.
Next, we set up a two-way binding between the value of the <select> element and the selectedVideoDevice property on the view model. This means that when we select an option from the list in the UI, the value will be set to the selectedVideoDevice property on the view model. Also, when we set the selectedVideoDevice property in the view model, the <select> will be set to this value.
We then use the each helper in can-stache to iterate through our list of videoDevices and render an <option> for each item. This list will also automatically re-render if videoDevices changes. Further down, we repeat this process for audioDevices.
Finally, we’re going to add some basic styles in our .less file just so the component isn’t too terribly ugly. This article isn’t about styling or Less, and I am not a designer or CSS expert, so I’m not going to explain this.
At this point, we should be able to test out our component and see it working. I made a simple demo page you can steal to make testing this super easy. Just drop the following snippet in an HTML file and serve it up from your favorite dev server; I use http-server.
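Mine looks roughly like this (adjust the steal path and the main module name to match your project):

```html
<!doctype html>
<html>
<head>
  <meta charset="utf-8">
  <title>User Media Preview Demo</title>
</head>
<body>
  <!-- Our component's custom element -->
  <user-media-preview></user-media-preview>

  <!-- StealJS loads the component module and its dependencies -->
  <script src="./node_modules/steal/steal.js" main="user-media-preview"></script>
</body>
</html>
```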
And bam! Our media selector is working! Note that you’ll need StealJS installed in your project for this demo to work.
Wrapping things up
I hope you found this article helpful. If you have any questions or see room for improvements in this article, please feel free to comment or reach out to me on Twitter @imaustink.
Again, if you are feeling lazy or just in a hurry, go ahead and clone/fork my media selector on GitLab or install it via npm.
Some of the topics I am considering covering in my next article are:
- Adding an audio meter to our preview
- Integrating our preview into a WebRTC app
- Integrating our preview into an Electron webcam recording app
Let me know in the comments what you’d like me to write about first!
If this article was informative or helpful, please give it a few claps, and thank you very much for reading it to the end!