Streaming Electron for Fun and Profit

Ben
5 min read · Jul 29, 2019


How we used Azure, Electron, and WebRTC to bring full-fledged browser app experiences to low-powered devices

Stream in Nature, Photo by Hendrik Cornelissen on Unsplash

Hi! I’m Ben Greenier, an engineer at Microsoft working to create awesome open source projects with our partners. We build software that helps solve really tricky problems, and we share our stories as we go. As part of my job, I get to develop with lots of new technologies, learn how to use them, and help other folks do the same.

In this post, I’ll share a recent sample my team and I built that enables folks to stream Electron apps from the cloud to any internet-connected device. I’ll provide some context around the problem we’re solving, outline our overall approach, and share our open-sourced components with y’all. In future posts, I’ll dive deeper into the specifics of our technology stack, how this might be hosted, and some of the challenges we ran into along the way.

Why we built this

We worked with a partner that needed to run browser-based experiences but didn’t want to run a browser on their devices. That meant running the experiences elsewhere and finding a way to present the visuals to the customer. We knew from previous work that WebRTC is great for creating durable streams of audio, video, and data, so we decided to use it here. WebRTC let us stream the experience’s visuals over the internet to a lightweight native client with just enough logic to render those visuals and handle user interaction. That same minimal client sends the user’s interaction back to the cloud, where we replay it as events inside the experience.
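To make that a bit more concrete, here’s a minimal sketch of what the viewer side could look like, written against the browser WebRTC APIs purely for illustration. The client in our sample is a lightweight native app, the ICE server and channel name below are placeholders, and offer/answer signaling is omitted entirely:

```typescript
// Hypothetical viewer sketch: render the app's remote video stream and send
// local input back over a data channel. Offer/answer signaling is omitted.
const peer = new RTCPeerConnection({
  // TURN servers from a service like Twilio NTS would normally go here.
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Show the video track streamed from the cloud-hosted Electron app.
peer.ontrack = (event) => {
  const video = document.querySelector("video");
  if (video) {
    video.srcObject = event.streams[0];
  }
};

// Forward user interaction (mouse clicks, in this sketch) back to the cloud.
const input = peer.createDataChannel("input");
document.addEventListener("mousedown", (e) => {
  if (input.readyState === "open") {
    input.send(JSON.stringify({ type: "mouseDown", x: e.clientX, y: e.clientY }));
  }
});
```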

With the overall strategy settled, we needed to figure out how we’d host these experiences in the cloud. We settled on ACI (Azure Container Instances), as it provides low-cost containers that are easy to manage. We tested a number of different approaches for hosting the app experience inside the container, and running Electron proved to be the best balance of the features and performance we wanted. It also came with a huge benefit: we could use the browser’s existing WebRTC bindings, which made developing the streaming component much more straightforward.

What we built

The core technologies we used: Chromium + WebRTC + Electron

With the goal being a containerized Electron host that can load browser apps and stream them using WebRTC, we started engineering. The sample we ended up with consists of three application processes — one that manages the overall application lifecycle, one that manages the application experience that will be seen by the customer, and one that manages WebRTC streaming aspects. Here’s a brief visual overview:

A high-level process layout of what we built, numbered to indicate the order of operations.

We’ll refer to our application lifecycle manager as the “main process” since it runs first, and orchestrates the other processes. This main process is fairly lightweight — it configures Electron to securely load remote content, parses any configuration or options given at runtime, and creates the other processes and their windows.
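As a rough illustration (not the sample’s exact code), a main process along these lines would read the configured URL, lock down remote content, and create the other windows. APP_URL, the window sizes, and the streamer file names here are placeholders:

```typescript
// Illustrative main process sketch; APP_URL and the file names stand in for
// whatever configuration is provided at runtime.
import { app, BrowserWindow } from "electron";
import * as path from "path";

const APP_URL = process.env.APP_URL ?? "https://example.com";

app.whenReady().then(() => {
  // Window that loads the remote app experience the customer will see.
  const appWindow = new BrowserWindow({
    width: 1280,
    height: 720,
    webPreferences: {
      nodeIntegration: false, // remote content never gets Node access
      contextIsolation: true,
    },
  });
  appWindow.loadURL(APP_URL);

  // Hidden window that hosts the WebRTC streamer experience.
  const streamerWindow = new BrowserWindow({
    show: false,
    webPreferences: { preload: path.join(__dirname, "streamer-preload.js") },
  });
  streamerWindow.loadFile("streamer.html");
});
```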

The process that starts next contains the application experience that will be seen by the customer. We’ll refer to it as the “App experience”, since loading and visualizing the pre-configured remote app is its only task. It’s important to note that in our solution, an app is simply any online resource that can be rendered in a browser and reached via a configurable URL.

Finally, the WebRTC streaming process is referred to as the “Streamer experience” as that’s all it provides — a WebRTC channel that captures the app experience and streams it over the internet, as well as an input processor that allows a remote viewer to transmit input to the app experience. It’s worth mentioning that in our sample, this window is usually visually hidden, as showing it adds no value to the customer.
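Here’s a simplified sketch of what that streamer could look like. It assumes desktopCapturer and ipcRenderer are reachable from the streamer window (for example via a preload script, which depends on your Electron version), and the window name, IPC channel name, and peer connection setup are placeholders rather than the sample’s actual values:

```typescript
// Illustrative streamer sketch, running inside the hidden streamer window.
// Assumes desktopCapturer/ipcRenderer are reachable here (e.g. via a preload);
// the window name, channel name, and peer setup are placeholders.
import { desktopCapturer, ipcRenderer } from "electron";

async function startStreaming(peer: RTCPeerConnection): Promise<void> {
  // Find the window hosting the app experience and capture it as a MediaStream.
  const sources = await desktopCapturer.getSources({ types: ["window"] });
  const appSource = sources.find((s) => s.name === "App experience") ?? sources[0];

  const stream = await navigator.mediaDevices.getUserMedia({
    audio: false,
    video: {
      // Chromium-specific constraints Electron uses for desktop/window capture.
      mandatory: {
        chromeMediaSource: "desktop",
        chromeMediaSourceId: appSource.id,
      },
    } as any,
  });

  // Stream the captured visuals to the remote viewer over WebRTC.
  stream.getTracks().forEach((track) => peer.addTrack(track, stream));

  // Relay input sent back by the viewer to the main process for injection.
  peer.ondatachannel = ({ channel }) => {
    channel.onmessage = (msg) => ipcRenderer.send("remote-input", JSON.parse(msg.data));
  };
}
```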

To summarize, we’ve got three processes that communicate with each other, each responsible for its own tasks. This separation of concerns keeps our codebase neat and modular while still letting us accomplish our goals.
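As one example of that communication, the main process could pick up the input forwarded by the streamer over Electron IPC and replay it into the app experience window. The channel name and event shape below are illustrative, carried over from the earlier sketch:

```typescript
// Illustrative main-process handler: inject viewer input into the app window.
// "remote-input" matches the placeholder channel name from the previous sketch.
import { ipcMain, BrowserWindow } from "electron";

function wireRemoteInput(appWindow: BrowserWindow): void {
  ipcMain.on("remote-input", (_event, input: { type: string; x: number; y: number }) => {
    if (input.type === "mouseDown" || input.type === "mouseUp") {
      // Electron can synthesize input events directly into a window's web contents.
      appWindow.webContents.sendInputEvent({
        type: input.type as "mouseDown" | "mouseUp",
        x: input.x,
        y: input.y,
        button: "left",
        clickCount: 1,
      });
    }
  });
}
```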

How much does running this sample cost?

We chose Azure to host this sample, using ACI (Azure Container Instances) to run the containers that house these Electron-based remote experiences. With each container running a single instance of an experience, and with Twilio’s NTS (Network Traversal Service) providing consistent, reliable STUN/TURN connectivity, we were able to keep costs fairly low. Other providers offer services similar to Twilio’s, so it may be possible to lower costs even further.

Our rough expected cost per hour per user (in East US data centers) is ~$0.08. That’s ~$0.04 for a single ACI node, and ~$0.04 for 1 GB of data using Twilio’s NTS service. We found using the smallest ACI nodes available worked just great, which really helped us keep costs low. However, your mileage may vary — for more powerful compute, you’ll be charged more.
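If you want to play with the numbers yourself, the math is simple. The rates below are just the rough figures quoted above, not official pricing, and the bandwidth assumption will vary with your stream quality:

```typescript
// Back-of-the-envelope estimate using the rough figures above (not official pricing).
const aciPerHour = 0.04; // ~$ per hour for one small ACI instance (East US)
const ntsPerGb = 0.04;   // ~$ per GB relayed through Twilio NTS
const gbPerHour = 1;     // assumed streaming bandwidth per user

const costPerUserHour = aciPerHour + ntsPerGb * gbPerHour; // ≈ $0.08
console.log(`~$${costPerUserHour.toFixed(2)} per user per hour`);
```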

What we’ve open sourced

We’ve taken the code and knowledge we’ve accrued and open-sourced it on GitHub. We hope it helps folks interested in similar problems hit the ground running and get a feel for the performance and cost of such a solution. We’re also continuing to learn and improve what we’ve built, and we’re tracking changes (bug fixes, feature requests, etc.) via the GitHub Issues page. Feel free to get involved over there and ask us more about our work.

browserd Open Source project logo

A quick closing reminder — in future posts, we’ll dive deeper into the specifics of our technology stack, how exactly we’ve hosted this, and some of the challenges we ran into while building this out. Be on the lookout for these in the coming weeks!

P.S. — We’re hiring. If you’re looking to join a team that helps Microsoft’s partners solve their trickiest problems (often with cutting edge tech) be sure to get in touch.

