In March 2015 I started a small pilot project to ship some Rust code in the Firefox web browser. It was the first Rust code to ship in Firefox, and the first deployment of Rust code to hundreds of millions of users around the world. This is my story.
Rust was (is!) a new language which began as a side project at Mozilla and has blossomed into a popular programming community. It aims to bridge the world of performance-critical application programming with the ergonomics and security of scripting languages. I’m hopeful that we’ll finally be able to replace C++ with something better. But first, we have to see if it will work.
To start we wanted something small and easy to revert to minimize the damage if things didn’t work out. And we didn’t want to waste our time re-writing useful code. So we chose the mp4 demuxer.
Parsing mp4 video files
It may seem like an odd choice. Video involves large volumes of data, so it has to be fast. Video playback is a pretty common thing, so it’s a core feature. Rust does produce fast code, but we didn’t want to rely on that for our test, so we chose a place to start where correctness matters more than performance.
I work in the Firefox media playback team, and we’d rejigged our mp4 parser several times already. We had looked around for a widely-used, open source library we could adopt and settled on stagefright. A billion Android handsets couldn’t be wrong, we thought.
It was a good permissive parser, and was straightforward to integrate, so it met our immediate needs. But with more experience we realized it wasn’t going to work for us. The code takes a very minimalist approach, which is good for handling all the non-compliant files one encounters on the Internet. Unfortunately, it was equally minimalist in its error checking, so we suspected it was full of security vulnerabilities. And since Firefox is a C++ application, we didn’t even have the benefit of Android’s JNI sandbox.
So we’d been replacing it piecemeal with from-scratch C++ code, and by the time we were looking for a Rust pilot project, we were only using it to parse the header and track metadata. We had a library with limited scope which we wanted to re-write anyway, and which wasn’t called in any inner loops. Perfect.
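To give a flavor of the work involved (this is an illustrative sketch, not our actual code): an mp4 file is a sequence of “boxes”, each beginning with a 32-bit big-endian size and a four-byte type code. Reading one safely might look like this in Rust:

```rust
// Sketch of mp4 box header parsing -- the kind of work the demuxer does.
// Real code must also handle 64-bit sizes, truncated files, and more.
fn read_box_header(buf: &[u8]) -> Option<(u32, [u8; 4])> {
    // A short buffer is a recoverable error, not an out-of-bounds read.
    if buf.len() < 8 {
        return None;
    }
    let size = u32::from_be_bytes([buf[0], buf[1], buf[2], buf[3]]);
    let kind = [buf[4], buf[5], buf[6], buf[7]];
    Some((size, kind))
}

fn main() {
    // A box claiming 32 bytes, of type "ftyp".
    let data = [0x00, 0x00, 0x00, 0x20, b'f', b't', b'y', b'p'];
    if let Some((size, kind)) = read_box_header(&data) {
        println!("box '{}', {} bytes", String::from_utf8_lossy(&kind), size);
    }
}
```

The bounds check that C code so often forgets is exactly what Rust makes unavoidable: slice indexing either succeeds or fails loudly, never reads past the end.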
I spent about six weeks learning Rust and writing the first version of the parser. Soon after, my colleague Matthew Gregan joined the project and got it up to the minimum feature set we needed. Later in the project another colleague, Alfredo Yang, stepped up to continue development of the library.
But that was the easy part. A large application like Firefox has complicated build and test automation, which has often grown organically over many years. Adding a new compiled programming language to that isn’t trivial.
In fact, I spent the better part of a year working almost full time on integration issues before we were able to ship the new code to all our users. That was more work than I or anyone else expected, I think. But it paid off, enabling a much larger effort to improve responsiveness and security in Firefox.
Rust is more mature now, so it is easier to try something like this today, but I wanted to share some of the experiences I had along the way.
Firefox supports quite a few major platforms, and volunteers keep it running on numerous others. Rust started out with broad support for the three most popular desktop platforms. That’s pretty good for the initial release of a compiler, but we had a lot of work to do to get Rust code shipping everywhere we wanted to.
Code generation was straightforward thanks to LLVM, but there was a lot of work to be done in the OS-support layers of the standard library, and getting configurations right.
The Rust team were very supportive, and did a tremendous amount of work to add support for Windows XP, Android, older MacOS, and MS Visual Studio linkage. Still, it was almost six months between shipping on our first platform (x86_64-linux) and our last (armv7-android).
Build systems are fun
Rust has a build system and package manager called `cargo`. It’s amazing, one of the best things about the Rust language ecosystem. It makes it easy to package software, maintain dependencies, and get a consistent build. However, it’s something of a world unto itself. By trying to handle everything, it made integrating Rust code into an established project more difficult.
Early foreign-function support in Rust concentrated on being able to wrap existing C libraries to quickly bootstrap functional Rust applications. While cargo had support for building external code and linking it into a Rust project early on, it didn’t really support the reverse.
Firefox also has a build system which wants to be in charge. It’s very large and complicated, and wants to control all the things. So getting the two to work together took changes to both and is still an ongoing project.
The simplest way to integrate Rust code into a C or C++ project is to use rustc’s (or cargo’s) support for generating static libraries. This links all the Rust code and its dependent libraries into a form which can be combined with the main body of the application like any other C library.
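As a sketch of what that looks like (the function name and signature here are hypothetical, not Firefox’s actual interface): a Rust function exported with an unmangled C ABI symbol, in a crate built with `crate-type = ["staticlib"]` in its Cargo.toml, can be declared in a C header and called like any other C function.

```rust
// Sketch: a Rust entry point exported with the C ABI so C++ callers can
// use the staticlib. Name and signature are hypothetical.
#[no_mangle]
pub extern "C" fn demo_is_ftyp(p: *const u8, len: usize) -> bool {
    if p.is_null() || len < 4 {
        return false;
    }
    // Safety: the caller guarantees `p` points to `len` readable bytes.
    let bytes = unsafe { std::slice::from_raw_parts(p, len) };
    bytes.starts_with(b"ftyp")
}
```

The unsafe block is confined to the boundary where we trust the C caller’s pointer; everything past that point gets Rust’s normal checks.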
That works well when you only have one bit of Rust. As things progress and you end up hooking up a piece here, a piece there, this approach doesn’t scale.
Each static library produced by the Rust compiler contains its own copy of the Rust standard library. Some linkers will cope with that, though it’s not ideal. Others will become confused and reject every Rust library after the first.
We hit this almost immediately. While we started with a small pilot project adding a single component, we wanted to have some unit tests for it. Unit tests run with a test harness which is linked to rendering engine code… which means we have Rust code from the rendering engine, and Rust code from the unit test.
We resolved this by building a Rust super-library that just re-exports the public interfaces of all the Rust modules we actually want available. Each top-level object in the Firefox build system defines its own version of this super-library, which it compiles to a static library and links in. This means there is only one copy of each Rust module and of the standard library, and the Rust compiler can fully optimize everything together without interference from the rest of the build system.
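Schematically, the super-library’s `lib.rs` is nothing but re-exports (the crate names here are illustrative, not the real module names):

```rust
// Super-library sketch: one crate pulls in every Rust component the
// build wants, so the compiler emits exactly one staticlib containing
// a single copy of the standard library.
extern crate mp4_metadata_parser;  // hypothetical: the mp4 parser's C API
extern crate another_component;    // hypothetical: any future Rust module

pub use mp4_metadata_parser::*;
pub use another_component::*;
```

Each top-level binary gets its own variant of this crate listing just the components it needs.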
This wasn’t entirely obvious, since it’s different from how most Rust projects work, but it does allow us to use cargo to build all the Rust code and take advantage of its features as a package and dependency manager.
There are still some issues to solve with this approach: for example, using cargo’s ‘workspaces’ feature to share build artifacts when constructing multiple static libraries. The unified build also makes it harder to vary configuration flags between different sections of the Rust code, for example if one module wants to always enable debug assertions and another doesn’t.
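A workspace manifest would let the different super-libraries share a single `Cargo.lock` and build directory, something like the following (the member paths are illustrative):

```toml
# Workspace sketch (paths illustrative): members share one Cargo.lock and
# one target/ directory, so common dependencies compile only once.
[workspace]
members = [
    "toolkit/library/rust",       # the main super-library
    "toolkit/library/test/rust",  # the unit-test variant
]
```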
Updating build automation
The final hurdle, and one specific to our setup but not unusual for large projects, was deploying Rust to our build and test infrastructure. Mozilla pioneered running tests as commits land in version control, which is now standard practice. But while services like Travis got Rust support relatively quickly, packaging and deploying a new compiler to our fleet of test machines and supporting all the build variants we create took quite a lot of time.
There were dozens of test configurations to plumb through, some requiring a Rust toolchain and some requiring its absence. We needed a way to package versions of the compiler and of tools like cargo for official builds, so we had a consistent basis for Firefox releases as they progressed through stabilization and testing.
Of course we already have to do this for the C++ and Java toolchains, platform SDKs and so on. But here Rust’s newness and active development posed a new challenge. We could never rely on its presence in base Docker or VM images, and a new stable release every six weeks (backwards-compatible with older code, but still new) meant we had to verify and update all that machinery more often than we were used to. I made a lot of people pretty grumpy on that score.
In a similar vein, Firefox has a bootstrap script to help developers set up a programming environment for working on the engine and application code. Rust’s installer is pretty great, with nvm-like features, but I still spent a fair chunk of time adding support for it there too, so developers uninterested in Rust could stay up to date without having to learn a new tool.
Snappy conclusion goes here
Those were some of the major hurdles I faced shipping a pilot project with a new compiled language in a major software application.
The Rust community is very helpful and supportive. They’ve done a great job developing a participatory culture, and it’s easy to ask questions, or for help, or to contribute to the language and tools. It was very exciting to see my beginner Rust code executed its first billion times.
Ultimately the technical hurdles in a project like this are not the limiting factor. Large organizations don’t adopt new ideas quickly, and much of the work involved convincing others this was a worthwhile experiment and mitigating issues that affected other developers’ work. In the end that, and the corresponding help I received from others, is what made the project a success. Thank you all.
Work goes on, of course. Many more people at Mozilla are learning the language. Colleagues have been (re)writing various utilities in Rust. We’re merging parts of the experimental servo browser engine into Firefox.
We’re building further infrastructure to maintain copies of Rust software packages in the Firefox source tree. Cargo makes it easy to share third-party code, but we don’t want our work blocked on external repositories, and sometimes we need to make customizations. There are interesting ideas about automated license checking and setting appropriate code review requirements for different pieces.
Only time will tell how successful Rust will be overall, but it’s off to a promising start. The Rust language is great, and it was exciting to be able to ship code written in it to a large audience. I hope Mozilla’s commitment will encourage others to adopt it for their own products and services. It’s time to make software better.