Running Nginx with WebAssembly
This is the story of our journey of creating the first WebAssembly runtime able to run Nginx.
We started Wasmer with the goal of running standalone applications on any platform or architecture, so that developers can focus on what really matters: reaching a bigger market faster with less effort.
For that, we realized that WebAssembly is the ideal technology to leverage:
- It can compile quickly into efficient machine code.
- Given a well-implemented runtime, the guest is completely sandboxed from its host.
WebAssembly has matured enough that this is the best time to build something that will significantly improve developers’ lives.
Wasmer is the first native WebAssembly runtime able to run most (soon all) WebAssembly modules compiled with Emscripten (including Nginx) regardless of the host system: Linux, OpenBSD or Mac (and soon Windows!).
Here’s the story of how everything happened:
Step I: Reviewing Existing Runtimes
Before typing a line of code, we spent quite some time analyzing the different WebAssembly runtime implementations that already existed, in order to see if we could either reuse them or perhaps take inspiration from their approach.
Here are some of the most relevant, and the conclusions we got after analyzing and using them:
- WAVM: a well-written C++ project that passes the WebAssembly spec tests. After trying it (and submitting a few PRs) it turned out to be slower than we expected, perhaps because LLVM doesn’t have very fast compilation times.
- Wasmjit: it promised something similar to our goals, but after digging in and trying to contribute we realized that all the architecture-specific instructions were hardcoded into the runtime itself. This meant a very steep process to implement WebAssembly instructions, and a considerable effort to target other architectures (basically the same effort as creating an IR engine like LLVM by hand). And sadly, the library didn’t have a single test.
- SpiderMonkey (Firefox): Firefox is currently the browser with the fastest WebAssembly execution engine. It uses two tiers for WebAssembly compilation: a baseline compiler (faster to compile, slower to execute) and IonMonkey (a bit slower to compile, but with much faster execution).
After analyzing SpiderMonkey, we discovered a small project inside of Mozilla created as part of their Rustification effort (perhaps because it could eventually be used in Servo): Cranelift.
Cranelift is an IR library that abstracts the hard parts of generating machine code from WebAssembly functions (similar to LLVM, but focused on WebAssembly).
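To give a flavor of what an IR layer like Cranelift abstracts away, here is a self-contained sketch. All the types and names below are invented for illustration (this is not Cranelift’s actual API): it lowers a stack-based, WebAssembly-style instruction sequence into a register-based, three-address IR, which is the kind of form a backend can then turn into machine code.

```rust
// Illustrative sketch only: lower stack-machine ops into a three-address IR.
// These enums are invented for clarity; they are NOT Cranelift's real API.

#[derive(Debug, Clone, Copy, PartialEq)]
enum WasmOp {
    I32Const(i32),
    I32Add,
}

#[derive(Debug, PartialEq)]
enum IrInst {
    LoadImm(usize, i32),     // dest register, immediate value
    Add(usize, usize, usize), // dest, lhs, rhs
}

// Translate stack-machine ops into three-address IR,
// allocating a fresh virtual register for every produced value.
fn lower(ops: &[WasmOp]) -> Vec<IrInst> {
    let mut stack = Vec::new(); // virtual registers modeling the wasm value stack
    let mut next_reg = 0;
    let mut out = Vec::new();
    for op in ops {
        match *op {
            WasmOp::I32Const(v) => {
                out.push(IrInst::LoadImm(next_reg, v));
                stack.push(next_reg);
                next_reg += 1;
            }
            WasmOp::I32Add => {
                let rhs = stack.pop().expect("stack underflow");
                let lhs = stack.pop().expect("stack underflow");
                out.push(IrInst::Add(next_reg, lhs, rhs));
                stack.push(next_reg);
                next_reg += 1;
            }
        }
    }
    out
}

fn main() {
    // (i32.const 2) (i32.const 3) i32.add
    let ir = lower(&[WasmOp::I32Const(2), WasmOp::I32Const(3), WasmOp::I32Add]);
    println!("{:?}", ir);
}
```

A real IR library does vastly more (register allocation, instruction selection per target, optimizations), which is exactly the work we didn’t want to redo by hand.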
Note: there were a lot of other interesting approaches, such as warpy (Python), nebulet (Rust), wasmi (Rust), wasmtime (Rust), wagon (Go), Life (Go)… that we also analyzed. But we want to keep this article concise and just wanted to cover the most relevant ones.
Step II: Creating our own runtime
In the end, each of these alternatives was built with different tradeoffs in mind. For us, the most important requirements were:
- It should be easy to maintain and develop
- It should be fast, really fast
- It should support Emscripten, Rust and Go ABIs (so we can run Nginx, Python, ffmpeg, …)
We concluded that none of the alternatives fit those needs.
So we embarked on the journey of creating our own runtime, written in Rust and using Cranelift as our IR library.
Step III: Being Spec Compliant
It’s hard to build a WebAssembly engine without tests (how will you know when something is broken?), so one of our first goals was to create a test suite assuring that everything worked the way it should.
Thankfully for us, the WebAssembly group already provides a number of tests that should pass in any WebAssembly runtime (and that are used by almost all browsers to verify their implementations).
So, inspired by the greenwasm spec test implementation, we created a .wast (WebAssembly text script) to .rs (Rust) transpiler, so that our tests would be autogenerated from the official WebAssembly spec tests.
After working on this for a few weeks, our implementation passed almost 100% of the official WebAssembly spec tests, with a few exceptions made to speed up the release.
We were then ready for our next step …perhaps the most important one.
Step IV: Emscripten
Our WebAssembly engine was ready: we were spec compliant, with an architecture we felt comfortable using. Where should we start? Nginx directly?
Emscripten is a great tool that lets you compile C and C++ programs into WebAssembly. We thought about running Nginx because it’s written in plain C and can be built without any extra dependencies, so we could generate a wasm file from an unmodified version of Nginx.
If we had tackled Nginx directly we would probably have ended up frustrated and unproductive. Nginx makes a lot of syscalls that we would need to wrap and recreate in order to get it working.
So we started with a smaller project: running a simple Python file translated to C (using RPython) and then compiled to WebAssembly with Emscripten. It would exercise enough syscalls to give us confidence in creating a framework we could sustain.
And we got it running!
After successfully running our examples/pypyjs.wasm trial, we were ready to take on Nginx!
So we started mapping almost all the syscalls that the Emscripten build of Nginx needed to run properly.
However, this path was quite challenging, as it required transforming structs back and forth between the WebAssembly side and the underlying implementation (libc).
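To make the struct-translation problem concrete, here is a self-contained sketch (our own illustration, not wasmer’s actual code): a wasm32 guest lays out a hypothetical 32-bit `struct timeval` as two consecutive little-endian i32 fields in linear memory, and the host must rebuild an equivalent native struct with wider fields.

```rust
// Sketch of translating a guest struct out of wasm linear memory.
// The two-i32 layout below models a hypothetical 32-bit `struct timeval`;
// HostTimeval is an illustrative host-side type, not wasmer's real one.

use std::convert::TryInto;

#[derive(Debug, PartialEq)]
struct HostTimeval {
    tv_sec: i64,
    tv_usec: i64,
}

// Read a little-endian i32 out of guest linear memory.
fn read_i32(memory: &[u8], offset: usize) -> i32 {
    let bytes: [u8; 4] = memory[offset..offset + 4].try_into().unwrap();
    i32::from_le_bytes(bytes)
}

// Translate the guest's 8-byte struct at `ptr` into the host layout,
// widening each 32-bit field to the host's 64-bit representation.
fn read_guest_timeval(memory: &[u8], ptr: usize) -> HostTimeval {
    HostTimeval {
        tv_sec: read_i32(memory, ptr) as i64,
        tv_usec: read_i32(memory, ptr + 4) as i64,
    }
}

fn main() {
    // Simulate guest memory holding { tv_sec: 10, tv_usec: 500 } at offset 16.
    let mut memory = vec![0u8; 64];
    memory[16..20].copy_from_slice(&10i32.to_le_bytes());
    memory[20..24].copy_from_slice(&500i32.to_le_bytes());
    println!("{:?}", read_guest_timeval(&memory, 16));
}
```

Every syscall that passes a pointer to a struct needs a pair of translations like this (guest to host on the way in, host to guest on the way out), which is what made this step so laborious.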
After almost two weeks of trying, a lot of coffee and a great team… success!
Want to try Nginx locally? You are just three commands away!
1. Install Wasmer
curl https://get.wasmer.io -sSfL | sh
2. Download the Nginx example
git clone https://github.com/wasmerio/wasmer-nginx-example.git
3. Run Nginx with wasmer! 🎉
wasmer run nginx.wasm -- -p . -c nginx.conf
Et voilà: you will have your server running at http://localhost:8080/
Hope you enjoyed reading this story as much as we did working on it!