Playing around with WebAssembly

A few years ago, while working through a few challenges, I had the chance to create, from scratch, an application that included a benchmarking suite for FirefoxOS, entirely based on JavaScript, HTML5 and CSS.
Apart from architectural constraints, I wanted to rely on popular algorithms and strategies widely used in benchmarking, most of them written in C/C++.

While brainstorming a few ideas, I ran into Emscripten, an “LLVM-to-JavaScript compiler”. I cannot explain how happy I was when I saw tons of C code blowing up into hundreds and hundreds of JavaScript lines.
At that time, it was exactly what I needed and wanted, and I was quite impressed with the results.

A block generated by Emscripten

That was basically the last time I played around with it, but then WebAssembly got announced and I was up for the game again.

To be honest, I believe it was not clearly understood at the beginning, but the overall idea is, in my humble opinion, amazing, and it is stirring up lots of curiosity and interest.

So here I am, shuffling C code with JavaScript again and wondering how it works and how it performs.


I created a basic boilerplate/example to play with it and I was quite astonished by the results. Executing two similar algorithms sequentially (one written in JavaScript, one as WAT), I found that the WebAssembly code was 10x (TEN) times faster than the JavaScript one.
You might think: “Sure, your JavaScript code was not optimised or memoized… you didn’t consider loading time… memory… whatever…”. I agree, and I am totally conscious of this. BUT, it sounds great to have the ability to run performant code from JavaScript and, finally, get the “keys to the lower level”.

This is the main reason why I got excited about WebAssembly and wanted to give it a try ASAP.
I set up a simple project, using Webpack and wast-loader to bundle the WAT files directly.
Below you can have a look at the configuration file:

const { resolve } = require('path');

module.exports = {
  entry: {
    bundle: './src/index.js'
  },
  output: {
    filename: '[name].js',
    path: resolve(__dirname, 'dist'),
    publicPath: '/'
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: [/node_modules/],
        use: [{
          loader: 'babel-loader',
          options: { presets: ['es2015'] }
        }]
      },
      { test: /\.wat$/, loader: "wast-loader", exclude: /node_modules/ }
    ]
  }
};
With this configuration, I am generating a basic bundle with all the needed dependencies in it (including the WASM code, already bundled as a ready-to-use Buffer).
For instance, in this case I have a WAT file defining a simple factorial, which gets bundled this way:

/***/ "./src/fact.wat":
/***/ (function(module, exports, __webpack_require__) {

/* WEBPACK VAR INJECTION */(function(Buffer) {module.exports = Buffer.from('0061736d0100000001060160017f017f03020100070801046661637400000a1a0118002000410148044041010f0b2000200041016b10006c0f0b', 'hex');
/* WEBPACK VAR INJECTION */}.call(exports, __webpack_require__("./node_modules/buffer/index.js").Buffer))

/***/ }),

Cool. This way, we don’t have to spend time fetching WAT files and transforming them to WASM at runtime; we can just compile and instantiate the whole package and use the exported methods.
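To make this concrete, here is a minimal sketch (in Node.js syntax, outside of Webpack) instantiating the very same hex string shown in the bundle above. I am using the synchronous Module/Instance constructors for brevity, while my actual code uses the asynchronous WebAssembly.instantiate:

```javascript
// The bundled module is just this hex string wrapped in a Buffer.
const hex =
  '0061736d0100000001060160017f017f0302010007080104' +
  '6661637400000a1a0118002000410148044041010f0b2000' +
  '200041016b10006c0f0b';
const bytes = Buffer.from(hex, 'hex');

// Compile and instantiate synchronously (fine for tiny modules like this one).
const wasmModule = new WebAssembly.Module(bytes);
const instance = new WebAssembly.Instance(wasmModule);

const result = instance.exports.fact(10);
console.log(result); // 3628800
```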

Now that we are almost all set, we need an index.js and a WAT file containing the WebAssembly module.
A simple example to quickly sum two numbers looks like the following:

(module
  (func $add (param $lhs i32) (param $rhs i32) (result i32)
    get_local $lhs
    get_local $rhs
    i32.add)
  (export "add" (func $add)))

As you can see, the function is defined with “func” and bound to a name, along with the two expected parameters (including their type) and the expected result.
The operations are quite simple: the two defined parameters are pushed onto the stack, and i32.add performs the addition using both entries. The computed value gets returned to the caller.
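Because the WAT above is so small, its binary form is short enough to write out by hand. A sketch of calling the exported add from JavaScript (the bytes are hand-assembled to match the WAT, so treat them as illustrative):

```javascript
// Hand-assembled binary equivalent of the add module above.
const addBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                                // function 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                          // code section header, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b                     // get_local 0/1, i32.add, end
]);

const { exports } = new WebAssembly.Instance(new WebAssembly.Module(addBytes));
console.log(exports.add(2, 3)); // 5
```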

In this case, I decided to use a module which is exposing a method to do a basic factorial recursion:

(module
  (func $fact (param $n i32) (result i32)
    (if (i32.lt_s (get_local $n) (i32.const 1))
      (return (i32.const 1)))
    (return
      (i32.mul
        (get_local $n)
        (call $fact
          (i32.sub
            (get_local $n)
            (i32.const 1))))))
  (export "fact" (func $fact)))

Pretty neat isn’t it?

At this point, it’s time to “assemble” everything:

import fact from './fact.wat';

WebAssembly.instantiate(fact)
  .then(module => {
    let factorial = module.instance.exports.fact(10);
    // 3628800
  });

const factorial = (n) => {
  if (n === 0) {
    return 1;
  }
  return n * factorial(n - 1);
};

const nativeFactorial = factorial(10);
// 3628800

Yes, the result is the same: 3628800. Woohoo!

Probably, you are familiar with the second half of this block; it’s just a really simple function recursing to compute the factorial of the given parameter.

The first half, though, is the interesting part:

import fact from './fact.wat';

The import is quite explicit: it just creates a variable holding the code I mentioned previously (a Buffer).
In the next few lines, you can see where the “magic happens”.

WebAssembly.instantiate(fact)
  .then(module => {
    let factorial = module.instance.exports.fact(10);
    // 3628800
  });

WebAssembly is the agreed standard name for the whole technology, and also the name of the global object exposing its JavaScript API. In this case, I am just using the instantiate method exposed by WebAssembly to compile the given bytes into a WebAssembly.Module and return a WebAssembly.Instance. The whole process is executed asynchronously.

Once the module has been resolved, it’s possible to access the instance and the related exported methods by module.instance.exports.

If you go back to the WAT file shown earlier, you will notice the export defining the method to expose, with the related name (“fact” in our case).

I went further and integrated a couple of dependencies to help me track performance (skipping initialization and loading times). As mentioned at the beginning of the article, the WebAssembly block performed way better than the classic approach.
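For reference, here is a minimal timing sketch in the spirit of what I did; the helper below is hypothetical (not one of the actual dependencies I used), and real benchmarking needs warm-up runs, statistics, and so on:

```javascript
// Naive timing helper: run fn `iterations` times and return elapsed milliseconds.
// performance.now() is available in browsers and in modern Node.js.
const time = (label, fn, iterations = 100000) => {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  const elapsed = performance.now() - start;
  console.log(`${label}: ${elapsed.toFixed(2)}ms`);
  return elapsed;
};

const jsFact = (n) => (n < 1 ? 1 : n * jsFact(n - 1));

time('js fact(10)', () => jsFact(10));
// In the real setup, the same loop wraps module.instance.exports.fact(10)
// and the two elapsed times are compared.
```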

My purpose with this article was not (and is not) really to compare factorial algorithms or do deep performance analysis, but to highlight a topic which looks promising for the JavaScript environment.

In our environment, we should start relying on technologies like this to push performance and quality at runtime, improving the end-user experience whenever possible. We often struggle with these challenges, and something handy is always appreciated.

In conclusion, whether you are interested in it or not, I would definitely suggest you at least give it a try and build your own opinion around it 😊 🤙

