Don’t Look Back! a JS13KGAMES 2019 postmortem

Jerome Lecomte
12 min read · Oct 7, 2019


For the fourth year in a row, I participated in JS13KGAMES, the month-long competition to build a game in under 13 kB of HTML/CSS/JavaScript that runs from August 13th to September 13th.

This year I made Don’t Look Back!, a WebVR game of cowboys and chickens built with A-Frame and inspired by carnival shooting galleries and my love for pixelart. It placed 8th in the WebXR category and 14th in Web Monetization, earning me another JS13KGAMES t-shirt to continue the series started in 2016.

Play it on JS13KGAMES and have a look at my past postmortems for Blade Gunner (2016), A Tourist In Paris (2017) and SUBmersible WARship 2063 (2018).


This year, I challenged myself to enter the WebXR category, whose top prizes were 3 Oculus Quest headsets offered by Mozilla. The theme was “back” and it… did not inspire me at all, which was a big problem: I normally never start coding until I have a clear idea of a gameplay concept and a visual style that fit the theme.

“Back” evoked travel, such as “There and Back Again” from J.R.R. Tolkien’s The Hobbit and the kids’ song “The Cat Came Back” by musician Fred Penner, both of which would have required the creation of a lot of content to be interesting.

It also reminded me of the ghosts in Super Mario World’s haunted mansions, which stay still while you face them and only move when you turn your back on them. However, that gameplay seemed impractical for a WebVR experience.

Then I resolved to forget about the theme and instead build a game that interested me: a WebVR version of Starfox on Super Nintendo. At that point half of the compo had already passed, and I severely doubted I could pull off something worthy of the top prizes.

So I did something I had never done before and started experimenting with one of the WebVR frameworks to see what they were capable of, and what I could reasonably hope to achieve in less than 14 days.

Starting from the Hello World of WebVR, I was never able to get the controller of an Oculus Go headset a friend had lent me to work, so I limited myself to gaze-based interactions. I realized that shooting Starfox ships by looking exactly at them would be challenging: by the time your laser bolt reached them, they’d have already moved on, so firing would need to trigger when looking around them. Static targets it would be, then!
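For reference, this is roughly what a gaze-based setup looks like in A-Frame (a sketch following the cursor component’s documentation, not my actual markup): with fuse enabled, a click event fires after the player has stared at an entity for fuseTimeout milliseconds.

```html
<a-entity camera look-controls>
  <!-- a small ring reticle that "clicks" whatever it dwells on -->
  <a-entity cursor="fuse: true; fuseTimeout: 500"
            position="0 0 -1"
            geometry="primitive: ring; radiusInner: 0.01; radiusOuter: 0.02"
            material="color: white; shader: flat"></a-entity>
</a-entity>
```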

Next I toyed with animation, and got a target to fold down and spring back the way carnival shooting gallery targets do. Since this alone would not be very challenging, I added a layer of the classic Lights Out-style logic puzzle, where you have to turn off all the lights, but switching some off turns others back on.
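The linked-target mechanic boils down to a few lines. Here is a sketch in plain JavaScript (hypothetical names and linkage, not the actual game code):

```javascript
// Shooting a target folds it down, but also springs its linked
// targets back up, Lights Out-style.
const targets = [
  { up: true, links: [1] }, // shooting target 0 revives target 1
  { up: true, links: [2] }, // shooting target 1 revives target 2
  { up: true, links: [] },  // target 2 revives nothing
];

function shoot(i) {
  if (!targets[i].up) return;     // already down, nothing happens
  targets[i].up = false;          // fold the shot target down
  for (const j of targets[i].links) {
    targets[j].up = true;         // linked targets spring back up
  }
}

const won = () => targets.every((t) => !t.up);
```

With this linkage, shooting in the wrong order revives targets you already took down, but the chain has no cycle, so the puzzle stays solvable.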

Art direction & gameplay

One 3D pitfall I wanted to avoid was hyper-realism, as I could not hope to produce high-polycount models or high-quality textures in 13 kB. Instead I opted for very simple shapes provided by the WebVR framework, mixed with flat blocky sprites leveraging my pixelart skills.

I chose a Western setting with rich, vivid colors, in honor of fond memories of Capcom’s Gun.Smoke and Konami’s arcade game Sunset Riders. Rocks were made of icosahedrons, and cacti of stacked cylinders. Outlaw and chicken sprites were designed in PyxelEdit using the DawnBringer 32-color palette.

A game of cowboys and chickens…

I had envisioned sprite animations and cluck-cluck-cluck sounds when you shoot the chickens (think The Legend of Zelda: A Link to the Past), as well as a 3-part soundtrack (calm guitar music while you look around, intense banjo music when the outlaws surround you, and celebratory music once you’ve won the game), but I ran out of time to design and compose all these assets and tracks.

When it came to naming my game, I took inspiration from a 2017 JS13KGAMES entry that I really enjoyed: Just go straight by Pierre Gimond. The title says exactly what the player has to do to win, but the game cleverly bets on the player wandering off course rather than sticking to the instructions.

Could it really be that simple?!?

So I called my game Don’t Look Back, betting that the player would immediately do what they were instructed not to. The consequence would justify the premise of the game, and reveal the outlaws the player has to defeat.

To be fair, I did warn you…

Although outlaws are the main antagonists, I chose to start the game with 3 chickens to provide a safe playground where the player could get familiar with the VR environment and practice the game mechanics on a smaller version of the main puzzle.

Technical choice

The WebXR category lets you choose between 3 frameworks (which don’t count against your 13 kB budget): A-Frame by Mozilla, BabylonJS by Microsoft and Three.js.

BabylonJS and Three.js both advertise themselves as 3D scene graph frameworks built on top of WebGL. The helpers they provide to simplify access to the WebXR API are only briefly mentioned in their documentation.

On the contrary, A-Frame touts itself as an XR framework first, a fact quickly apparent throughout its excellent documentation, which made it my top choice for the competition.

A-Frame is built upon WebXR and Three.js, and features at its core a solid Entity-Component-System.

In an ECS, each of your game actors (player, NPCs, enemies, buildings and other level elements) is represented by an Entity. Entities don’t do much until you apply Components to them. Components represent properties of your game world, such as position, geometry, material, AI strategy and the like. At every frame, Systems, which represent your gameplay rules, loop over every Entity holding the Components they’re interested in, and update these properties according to the game rules (e.g. use the velocity Component to update the position Component).
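As a concrete illustration, here is a toy ECS in plain JavaScript (a framework-free sketch, not A-Frame’s actual internals):

```javascript
// Entities are plain bags of components; a system updates every
// entity that holds the components it cares about.
const entities = [
  { position: { x: 0, y: 0 }, velocity: { x: 1, y: 2 } }, // a moving actor
  { position: { x: 5, y: 5 } },                           // a static prop
];

// The "physics" system: uses the velocity component to update position.
function velocitySystem(entities, dt) {
  for (const e of entities) {
    if (!e.position || !e.velocity) continue; // skip entities lacking either
    e.position.x += e.velocity.x * dt;
    e.position.y += e.velocity.y * dt;
  }
}

velocitySystem(entities, 1); // one frame with dt = 1 second
```

The static prop is left untouched because it has no velocity component; adding one later would make the same system move it, without changing any code.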

ECS is a pattern I have been progressively adopting in my past JS13KGAMES entries, and shares some parallels with React, which I use extensively in my day job at Shutterstock Custom. This facilitated my ramp up on this new framework.

I implemented 2 custom components:

  • One for the <a-scene> to keep track of the 3 game states (title screen, game screen and end screen) and add/position the right entities for the current state.
  • One for target <a-entity> elements, to set their sprite, handle their links to other targets, and animate them up and down. In retrospect, it should have been split into 3 separate components to manage each of these responsibilities.
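That split could look something like this (component names and schemas are hypothetical, sketched with A-Frame’s registerComponent API; a stub stands in for the AFRAME global so the sketch can run outside a browser):

```javascript
// Use the real AFRAME global in a browser, otherwise a minimal stub.
const AFRAME = typeof window !== "undefined" && window.AFRAME
  ? window.AFRAME
  : { components: {}, registerComponent(name, def) { this.components[name] = def; } };

// Responsibility 1: which sprite texture the target displays.
AFRAME.registerComponent("target-sprite", {
  schema: { src: { type: "string", default: "" } },
  init() { /* would set the entity's material from this.data.src */ },
});

// Responsibility 2: the Lights Out-style links to other targets.
AFRAME.registerComponent("target-link", {
  schema: { linked: { type: "array", default: [] } },
  init() { /* would listen for "shot" and re-raise the linked targets */ },
});

// Responsibility 3: the fold-down / spring-up animation.
AFRAME.registerComponent("target-flip", {
  init() { /* would animate the entity's rotation between up and down */ },
});
```

Each component then has a single reason to change, and a chicken or an outlaw is just an entity mixing the three.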

Overall, I used 1,503 bytes for textures in 3 files, 11,597 bytes of HTML for the VR scene and its entities, and 9,956 bytes of JS for the 2 custom components (comments included). Once every resource was optimized and inlined into the main index.html file, the zipped submission bundle only took 4,092 bytes of the 13 kB budget.

Lessons learned

VR Field of View (FOV)

The field of view is a measure of peripheral vision. Humans have a FOV of roughly 210°. VR headsets have a much narrower FOV, typically around 110°, giving the impression of looking at a 3D scene through a scuba diving mask.

Due to my laptop’s wide screen, I had a tendency to position the outlaws too far apart from each other. As a result, through the VR headset, dead outlaws would spring back to life off-screen, depriving the player of a very important visual cue about the game mechanic and leading them to conclude that the game spawned new enemies at random. I had also placed the HUD’s timer and shot counter at the edge of the FOV, making them almost illegible.

I came up with a hack to avoid lengthy back-and-forths between my laptop and the VR headset to verify the entities’ position in the scene: I attached a ring entity to my camera, and calibrated its radius to match the edge of the VR headset’s FOV by trial-and-error.

<!-- about 1.2m wide if positioned 1m in front of the camera
     for Google Cardboard and Oculus Go -->
<a-ring position="0 0 -1" radius-inner="0.6" radius-outer="0.61"></a-ring>

This allowed me to quickly see, from my laptop, which entities would be in focus at any time, and whether the HUD was legible.

FOV, not POV

Another gotcha is that A-Frame positions the camera a bit lower when entering VR mode on a VR headset than on desktop. I had to tweak the height of the game title and the text behind the player a bit, to avoid having the player stare up to read them. It’s probably possible to override the default camera position on desktop so that it matches the headset’s, leading to a consistent experience.

Build script

I expect 2 things of my build script:

  • first, watch for any file change and automatically hot-reload the game in my browser
  • second, minify my code with RollupJS + Terser, inline all my JS, CSS & images in the main HTML file, create the zipped submission bundle and report how much of the 13 kB budget is left.

I’ve moved away from grunt & gulp because they rely on wrapper modules (e.g. gulp-rollup, gulp-terser, gulp-browser-sync) which usually trail a couple of versions behind the modules they wrap. Last year I wrote a custom JS script calling those tools directly via their JS API. It did the trick but had 2 shortcomings: Rollup only watched for JS changes (all HTML, CSS or image changes were ignored) and the script was difficult to extend.

This year, Florent Cailhol shared on Slack a set of npm scripts which achieve something very similar. I added extra npm targets that addressed the shortcomings of last year.

"devDependencies": {
  "browser-sync": "^2.26.7",
  "chokidar-cli": "^2.0.0",
  "html-inline": "1.2.0",
  "html-minifier": "4.0.0",
  "npm-run-all": "4.1.5",
  "rollup": "1.20.1",
  "rollup-plugin-terser": "5.1.1",
  "terser": "4.2.0"
},
"scripts": {
  "clean": "rm -rf dist && mkdir dist",
  "build": "run-s clean build:*",
  "build:img": "cp src/img/*.png dist",
  "build:js": "rollup -c --environment MINIFY",
  "build:html": "html-inline -b dist -i src/index.html | html-minifier -c htmlmin.json -o dist/index.html",
  "build:zip": "zip -FS -qjX9 dist/game.zip dist/index.html && advzip -z -4 dist/game.zip",
  "dev": "npm-run-all clean --parallel dev:*",
  "dev:js": "rollup -c -w --environment DEBUG",
  "dev:html_img": "cp src/index.html src/img/*.png dist",
  "dev:lib": "cp src/lib/* dist",
  "dev:serve": "browser-sync start --server dist --files dist --host --https",
  "dev:watch": "chokidar src/index.html -d 0 -c 'npm run dev:html_img'"
}

“npm run dev” is used to serve the files locally during development, and “npm run build” to create the zipped submission bundle.

“dev” and “build” both start by wiping clean the “dist” directory. “dev:js” uses Rollup to combine all JS files into 1 bundle and watch for any changes to the source JS files to update the bundle again in “dist”. “dev:html_img” and “dev:lib” copy all images, HTML pages and 3rd party JS libraries (such as A-Frame or Soundbox) into “dist”. “dev:watch” uses chokidar to watch for changes on the main HTML file to trigger the “dev:html_img” target. “dev:serve” uses browser-sync to serve the content of the “dist” directory locally, and hot-reload the browser for any changes to files in “dist”.

“build:img”, “build:js” and “build:html” also transfer the source files into the “dist” directory but apply extra transformations: Rollup is configured to run the bundle through Terser for code minification. Html-inline, well, inlines the content of every JS, CSS and image file referenced in the main HTML file (images are turned into base64-encoded data URIs). Finally, “build:zip” compresses the main HTML file (which at this point contains all the images, CSS and JS code it needs), and calls “advzip” (part of the AdvanceComp tool suite) to optimize the zip even further, reporting its size in bytes in the process.
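The Rollup side of this can be sketched as follows (an illustration of the conditional Terser setup described above, not the actual project file; paths are assumptions):

```javascript
// rollup.config.js
import { terser } from "rollup-plugin-terser";

export default {
  input: "src/js/main.js",
  output: { file: "dist/main.js", format: "iife" },
  // Only minify when `rollup --environment MINIFY` sets the flag,
  // so the DEBUG dev build stays readable.
  plugins: process.env.MINIFY ? [terser()] : [],
};
```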

To me, this is the perfect build script for modern web game development, and I don’t foresee it changing next year.

Things I would do differently

Puzzle design

Once my game mechanic linking 2 targets together (so shooting one would raise the other again) was done, I had to come up with the actual series of linkages that would compose the puzzle. They had to be non-trivial, contain no infinite loop, and still work with 2 extra targets for Coil subscribers.

I split the 12 outlaw targets into 4 groups of 3. Each group was the same as the practice puzzle with chickens. 2 groups had one of their targets linked to one of the 2 remaining groups. The extra outlaw targets for Coil subscribers were linked to the group they were closest to, and one to a group behind the player’s back.

An arrow indicates which other targets get revived when a target is shot. Extra targets for Coil subscribers are marked by an extra circle.

This turned out not to be very challenging, as you can win the game in 2 revolutions in either direction.

If you know good resources on how to create balanced logic puzzles, please leave me a comment; I would love to get better at this.

Still no content editor

For 2 years in a row, I have made games relying on curated content rather than procedurally generated levels. I positioned the entities in my scene manually, hand-picking coordinates in a lengthy trial-and-error process.

Yet I failed to take advantage of the Web Inspector that comes with A-Frame, which lets you apply new Components to the entities of your scene and change their properties on the fly.

A-Frame’s Web Inspector

Composing music is hard

Last year I was lucky to team up with a musician who composed the score of Submersible Warship 2063 on Soundbox.

This year, I envisioned having 2 tracks: a slow, Western guitar tune welcoming the player while they practiced on the chickens and looked around, followed by a fast paced one during the showdown.

I tried my hand at Soundbox, but lacking the necessary understanding of how music is synthesized on a computer, I wasn’t able to produce a sound that even remotely resembled a guitar. Already running short of time, I decided pretty quickly to scrap the music score and focus on polishing the rest of the game for submission.

I have come across interesting resources on synths and will practice with trackers ahead of next year.


I find WebXR is a category at odds with JS13KGAMES.

For starters, you almost need to own the headsets given as top prizes to be able to enter the competition properly. Sure, you can do most of the coding on a computer, but you won’t be able to truly experience VR mode until you put on a headset (for example, to catch problems with scene placement due to the narrow field of view). A cheap alternative is the Google Cardboard, but you’re still limited to gaze interaction (a 3-degrees-of-freedom device). If your gameplay requires pointing at a scene element (3-degrees-of-freedom controller) or reaching out and grabbing a scene element (6-degrees-of-freedom controller), there is no substitute for an Oculus Go/Quest rig to test your game on.

Therefore, what headsets you already own or can afford to buy greatly limits what game you can make. Similarly, it also limits who can play your game. Some of the other participants’ entries mandated 6-degrees-of-freedom controllers, and I was unable to try them out (when starting the game required picking up an element of the scene) or experience them properly (when half the interactions were out of reach, pun intended).

Finally, each of the 3 WebXR frameworks gives you at the very least a WebGL rendering engine, and in the case of A-Frame a game loop complete with an Entity-Component-System, out of the box and without taking a single byte out of your 13 kB budget. This leaves so much space for levels and gameplay that it almost takes away the most fun aspect of the competition.

That being said, I’m very happy this category exists. It was an excellent excuse to give WebXR a try, and I had great fun making Don’t Look Back! See you next year…



Jerome Lecomte

Software engineer turned manager and father of four, I create pixel art, video games and visual experiments on the Web.