Emukit 2: immersive PSX+N64 VR on the web

TLDR: We got immersive PS1 working in Javascript + WebXR + WebAssembly with Exokit. More N64VR games too. Video and code at end of post.

So… with Emukit 2… PlayStation 1 VR is a thing

There’s also a truckload of N64 VR performance and compatibility improvements in the Emukit 2 refresh, so a lot more of your favorite games work in mixed reality:

Perfect Dark 64 VR

Star Fox 64 VR

This post is the how and why of the Exokit dev team and me porting yesterday’s games to web mixed reality. But… if you just want the code or the video, it’s at the end of the post.

Why Emukit?

The point of Exokit (the “post-screen web browser”) is to write a mixed reality web browser in Javascript.
We want to plug the best language/framework in the world — Javascript — into a future where there are no 2D screens.

In 10 years

If we suspend disbelief for a second, imagine a world in which every developer uses her favorite language to code her app. Hopefully that’s Javascript, but maybe it’s not, in which case she builds to WebAssembly.

Her code runs automagically on any VR, AR, or neural implant device.

I have no idea how the tech will evolve — which is why Exokit doesn’t place any bets. Exokit is written in Javascript and integrates with whatever mixed reality API you want:

  • Magic Leap
  • OpenVR (Steam VR)
  • Oculus VR
  • Leap Motion Orion
  • Desktop
  • ARKit+ARCore (in future)

An emulator for the web

What’s the piece of software that wires everything together? It’ll certainly need to be doing a lot of emulation.

The piece of software gluing all of this to Javascript would have to take in data from disparate APIs:

  • outside-in VR (sensors shoot to the device)
  • inside-out AR (device shoots to the world)
  • Magic Leap/AR SLAM (localizing in a space without prior info)
  • Leap Motion hands detection machine learning (computing geometry of a thing from a raster image)
  • and so much more

This Javascript layer would also need to work the other way, converting JS things before presenting them to the hardware. For example:

  • take an ArrayBuffer and submit it to the GPU
  • map a <canvas> to a head-mounted display
  • negotiate and configure the HMD
  • merge multiple sites/apps into one coherent reality
  • coordinate frame-perfect timing with requestAnimationFrame
  • simulate input events for testing/botting
  • network and multiplayer
  • and again so much more

And this middle layer has to wrap everything under a common API (WebXR) that applications can use to construct a layer of reality and present it to the user.
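To make one of those jobs concrete, here is a toy sketch of merging several sites’ requestAnimationFrame loops into a single frame tick. All the names here are mine for illustration, not Exokit’s real API:

```javascript
// Toy sketch of a compositor that merges multiple "site" frame
// callbacks into one frame tick, the way a browser engine must
// before presenting a single coherent reality to the HMD.
// All names here are illustrative, not Exokit's real API.
class Compositor {
  constructor() {
    this.sites = [];
  }
  addSite(name) {
    const site = { name, callbacks: [] };
    // Each site gets its own requestAnimationFrame implementation.
    site.requestAnimationFrame = cb => site.callbacks.push(cb);
    this.sites.push(site);
    return site;
  }
  // One hardware vsync tick: run every site's queued callbacks
  // with the same timestamp; the resulting layers would then be
  // composited and submitted to the device.
  tick(timestamp) {
    for (const site of this.sites) {
      const callbacks = site.callbacks;
      site.callbacks = []; // callbacks re-register themselves, like real rAF
      for (const cb of callbacks) cb(timestamp);
    }
  }
}
```

The real engine also has to composite the resulting layers and hit the frame budget; this only shows the scheduling half.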

If this sounds more like an emulator than a browser, I don’t blame you. We call it a browser, but Exokit is actually a 3D emulator for the web.

When I started Emukit (a mixed reality console emulator that runs on Exokit), it was just another step towards taking everything and putting it into web mixed reality.

Also, I suck at making games. Maybe I suck a bit less at plugging the plethora of awesome games that defined my childhood into the matrix (THREE.Matrix4).

How I did PS1 VR

Making the PlayStation (one, not four!) work in VR was not without its challenges. Not the least of which is that the PlayStation is actually a 2D machine.

Yes, the PlayStation is a 2D machine.

You could probably find the place in the Twitch livestreams where, after spending a couple of weeks on the problem, I finally cracked open the documentation and realized what I was doing might be impossible. Oops!

Obviously we cracked the impossible nut.
Walk with me, down history street and graphics avenue.

At the turn of the dimensions

As I said, the PlayStation GPU is actually completely 2D. It can draw some sweet polygons, gradients, Gouraud shading, and even crappy integer-aligned textures. But… it does it all in X and Y. The PlayStation GPU is literally just a megabyte of 2D pixels and nothing else.

This is why you hear that the PSX had no Z buffer. The N64 did, and it’s much closer to what we think of as 3D graphics today.

But the PSX had one foot in the SNES and one foot in the Xbox. It couldn’t decide what it was, so it was both.

So how did we get “3D graphics” on the PSX, and what sorcery is powering Emukit? The answer is an epic hack: the Geometry Transform Engine, or GTE.

Since we’ve established that the PSX GPU did not do 3D math, the GTE was the coprocessor on the PSX that did the job. It couldn’t draw anything, but it could do 3D vector and matrix math way faster than the PSX CPU, and in parallel with it.

By the way, the PSX CPU really sucked: it was a third as fast as the N64’s, at about 33 MHz.
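To make the GTE’s job concrete, here is a float sketch in JavaScript of the transform it performs. The real chip works in 16-bit fixed-point registers, and `screenDistance` is my stand-in for its projection-plane distance, with an illustrative default:

```javascript
// Sketch of the GTE's core job: project a 3D vertex to a 2D screen
// point. The real chip works in fixed point; floats here for clarity.
// screenDistance stands in for the projection plane distance; the
// default of 160 is just an illustrative value.
function gteProject(v, rotation, translation, screenDistance = 160) {
  // rotation is a row-major 3x3 matrix, translation a [tx, ty, tz].
  const x = rotation[0]*v[0] + rotation[1]*v[1] + rotation[2]*v[2] + translation[0];
  const y = rotation[3]*v[0] + rotation[4]*v[1] + rotation[5]*v[2] + translation[1];
  const z = rotation[6]*v[0] + rotation[7]*v[1] + rotation[8]*v[2] + translation[2];
  // Perspective divide: farther points land closer to screen center.
  return [
    Math.round(screenDistance * x / z),
    Math.round(screenDistance * y / z),
  ];
}
```

Note that only 2D integers come out the other end; the 3D inputs are thrown away, which is exactly the problem the next section’s hack solves.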

PGXP your PSX for WASM WebXR in HMD

Thankfully the problem of adding a third dimension to the PlayStation graphics stack is not a new one. The PSX’s suckage in handling perspective and texture mapping is an old meme. Which means that smarter haxors than I have already haxed it.

The solution is a technique called PGXP, or the Parallel Geometry Transform Pipeline. Essentially this adds another layer of hax to the epic hack that is the GTE.

So imagine we are a PlayStation game. We load our geometry from CD (excruciating load time included at no charge). We submit our geometry vec3’s to the GTE for transforming by our camera’s Model/View/Projection matrix. We get back 2D points, which we submit as commands to our crippled 2D GPU.

Can you see how to get 3D rendering out of this?

That’s right: we slap in an extra cache for the vertex data coming into the GTE.

When the GTE takes in a point we don’t just output the 2D result for the GPU; we also cache the 3D vertices we were using, based on the X/Y coordinate of the output.

With this mutated monster of a cache — which did not exist on the real PlayStation — we can reach out from the GPU to peek at the original 3D vertices when it’s time to draw. And just like that, we can kinda-sorta transform the 2D GPU into an _actual modern GPU_ by taking this now-3D data and feeding it into a vertex/fragment shader pipeline.
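In JavaScript terms, the trick boils down to something like this. This is a sketch of the idea, not PGXP’s actual code:

```javascript
// Sketch of the PGXP idea: when the GTE projects a vertex, also cache
// the original 3D position keyed by the 2D result. When the GPU later
// receives that 2D point in a draw command, it can look the 3D vertex
// back up and render with real perspective. Not PGXP's actual code.
const vertexCache = new Map();

function gteTransform(vertex3d, project) {
  const [sx, sy] = project(vertex3d);        // the GTE's normal 2D output
  vertexCache.set(`${sx},${sy}`, vertex3d);  // the extra PGXP cache write
  return [sx, sy];
}

function gpuLookup(sx, sy) {
  // At draw time, recover the 3D vertex the 2D point came from;
  // fall back to flat 2D (z = 0) on a cache miss, like the real hardware.
  return vertexCache.get(`${sx},${sy}`) || [sx, sy, 0];
}
```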

Once I got 3D points coming into the GPU it was pretty straightforward to add the standard MR-ification hacks from Emukit’s N64VR implementation.

That is, we use the fact that Exokit is just Javascript to intercept the GL draw calls and double them up, using the two different eye matrices and viewports coming in from WebXR.
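A minimal sketch of that doubling, assuming hypothetical eye objects carrying the viewport and view-projection matrix that WebXR hands us (`mvpLocation` and the eye shapes are illustrative, not Emukit’s real code):

```javascript
// Sketch of stereo-doubling an emulator's draw calls. We wrap
// gl.drawElements so each emulator draw happens twice, once per eye,
// with that eye's viewport and view-projection matrix from WebXR.
// mvpLocation and the eye objects are illustrative, not Emukit's API.
function stereoify(gl, mvpLocation, leftEye, rightEye) {
  const realDrawElements = gl.drawElements.bind(gl);
  gl.drawElements = (mode, count, type, offset) => {
    for (const eye of [leftEye, rightEye]) {
      gl.viewport(...eye.viewport);                        // e.g. left/right half
      gl.uniformMatrix4fv(mvpLocation, false, eye.matrix); // that eye's MVP
      realDrawElements(mode, count, type, offset);
    }
  };
}
```

Because Exokit is Javascript all the way down, this kind of interception is just reassigning a method on the context object.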

Literally rewriting OpenGL

The above isn’t what took me so long with Emukit 2. It took so long because getting things working involved rewriting the PlayStation GPU three times, and OpenGL once.

First, a lesson in WebGL.

WebGL is just a Javascript binding for OpenGL (ES). In fact that is _literally_ how Exokit implements it.

And WebGL2 is very close to the full-blown OpenGL most applications — including games — use. The main omissions are the things that would be rough for Javascript, like memory-mapped buffers.

What this means is that you can use Emscripten to compile 3D applications to WebAssembly + WebGL2 + Javascript and it often just works. That’s the essence of VRifying the N64, and now the PSX.

The bumps I ran into were twofold, with the emulators and Exokit pointing in opposite directions of history:

  • The PlayStation emulators were using legacy OpenGL as old as the PlayStation
  • Exokit didn’t support the new WebGL2 endpoints used by Emscripten

To fix the former I ended up rewriting the legacy OpenGL implementation to use a modern matrix, shader, buffer, and drawcall pipeline. Thank Zeus (the god) for Khronos (the consortium)!

The PlayStation OpenGL plugins are often complained about for being outdated, so I’ll look into upstreaming this work if it can be done easily.

To fix the latter problem of Exokit not supporting the WebGL2 APIs that Emscripten was using, I simply cracked open MDN for a day (props to Mozilla for fighting the good fight documenting the Web APIs!). And thus I implemented WebGL2 in Exokit. This wasn’t hard or complicated at all, since Exokit is written in Javascript.

The cool thing about the new WebGL2 APIs is that they’re “garbage-free”, meaning they take extra offset arguments so you don’t need to construct new typed array views when calling WebGL.

Since calling WebGL is more than likely the hotspot of your 90FPS Javascript mixed reality code, this means using WebGL2 APIs can take a meaningful load off of the garbage collector.
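Concretely, the WebGL2 overloads let you pass one long-lived typed array plus offsets, where the WebGL1-era call forces you to allocate a fresh view every time:

```javascript
// The WebGL1-era call allocates a new typed-array view every frame;
// the WebGL2 overload of bufferSubData takes srcOffset/length
// arguments instead, so nothing new is handed to the garbage collector.
const scratch = new Float32Array(65536); // one long-lived staging buffer

function uploadWebGL1(gl, start, count) {
  // Allocates a garbage view object on every call:
  gl.bufferSubData(gl.ARRAY_BUFFER, 0, scratch.subarray(start, start + count));
}

function uploadWebGL2(gl, start, count) {
  // WebGL2 overload: same data, zero allocations:
  gl.bufferSubData(gl.ARRAY_BUFFER, 0, scratch, start, count);
}
```

Across thousands of calls per frame at 90FPS, those skipped allocations add up.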

That’s precisely why Emscripten wants to use these new APIs, and why I’m thankful for the stack traces it threw at me. Now that we have WebGL2 in Exokit, everyone wins. And Emukit got marginally faster, even for N64VR.

Speaking of…

Back to bringing the past to the future

For Emukit 2 I wanted to do a little extra as thanks to everyone supporting the project. That meant getting some of the N64 fan favorites booting and running faster in Emukit.

To that end, I rewrote the N64VR core to use the new unified MRification pipeline I wrote for PSXVR. If you recall from the last post, we were previously hijacking the GL calls, shaders, and matrices coming out of the emulator, but the emulator itself was basically the stock one from RetroArch, inheriting its compatibility problems.

So I updated and recompiled the N64 emulator from source, backporting some much-needed coprocessor fixes that were getting in the way of the more complicated games booting.

In addition to favorites like OOT/Majora and Star Fox 64, Perfect Dark works now. Multiplayer/Combat Simulator too!

I also added the Model-View-Projection matrix pipeline to the emulator itself, and baked in the perspective unfolding code to the shader so it runs on the GPU. What this means is that everything should be running smoother with less Javascript garbage generation.

Some games I tested:

Nintendo 64

  • Super Mario 64
  • Star Fox 64
  • Perfect Dark
  • Ocarina of Time
  • Majora’s Mask

PlayStation

  • Metal Gear Solid
  • Crash Bandicoot
  • Final Fantasy 7
  • Spyro the Dragon

Amazingly, I didn’t have to change the matrices for any of these. This is using the scaling constants that we established with Ocarina of Time and it all just works.

I also updated the N64VR controls to work better for games that use the full N64 trident, especially shooters.

What’s next?

With immersive PlayStation and Nintendo 64 in JavaScript, there’s a lot we could do!
Emukit is our way of exploring what we can do with a mixed reality web browser engine written in JavaScript.

Some ideas we’ve been hearing:

  • Add a multiplayer layer with WebSockets, WebRTC, or invent something better like WebUDP
  • Integrate natively with Unity and other engines
  • Componentize Emukit into an entity-component system like A-Frame
  • Run Emukit games as immersive AR portals in Magic Leap One
  • Hack the ROMs with a tad of JS to integrate new control schemes, like Leap Motion

Some of the above may or may not be in progress. I hope to share more soon. ;)

If you have other ideas, the dev team and I are on Twitch, Twitter, and Discord. Come say hi!

Show me the VR

I made a video walkthrough of Emukit 2, including how to get started. It also talks about the hax.

Show me the code

Both Exokit (the browser engine) and Emukit (the emulator site) are open source on GitHub.

Issues help. PRs are a blessing!

❤ from Avaer