Learning Fast, Failing Forward

My First Steps in Magic Leap’s Independent Creator Program

Apr 4

An email with “URGENT” in the subject line was waiting in my inbox one Saturday morning. I was surprised to see that it had been sitting there since early Friday afternoon. I was more surprised to see that it was from Magic Leap, a company that recently released its first augmented reality device.

And then I realized what the email was: a letter of acceptance into Magic Leap’s inaugural Independent Creator Program. The letter requested more information from me and outlined some details — that there had been over 6,000 proposals submitted for the program, that mine was one of the few they had selected, and that I would receive some hardware, grant money, and other assistance from Magic Leap as part of the program.

When the email arrived, it had been a few months since I had submitted my project proposal: to bring my virtual reality app, called EXA: The Infinite Instrument, into augmented reality. Using EXA, players can make music, design custom instruments, record their performances, and create virtual bands. And most recently, EXA’s new multiplayer system allows players to do all of those things together with other players.

About three weeks ago, with the initial discussions and paperwork completed, Magic Leap publicly announced the program’s first wave of about thirty “Creators”. After keeping quiet about the program, I was excited to finally share the news: my one-man software company, Aesthetic Interactive, was one of those Creators. I had also received my Magic Leap One device, and was ready to dive into development.

(A note on timing: I’m publishing a couple weeks late. The announcement from Magic Leap was on February 27, and I wrote this article on March 17.)

First Milestone: an EXA band playing in augmented reality. Filmed on a Magic Leap One device.

The Journey Begins

These past few weeks have been quite a whirlwind — flying across the country on a last-minute trip, navigating a sea of documentation, and scaling a mountain of technical challenges.

The difficulties came from all sides: a new device, new tools, new workflows, an unfamiliar operating system, and EXA’s technical requirements. My “getting started” challenges were compounded by various factors — travel, splitting work between computers, a few stubborn issues, a few tangents, and the need to also create a native C++ audio plugin (which is outside the scope of most Unity projects). Stumbling over one obstacle after another, my initial journey on this project hasn’t been pretty.

Not long ago, I faced a similar situation with my EXA multiplayer work. I was getting frustrated, and when my kids asked why, I told them I was getting tired of failing over and over again. They looked concerned, so I decided to change my tune. Sometimes we try to do difficult things, and make lots of mistakes, I told them. But every time we fail, we can learn a little more, and get a little closer to the goal. What’s important is to keep failing forward.

That seemed to resonate with them, and the phrase has stuck with me, too. It also happens to describe my initial journey with Magic Leap development quite well: learning fast and failing forward, all the way to my first milestone.


The first phase of the Creator Program was to attend a “bootcamp” event hosted by Magic Leap. The event was optional, but it made sense to go — I would meet the program’s leaders, get to know the other teams, receive direct help from Magic Leap engineers, watch some presentations, and generally jump-start my project.

A miscommunication about bootcamp locations required me to make some last-minute travel arrangements, and I was soon on a flight from Michigan to California. Even though I’m not a big fan of traveling, and even though I booked a hotel room with a barely-usable shower, the trip was definitely worth it.

Bootcamp packed quite a bit into two-and-a-half days. We had time to work on our projects, with Magic Leap staff on-hand to answer questions about anything from compiler errors to interaction design. Each team had an extended technical discussion with Magic Leap engineers, tailored to our project’s specific needs. We filmed optional interviews to describe our projects and goals. We got to know the other teams, learned about their projects, received an overview of the Creator Program, and attended break-out sessions that seemed relevant to our projects.

Bootcamp introduced me to a few great contacts for technical issues, a handful of projects that might have some challenges similar to mine, and of course, dozens of amazing Creators. I also learned that the program has only two solo Creators, myself included, and that the timeline for my project seems to be shorter than most. No pressure.

Because bootcamp was hosted at SVVR (Silicon Valley Virtual Reality), there were some virtual reality stations available. I installed EXA on two of them, and found a few people willing to try it out. It’s so much more interesting to show people how the app looks and feels, rather than just describing it… and even better to explore it together in multiplayer. There happened to be some musicians at bootcamp, so we also had a couple impromptu EXA jam sessions together via the local network.

Bootcamp was largely a positive experience, but was also a time of mounting frustration and concern for me. As I dove into my project for the first time, I encountered a growing array of issues. With each new obstacle I attacked, a few more would appear. Questions began to creep in. How long is this going to take? Is this even something I can solve? Working with so many new pieces, it wasn’t always clear whether issues were coming from my side, or theirs; it was tough to know whether I was even on the right track.


Before anything with my project could begin, I had to set up and configure my computers and the Magic Leap One device for development work. I decided to get this started before leaving for bootcamp, so I could focus on more important things while I was there.

Magic Leap’s documentation for getting everything set up was quite good. Still, there were many steps, including installing a custom version of Unity, and several new pieces, tools, and workflows to learn.

To further complicate things — which seems to happen quite a lot with my project — I needed to perform setup for both my PC development machine and my Mac laptop. This doubled my setup time, of course, and I had to be careful to follow some different instructions for each operating system.

I wouldn’t have bothered to set up on Mac, but I needed my laptop for the upcoming bootcamp. I prioritized the Mac setup for this reason, but also because I couldn’t get my PC to recognize the device.

For a week after returning from bootcamp, that PC issue took its toll on my efficiency. I’ve spent years refining my PC development environment, with tools and shortcuts just how I want them, everything streamlined and familiar — not so much with my Mac. I finally got tired of feeling slow and clumsy, and set aside development to fix the PC problem.

I tried performing just about every variation of starting the device, plugging into USB, opening the “remote” tool, and searching for connected devices. I finally found my answer in two separate comments, mixed among several other suggestions in this discussion thread. I deleted some Windows registry keys, uninstalled a generic-looking USB device driver, and finally, there it was: a connected Magic Leap device.

Along the way I also learned about, experimented with, and incorrectly used things like debugging tools, command line interfaces, Magic Leap plugins for Visual Studio, and new Unity configurations. For each, I gradually figured out what I needed, what I didn’t, and how to use it properly for my project. This was all a bit overwhelming at first, until I started to see how all the pieces would fit together in my development workflow.

Finding A Milestone

Once my initial setup was complete, I took a long shot with my first real build for the device. I made a full copy of my EXA project source, ripped out the support for virtual reality devices, disabled anything that wasn’t essential, dropped in the Magic Leap assets, and hit the build button.

It compiled, but it definitely didn’t work.

I was still very new to everything Magic Leap at this point, and didn’t even know how to obtain the app’s debugging logs. All I knew was: when I launched my build on the device, nothing happened at all.

That’s a tough place to start, so I went in the opposite direction. Within an hour, I had created a simple build with physics-enabled cubes floating around me, all waiting to be bopped around with the controller. The documentation and assets from Magic Leap were all clear, and making this scene was very straightforward. I spent a few minutes wishing EXA could somehow be that simple, enjoyed a few minutes playing with my augmented reality cubes, then decided to get on with it.

My first all-or-nothing approach was impractical, so I settled on a middle ground: I’d get a “watch-only” EXA performance to play on the device. To do this, I could use a tool that I had already built, called “EXA Remix”, which allows third-party Unity apps to load, customize, and play performances that were originally created using EXA.

This “watch-only” concept worked well as my first major milestone — it incorporated the app’s basic setup, many of its graphics and effects, and the loading and parsing of files (both JSON and audio data). Importantly, it would also require me to address my biggest known technical challenge: modifying EXA’s native audio system to work on Magic Leap’s Android-based operating system.


My first step toward this milestone was to figure out how to access my app’s debugging logs — development work would be hopeless without those.

A search for “debug logs” in the documentation took me on a long detour with a debugging tool called DERP. This required installing the tool, adding and configuring some related assets into my Unity project, and connecting my build to the tool over the local network. I got it working, only to realize there’s an easier way. DERP has some benefits, like sending commands into the running build, but I’d rather not have spent the time on it.

The easier way to access logs is to use the MLDB command line tool. Here are some commands [in brackets] that might be helpful:

  • [mldb log] prints logging output directly into the terminal. Use keyboard CTRL+C to exit the printing mode and return to your terminal prompt.
  • [mldb log -c] clears the logs.
  • [mldb log -c && cls && mldb log] clears the logs, clears the terminal, then starts printing again. The “&&” conditionally chains commands together on a PC terminal.
  • [mldb log > C:/path/to/log.txt] pipes the logging output to a file. This one’s my favorite, because it allows me to use an app like BareTail to search, filter, and color-code logs in real-time. Highly recommended.
  • [mldb terminate com.companyname.appname] quits the specified app, via its package ID, so you don’t have to do it within the device. This is nice for debugging, especially when there’s runaway audio playback that needs to stop. Immediately.
  • [mldb] provides a full listing of its sub-commands and options.

These commands are slightly different on Mac — the [&&] becomes [;], and the [cls] becomes [clear]. The keyboard CTRL+C stays the same, however; it does not become Command+C, as I kept trying at first.

Loading Files

With logging ready to go, I started working toward my first milestone, and immediately encountered an unexpected obstacle — I couldn’t get files to load on the device. A music app that can’t load its instruments or sounds is not very useful, so this issue became a blocker during my time at bootcamp, and for a few days after.

EXA uses a standard approach for loading files: it places the files (including pre-installed performances, instruments, and sounds) into the Unity “StreamingAssets” folder, and uses typical .NET methods for accessing them (such as FileStream and StreamReader). However, when trying to load these files on the device, I received an UnauthorizedAccessException error. Essentially, this means that the app does not have permission to load the requested file or directory.

Given that this error hasn’t occurred for EXA before, and that the files were located within EXA’s own data directories, I thought it may be an issue on Magic Leap’s side. At bootcamp, I asked the technical staff about it — we looked at some code together, tried setting various permissions, and even found a not-quite-what-I-needed workaround. The staff were all very helpful, but we couldn’t solve the issue.

After returning from bootcamp, I continued searching for clues that might help. Eventually, I realized that opening a FileStream with a FileMode.Open parameter will request both “read” and “write” access for the file by default. Providing an optional FileAccess.Read parameter, however, will request “read” access only. After everything else I’d tried, this was all I needed to do to resolve the file-loading errors.
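The underlying principle — request only the access you actually need — applies beyond .NET as well. The actual fix was the FileAccess.Read parameter described above; as a rough illustration of the same read-only-open idea in C++ (the function name here is mine, not from the EXA source), std::ifstream never requests write access:

```cpp
#include <fstream>
#include <iterator>
#include <string>

// Read a whole file while requesting read access only. std::ifstream
// opens with std::ios::in, so no write permission is ever requested —
// loosely analogous to passing FileAccess.Read to a .NET FileStream.
std::string ReadAllText(const std::string& path) {
    std::ifstream in(path, std::ios::in | std::ios::binary);
    return std::string(std::istreambuf_iterator<char>(in),
                       std::istreambuf_iterator<char>());
}
```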

Once I could access files, I also spent some time failing forward with errors related to JSON parsing. When running on the device, my app couldn’t seem to locate the simple classes that represent the JSON data structure. I learned that, because building for the Magic Leap One uses IL2CPP, the builds exclude methods and constructors that are only accessed via reflection. So, to fix the parsing errors, I needed to actually use the data class constructors in the code, as well as constructors for a few flavors of List<T>.

Native Audio

EXA’s native audio system consumed more time than all the rest, but I’m exhausted just thinking about it, so I’ll keep it short: hours and hours of tweaking audio buffer sizes and sampling rates and float-to-byte-to-int16 conversions and buffer callbacks and compilation options — all combined in various ways until each new piece would work.

After listening to more choppy, garbled, or (worst of all) silent audio than I care to remember, I got it all working. The big picture: I used the Magic Leap C-based Audio API, integrated it within PortAudio as a new audio host, and compiled it all via MABU script (similar to a “make” file) for Magic Leap’s Android-based operating system.

Rather than describing my development slog, here are some of the key things I learned along the way:

  • The equivalent to a DLL for Android is the “.so” shared library file. A static library gets an “.a” extension, but this didn’t seem to work with Unity. This was unclear to me at first, and when nothing else was working, it was helpful to know for sure that an “.so” file is what I needed to generate.
  • If a DLL was originally named “CustomAudio.dll”, the Android version should be named “libCustomAudio.so”.
  • For Windows/DLL, you use __declspec(dllexport) to make your shared library functions available for interop. For Android/SO, that won’t compile, but you can use __attribute__((visibility("default"))) instead. I use an #if defined(__ANDROID__) conditional to switch between them.
  • In Unity, for any native libraries in the project, set the DLL version to be excluded from Magic Leap builds, and the SO version to be included only in Magic Leap builds.
  • MABU scripts are tricky at first. The “INCL” list points to header directories, while the “SRCS” list points to individual source files. Those paths can be relative or absolute, and can use the “$(MLSDK)” macro for SDK paths. Use a backslash (\) at the end of a line to make multi-line sections. The combination of “USES = ml_sdk” with “SHLIBS = ml_audio” is what worked for me (not placing both in the “USES” list).
  • There are some example MABU scripts in the Magic Leap SDK folder that provide lots of useful comments.
  • To find all the available flags for the “OPTIONS” list, use [mabu --print-options] on the command line.
  • The template MABU file includes “stl/libgnustl” in its “OPTIONS” list. This caused a runtime error for me — something about how it couldn’t find “libgnustl”. Removing that option from the MABU file fixed the error, and the compiler didn’t seem to care either way.
  • Use the Magic Leap extension in Visual Studio 2017. It offers template projects for programs, shared libraries, and static libraries. Each one includes the basic project structure, a MABU file, and compilation settings. Build for the “Release ML” target in Visual Studio.
  • There’s also a Magic Leap extension for VS Code. I used this on Mac, and it seemed to work well enough. I wasn’t aware of VS Code before starting with Magic Leap — it’s like a simplified version of Visual Studio.
  • The MLAudioGetOutputStreamDefaults function, when given a particular audio format (including sample rate), provides the minimum and recommended buffer sizes. MLAudio accepts audio sample rates between 16,000 and 48,000Hz.
  • The MLAudioCreateSoundWithOutputStream function’s buffer_size parameter must be provided as a size in bytes, not as an array length. This was the source of many problems for me, which may have been avoided with a parameter name like buffer_bytes. I kept running into issues where my buffer sizes were off by a factor of two, and couldn’t figure out why. Eventually, I traced it back to this parameter. I use int16 values for my MLAudio buffers, which require two bytes per value, and so I needed to double the size provided to MLAudio.
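For reference, here is roughly what a MABU script following the notes above might look like. The project and file names are hypothetical; only the structure reflects what the bullets describe:

```
# Hypothetical MABU script for a native audio shared library.
# File and project names are illustrative, not from the EXA source.
KIND = shared

INCL = \
    include \
    $(MLSDK)/include

SRCS = \
    src/CustomAudio.cpp \
    src/PortAudioHost.cpp

USES = ml_sdk
SHLIBS = ml_audio
```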
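To make the export-macro and buffer-size points above concrete, here is a minimal C++ sketch. The macro name EXA_EXPORT and the helper function are illustrative, not taken from the actual EXA source, and the extra branch simply keeps the sketch compilable on desktop platforms other than Windows:

```cpp
#include <cstddef>
#include <cstdint>

// Cross-platform export macro: __attribute__((visibility("default")))
// for the Android (.so) build, __declspec(dllexport) for the Windows
// DLL build. (EXA_EXPORT is an illustrative name.)
#if defined(__ANDROID__)
  #define EXA_EXPORT __attribute__((visibility("default")))
#elif defined(_WIN32)
  #define EXA_EXPORT __declspec(dllexport)
#else
  #define EXA_EXPORT
#endif

// MLAudio's buffer_size parameter expects a size in bytes, not an
// array length. For int16 samples, that means two bytes per value.
extern "C" EXA_EXPORT std::size_t BufferSizeInBytes(std::size_t sampleCount) {
    return sampleCount * sizeof(int16_t);
}
```

A 1,024-sample int16 buffer, for example, must be reported to MLAudio as 2,048 bytes.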

Milestone Reached

Cautiously optimistic, I place the device on my head and wait for the build to complete. The app starts, I see those old physics cubes that I still haven’t removed, and the instruments begin to load. Seconds later, the bass-playing robot pops in and starts to play.

The instrument’s strings vibrate and glow, colorful trails follow the robot’s mallets, the robot’s motions are accurate, and crisp, clear audio flows from the device’s speakers. More robots appear as the song builds — with drums, then sustained synth notes, and finally, the piano melody.

For the first time, I can see and hear my virtual band, floating around me, in augmented reality. It’s a great feeling of relief; a sweet moment of success after a difficult few weeks. I let out a rather loud “whoo-hoo” from within my musically-augmented office, and watch the band for a while.


It hasn’t been the most glamorous few weeks. Failure isn’t usually very pretty, after all, and there has been a lot of it. With so many new things to learn, and with EXA’s specific needs ramping up the complexity, there were plenty of obstacles to trip me up. And they did.

But, I made it through. Each misstep taught me something new, narrowed the solution space, and guided me in the right direction. I learned fast and failed forward, over and over, all the way to my goal.

And now, with some of EXA’s biggest technical challenges behind me, I can start to think more deeply about the design, interaction, and experience challenges that await. Perhaps I’ll need to fail forward through a hundred design iterations on my way to an augmented reality version of EXA. I hope not. But if that’s what it takes, fine. I’m here to build something great.

Zach Kinstner is the founder of Aesthetic Interactive, where he works as a software dev/design consultant and builds in-house apps, like EXA.

@zachkinstner | @EXAmusicVR | dev videos | music videos

Zach Kinstner

Written by

Merging software dev with design. UI, UX, interaction, creativity, data-viz in VR/AR. Created VR music instrument @EXAmusicVR and VR/3D UI tool “Hover UI Kit”.
