AR @ wehkamp
June 5th, 2017 — McEnery Convention Center, San Jose
WWDC had just begun. During the keynote, Apple released ARKit, which overnight became the biggest AR platform in the world.
At the time, I was 15 and somewhere in the keynote audience as a scholarship winner. As you might expect, I was quite excited and couldn’t wait to try it. Unfortunately, I didn’t have a device with strong enough hardware at the time (an iPhone 6s or newer, or an iPad Air 2 or newer), so I couldn’t see it for myself, but there were plenty of other attendees and scholarship winners who were able to test the apps we threw together.
As a bit of backstory: at the time I was in VWO 4, and I hadn’t had a great year, to say the least. School had been a “problem” in my life since I started at age 4, and at this point my “bucket was about to overflow”. I was not feeling well, literally getting sick just thinking about school, and getting unhappier every day. The one thing that kept me going was programming. I started iOS Development in 2015, released my first app a few months later, and to this day I’m still learning new things every day. WWDC was one of the coolest weeks of my life, meeting loads of new people who all shared the same interest as me: iOS Development. I started feeling better, but once back home, I fell back into my old “school sickness”. At that point, my parents and I decided to slow down. I started staying at home more, where I could study iOS Development and follow my passion. Then summer came around, and during my vacation I bloomed: I felt happy again, got a vacation job, made some money; in short, all went well.
I started my new school year (VWO 5) with high hopes, which lasted about 2 days. Within 2 weeks I was at home more than I was at school, and within a month I wasn’t going to school at all anymore.
During this time we had multiple meetings with school about “now what”: I couldn’t go to school anymore, but just sitting at home all the time was not OK either. Long story short, in the end I asked if it would be OK to start an internship as an iOS Developer, to expand my knowledge instead of just sitting at home. School agreed, and I reached out to a few companies in the area, one of them being wehkamp. About 6 months later, in December 2017, after some meetings, I officially started as an iOS intern at wehkamp for 2 days a week.
Because this was an internship I came up with myself, and not one required for my education, there were no tasks I had to complete or assignments I had to hand in. Wehkamp and I had complete freedom to decide what I’d be doing there. A project that had been lying on a shelf for a while, because priorities lay elsewhere, was AR. So we decided that I’d start playing around with AR to create a so-called POC, a Proof of Concept. At wehkamp we have our own collection of living products, and we figured it’d be great to show those in AR.
Having decided what we wanted, I could start experimenting: beginning with simply being able to detect a surface, and ending with dynamically loading models from the web and remapping their textures. So, let’s go a little more in depth. At the start we had to figure out the basics. I had about 3 days of experience with ARKit, and only with the absolute fundamentals. I also had next to no experience with SceneKit, the 3D framework that ARKit builds on. So for this first part, I leaned heavily on Apple’s documentation and sample code. With that, I was able to get a pre-loaded model to rotate and move, which at that point felt super amazing!
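That very first step, detecting a surface, boils down to running an AR session with plane detection enabled and reacting when ARKit reports a plane anchor. A minimal sketch of the idea (the class name and outlet are illustrative, not from our actual app):

```swift
import UIKit
import ARKit

// Sketch: run a world-tracking session that looks for horizontal
// surfaces (floors, tables) where a model could be placed.
class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!  // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit detects a new surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // This is the spot to attach a model node to `node`.
    }
}
```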
For our final app we wanted more than the 4 preloaded models, so we got a third party involved to create them for us. This worked out great, but it left us with one problem. The files were delivered as DAE (Digital Asset Exchange) files, a popular 3D modelling format. The problem is that those files aren’t suited for rendering on mobile devices, so we had to convert them to SCN files, SceneKit’s own scene format, which is made by Apple and works well on mobile.
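On the Mac, SceneKit itself can do this conversion: it loads the .dae and writes a .scn archive back out. A small macOS-only sketch (the file names are just examples):

```swift
import SceneKit

// macOS-only sketch: on the Mac, SceneKit can read a .dae at runtime
// and serialize it back out as a .scn archive. (iOS devices can't load
// .dae files at runtime, which is why the conversion happens up front.)
let input = URL(fileURLWithPath: "chair.dae")   // example input file
let output = URL(fileURLWithPath: "chair.scn")  // converted output

let scene = try SCNScene(url: input, options: nil)
scene.write(to: output, options: nil, delegate: nil, progressHandler: nil)
```

Xcode’s scene editor (and its command-line scntool) can perform the same conversion, which is handy when a batch of models comes in.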
Our next hurdle was where to store the models. Storing them all on the device didn’t seem like a good idea, for multiple reasons, the most obvious one being storage space. So we decided to store them online, which worked fine, apart from one major issue: how were we going to get the models onto the phone and show them? To solve this, we created some functionality to download the models from our servers and render them. This took a while to figure out, but we got it to work in the end. Then the next issue appeared, or rather, it didn’t appear, which was exactly the problem. During the conversion from DAE to SCN, all the texture file paths got messed up: where a texture used to point to a relative location like ../, it now pointed to an absolute location on my Mac. When rendering the model on the phone, nothing was found at that location because, well, the phone wasn’t my Mac, so the location didn’t exist. I hoped that someone else had run into this problem too, so I could just copy-paste a solution and be done with it. But that wasn’t the case… So what we ended up doing was writing a custom remapping function that remapped each and every texture reference from its broken URL to a correct one on the device.
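In SceneKit, path-based textures live in the `contents` of each material property, so a fix along these lines is possible: download the .scn file, load it, then walk the node tree and re-point every file-based texture at a local copy, keeping only the file name. This is a sketch of the idea, not our exact code; the function names and cache location are illustrative:

```swift
import Foundation
import SceneKit

// Sketch: fetch an .scn file from the server, cache it locally,
// load it, and repair the texture paths broken during conversion.
func loadRemoteModel(from remoteURL: URL,
                     completion: @escaping (SCNScene?) -> Void) {
    let task = URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else {
            completion(nil)
            return
        }
        let caches = FileManager.default.urls(for: .cachesDirectory,
                                              in: .userDomainMask)[0]
        let localURL = caches.appendingPathComponent(remoteURL.lastPathComponent)
        try? FileManager.default.removeItem(at: localURL)
        try? FileManager.default.moveItem(at: tempURL, to: localURL)

        let scene = try? SCNScene(url: localURL, options: nil)
        if let scene = scene {
            remapTextures(in: scene, textureDirectory: caches)
        }
        completion(scene)
    }
    task.resume()
}

// Walks every node's materials and replaces path-based texture contents:
// only the file name survives from the broken absolute path, and it is
// looked up in a directory of textures shipped alongside the model.
func remapTextures(in scene: SCNScene, textureDirectory: URL) {
    scene.rootNode.enumerateHierarchy { node, _ in
        for material in node.geometry?.materials ?? [] {
            for property in [material.diffuse, material.normal,
                             material.specular, material.emission] {
                if let path = property.contents as? String {
                    property.contents = textureDirectory
                        .appendingPathComponent((path as NSString).lastPathComponent)
                } else if let url = property.contents as? URL {
                    property.contents = textureDirectory
                        .appendingPathComponent(url.lastPathComponent)
                }
            }
        }
    }
}
```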
So now we had a textured 3D model in our room and we were able to move it around, but we still had some issues. First and foremost, every model looked like an inflatable, reflective plastic toy. In short, they were ugly!
It took some time to figure out what this was, and why the models were so ugly. It turned out our lighting was to blame. The way we had set it up, we were lighting every single bit of the model with the same intensity from every direction, which means you see neither depth nor shadows.
We saw a number of ways to fix this; I’ll explain 2 of them below. Solution one was to add a floating light source a meter above the model. This would light the model from just one place, cast and render shadows, and add depth. We tested this and the models looked great, but we ran into one big problem: with multiple models in the scene, the lights would overlap and cast really weird shadows.
The solution we ended up using was to fix the light not to the model, but to the person, or rather to their point of view, which in practice is the user’s phone. This means that if a user moves their phone, the light moves with them, so we get shadows and depth, no overlapping lights, and the models are nicely lit from wherever you look at them.
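In SceneKit terms, this comes down to attaching an omni light to the view’s `pointOfView` node, the node that tracks the device camera. A small sketch (the intensity value is only a starting point to tune):

```swift
import ARKit
import SceneKit

// Sketch: light the scene from the viewer's position by hanging an
// omni light off the camera node, so it follows the phone around.
func addViewerLight(to sceneView: ARSCNView) {
    let light = SCNLight()
    light.type = .omni
    light.intensity = 800  // illustrative value; tune per scene

    let lightNode = SCNNode()
    lightNode.light = light
    // pointOfView mirrors the device camera's transform each frame.
    sceneView.pointOfView?.addChildNode(lightNode)
}
```

Because `pointOfView` may not exist before the session starts rendering, you’d typically call something like this from a renderer callback rather than in `viewDidLoad`.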
The only thing left was to take what we learned while building the POC and put it to good use in our real app. Since that process mostly consisted of re-creating the POC with a better-looking design, I won’t cover it here :)
So, that’s it for the technical part. Now for a small update on what has happened to me since I started my internship in December. First off, I’m not an intern anymore, but an employee. On top of that, after a lot more meetings with everyone involved, I’m officially done with school, so I can follow my programming passion.
Thank you for taking the time to read this story! Feel free to leave your comments and ask questions! If you want to experience AR for yourself, you can do so here.