Of Holograms and Hackathons: An Interview with Qoyn

Ira Brooker
Best Buy Developers
5 min read · Mar 23, 2015

We spend a lot of time developing and perfecting our APIs, and we’re pretty confident that we’ve created an exemplary product. But believing in your own work is one thing; seeing it in action is quite another. Our trip to the LAUNCH Hackathon a few weeks ago gave us a fantastic opportunity to do exactly that. More than a dozen teams of hackers put our APIs to the test, with some fascinating and innovative results.

Qoyn team members collect their prizes at LAUNCH.

The top prizewinner was Qoyn, a technology that allows shoppers to preview purchases by sending 3D holograms of items they scan in-store or select online. We spoke with developer Yosun Chang from the Qoyn team about using Best Buy’s APIs, the hackathon experience and the potential of 3D modeling.

Can you explain a little about what inspired you to create Qoyn and how you see it being used in the future?
Even FedEx same-day delivery takes about as much time as the average American needs to travel to their nearest Best Buy store (~2 hours). I wanted to be able to teleport things — but, setting aside the missing breakthroughs in theoretical physics, sometimes I might not have the space or the need for the actual thing I’d like teleported. A 3D hologram that is an exact visual match of the product seemed the next best thing. I see Qoyn as the future of shopping, both in-store and online, where customers get the fullest possible experience of a product before purchasing the actual thing.

How would you see Best Buy customers specifically making use of Qoyn?
As a Best Buy Elite Plus member myself, here’s how I might use Qoyn:

On BestBuy.com: I would like to scan a product on the site with my phone and see it as a “hologram” in 3D augmented reality on my desk, in my home, in my hands, on TVs, posters, walls, etc. For example, I might want to buy a large flat-screen TV but need to know whether it will fit on my wall or how it will look next to my couch. Using Qoyn, I can easily see what the item will look like before I buy.

In Best Buy stores: For products that are still boxed, with no assembled or display demo, I’d like to be able to point my phone at the box and see a 3D hologram of the object. (For customers without smart devices, Best Buy could offer in-store tablets for product previews.) For products that require delivery and self-assembly, I might like to visualize how the item would look in my living room, for example. Using similar augmented reality/virtual reality fusion technology, we could superimpose the 3D model of a Best Buy product onto a pre-recorded video of the room. Qoyn could also help create a virtual showroom where customers browse a wide array of products within a smaller physical space.

Has your team participated in many hackathons? How did the LAUNCH experience compare?
I’ve participated in hundreds of hackathons, as have most other members of my team, though usually on different projects.

LAUNCH is one of the largest hackathons in the world, and it keeps working to provide an excellent experience for everyone involved. Unlike other hackathons of this size, LAUNCH tries to give everyone a fair chance by facilitating individualized judging. Although that means not everyone gets time on the keynote stage, it avoids a common fault of more streamlined hackathons, where AV issues and inadvertent delays keep participants from properly demoing their product in the allotted two minutes onstage.

Also, unlike other hackathons, LAUNCH is designed to launch companies, so the LAUNCH hackathon specifically requires a team of four participants for entry. My usual schedule at a hackathon involves focusing purely on the hack (usually after pivoting a few times), but LAUNCH’s team requirement meant juggling the delegation of tasks to the extra team members, which, despite the complications and the stress of a 24-hour time limit, seemed to work out well after the storm of creation!

I’ve personally participated in LAUNCH for the past three years, always with a 3D hack that could be labeled as both virtual reality and augmented reality. For me, hackathons are about presenting cutting-edge software. In past years my hacks included ShelvesOS, a motion-gesture operating system, and the gyroFIRE Platform, an IDE/virtual simulator for Google Glass. The top entries typically have a solid current business use case, so in past years, no matter how cool or innovative they were, 3D hacks were seen as toys, unsuitable for the main stage. I’m glad to see that a few of this year’s top 15 are 3D hacks!

How easy or difficult was it to work with the Best Buy APIs?
It was very straightforward to find a top-down photo of the cylindrically shaped product we modeled; we procedurally built the model by projecting that top photo onto a cylinder mesh. (However, that’s just an intermediary solution. See my next answer.)
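
The Qoyn code itself isn’t public, but here is a minimal Python sketch of the kind of planar projection Chang describes: every vertex of a cylinder mesh is mapped straight down the cylinder’s axis into a single top-down photo, and the result is written out as a Wavefront OBJ. The file name top_photo.jpg, the dimensions, and the segment count are placeholders, not values from the Best Buy API.

import math

def write_projected_cylinder(obj_path, radius=1.0, height=2.0, segments=48,
                             texture="top_photo.jpg"):
    verts, uvs, faces = [], [], []

    def add_vertex(x, y, z):
        verts.append((x, y, z))
        # Planar projection along the vertical axis: ignore y and map x/z
        # into [0, 1] texture space of the top-down photo.
        uvs.append((x / (2 * radius) + 0.5, z / (2 * radius) + 0.5))
        return len(verts)  # OBJ indices are 1-based

    top_center = add_vertex(0.0, height, 0.0)
    bottom_ring, top_ring = [], []
    for i in range(segments):
        a = 2 * math.pi * i / segments
        x, z = radius * math.cos(a), radius * math.sin(a)
        bottom_ring.append(add_vertex(x, 0.0, z))
        top_ring.append(add_vertex(x, height, z))

    for i in range(segments):
        j = (i + 1) % segments
        # Top-cap triangle plus the side quad split into two triangles.
        faces.append((top_center, top_ring[i], top_ring[j]))
        faces.append((bottom_ring[i], bottom_ring[j], top_ring[j]))
        faces.append((bottom_ring[i], top_ring[j], top_ring[i]))

    with open(obj_path, "w") as f:
        f.write("mtllib cylinder.mtl\nusemtl top_photo\n")
        for x, y, z in verts:
            f.write(f"v {x:.6f} {y:.6f} {z:.6f}\n")
        for u, v in uvs:
            f.write(f"vt {u:.6f} {v:.6f}\n")
        for a, b, c in faces:
            f.write(f"f {a}/{a} {b}/{b} {c}/{c}\n")
    with open("cylinder.mtl", "w") as f:
        f.write(f"newmtl top_photo\nmap_Kd {texture}\n")

write_projected_cylinder("cylinder.obj")

Because the projection ignores height, the photo’s rim pixels smear down the sides of the cylinder, which is exactly why the team treats this as an intermediary solution rather than a proper 3D scan.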

Do you have any suggestions on how we could improve the Best Buy APIs from a hacker standpoint?
I believe Best Buy has the workforce to 3D scan all of its products and provide 3D models that augmented reality and virtual reality developers could use to make some next-gen innovative awesomeness! All we need is a .fbx (or a 3D model in another popular format) associated with each product, just as product photos in .png or .jpg are available by default. For each unboxed product, simply include the real-world lengths of its bounding box (10cm x 15cm x 12cm, for example), and it would be possible to size the model properly in virtual and augmented space.
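
No such bounding-box field exists in the API today, so the sketch below is purely hypothetical: it shows how a client could turn real-world dimensions attached to a product record (the invented boundingBoxCm field stands in for whatever Best Buy might ship) into per-axis scale factors that place a loaded mesh at true size in augmented or virtual space.

def real_world_scale(mesh_min, mesh_max, real_dims_cm):
    """mesh_min/mesh_max: (x, y, z) corners of the model's bounding box in
    model units; real_dims_cm: (width, height, depth) from the product data.
    Returns per-axis scale factors that map model units to centimeters."""
    return tuple(
        real / (hi - lo) if hi != lo else 1.0
        for lo, hi, real in zip(mesh_min, mesh_max, real_dims_cm)
    )

# Example: a unit-cube model standing in for a 10cm x 15cm x 12cm product.
product = {"sku": 1234567, "boundingBoxCm": (10.0, 15.0, 12.0)}  # hypothetical field
print(real_world_scale((0, 0, 0), (1, 1, 1), product["boundingBoxCm"]))
# -> (10.0, 15.0, 12.0): multiply model-space coordinates by these to get centimeters.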

Until then, I would love access to more photos — especially from non-orthogonal angles — of (unboxed) products so that I can generate 3D models from them.
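
As a rough idea of how a hacker might gather whatever photos are already exposed for a given SKU, here is a short sketch against the v1 Products API. The API key and SKU are placeholders, the response is assumed to use the standard wrapper that nests results under a products array, and because the exact set of image fields varies by product, the code simply collects any field whose name looks like an image URL rather than assuming a fixed list.

import requests

BASE = "https://api.bestbuy.com/v1/products(sku={sku})"

def product_photo_urls(sku, api_key):
    # Assumes the standard v1 response wrapper with a "products" array.
    resp = requests.get(BASE.format(sku=sku),
                        params={"apiKey": api_key, "format": "json"},
                        timeout=10)
    record = resp.json()["products"][0]
    # Keep every populated field that looks like an image URL ("image",
    # "largeFrontImage", "angleImage", ...) without hard-coding field names.
    return {field: value for field, value in record.items()
            if value and (field == "image" or field.endswith("Image"))}

# Usage: list every available photo angle for one (placeholder) SKU.
for field, url in product_photo_urls(1234567, "YOUR_API_KEY").items():
    print(field, url)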
