The Fashion Project — Part 2 

a.k.a. iBeaconing before everyone else


I had gotten through an earlier version of an installation that relied on an overly complicated and wholly unreliable RFID system for interaction. It was time to update it so that it could be installed anywhere. People would still walk up to a screen and have it recognize them, but making the experience tight required architecting a new underlying system from scratch — the front-end app, look, vids, etc. would all stay the same. The problem I needed to address was how to do proximity sensing on devices that didn’t inherently have the tech to figure out how far away they were from one another.

I was facing a slew of design constraints, but these were actually helpful.

Constraints

There were a number of constraints that were difficult to address, but having these concrete requirements made it a hell of a lot easier to research, brainstorm and make the right decisions before moving forward.

  1. Interaction had to remain the same — we had to work with the existing client-facing applications, graphics, etc.
  2. Multiple screens in a room needed to be independently aware of who was in front of them at all times.
  3. A person should connect to only one screen at a time.
  4. The setup of the installation had to be way easier than the RFID system.
  5. Responsiveness needed to improve — essentially, make the experience tighter in all respects.
  6. We needed to use the latest tech — iPhone 5 or iPod Touch; the client didn’t want to run the installation on iPhone 4 / 4s.

Early Direction: NFC

My client was interested in using NFC, so I had to start my research there. After looking into it I found an immediate problem: NFC on iOS was difficult to achieve — you had to buy hardware to wrap around the devices. But right off the bat this wasn’t enough of an argument to completely abandon the direction, so I looked into a lot of different solutions and companies.

Some of my initial hesitations were around constraint 4: the setup needed to be much lighter and easier than the RFID solution, and adding extra readers, chips and cases to the install meant adding extra complexity to the setup.

NYU + ITP

I had already planned to be in New York to teach C4 at NYU’s ITP Camp, and heading there meant putting My Runway on hold for a bit. This ended up being a good thing, because while I was at ITP I saw a presentation about a new kind of Bluetooth LE tag called Stick’n’Find that could be recognized by newer iPhones and give readings for how far away they were from the device.

Brilliant. This would solve the additional hardware issue (i.e. I didn’t need to have any) and I could use the latest iPhone / iPods.

With that, I made the decision to fly in the face of NFC and build out a system that relied on Bluetooth LE.

Roadblocks

First, the new tech I was going to work with had just been released as a developer preview, which meant that Apple had given a talk about it at WWDC 2013 and shown “how” it worked — the issue was that no one had built anything with it yet.

The second issue was that, because it was a developer preview, there was no documentation on how to build things with it — I had to learn how to use it directly from the header files in the frameworks I was using.

Fuuuuuuuuuuun.

Stick’n’Find

I got a few SnF tags in early July and started working with them immediately. After a few hours of playing around with them, they looked really promising. I updated the client and we decided that this was the way to go. It looked like almost all the constraints and issues would be solved:

This is what the tags look like... About an inch wide and a quarter inch thick
  1. Not a problem.
  2. Add a tag next to each screen to solve the problem (i.e. replace each RFID sensor with a single SnF tag).
  3. A handheld recognizes the closest tag and connects.
  4. Put a tag on it and go.
  5. (This is the almost).
  6. Not a problem.

Recognizing 4 Comps in one Room, Brilliant


These SnF tags were looking extremely promising. Here are a couple of vids of the basic recognition system working. With this demo app I was able to walk around our office and see which tag I was closest to.

The iPod in this vid is directly connected to 4 Stick’n’Find tags that are distributed in our office.

After a couple of days I had a tag at each of our computers in the office, one on each desk. I could walk up to someone and the screen on my handheld would show me their name at the top of the list.

The iPod in this vid recognizes, but is not connected to, 4 tags that are distributed in our office.

Bang Your Head Against A Wall (#1)

So, after a week or so of dicking around with the tags I realized this: they were not designed for accurate proximity, they were not designed for responsiveness, the software to set them up was pretty crappy, and they were not designed to be used in any way other than the one they were designed for.

However, they were actually pretty slick little devices that were designed for you to add them to your things and know when you’ve walked away from them — a long way, like 5m. The software for setting them up was essentially designed for non-developers to add a tag to their key chain, name it, and know when they had left them behind. They did their job very very well.

But, they couldn’t keep up with what I needed.

My SnF Malaise

I slowly came to realize that setting up an installation with these little gadgets was going to be a pain in the ass. First, I kept losing track of which tags were on or off (they seemed to stop broadcasting sometimes). Second, I kept losing track of which tag was which (I think this had to do with the first issue, and was compounded by the fact that they’re little black objects with no visible identification). These two issues alone were enough for me to abandon using SnF tags.

Thinking from the perspective of a tech installing these things was making me worry. If I, the developer, kept losing track of the tags, then an install tech would definitely have more trouble than I did. I didn’t like the vision of someone on the other end of this work, sitting in a room across the planet, struggling to make sense of which tag was which and how to calibrate the system — it just didn’t sit well with me.

Furthermore, thinking of the experience of a user: if the tags were randomly dropping in and out of the installation then it would “feel” really bad or, even worse, look like the installation didn’t work. This was almost the tipping point for me, but I had championed this Bluetooth LE route over NFC and couldn’t abandon my decision too eagerly.

Along Came iBeacon

My work with the SnF tags took some time, maybe about 2 weeks of focused struggle. During this time Apple held their WWDC and released some really fresh new tech, and after a couple days I started noticing new threads on the SnF support site. People were asking when the tags would support iBeacon.

Me: “Wtf is iBeacon?”

I looked into it. There was a section of one of the WWDC vids that talked about Bluetooth LE and iBeacon, but there was almost nothing written about it… Apple hadn’t released any documentation for it, and there were no guides.

There were only the headers, a video intro, and a really really really RAW iBeacon project that Apple provided. On top of all that, iBeacon stuff was under NDA (to be released to the public only in Sept) so there was no way to ask serious dev questions about it in online forums. Also, no one had ever built anything with it, so there were no answers to be had… anywhere.

But, they looked amazing.

Bang Your Head Against A Wall (#2)

With a 15 min video presentation of iBeacons and access to iOS 7 (via the Xcode 5 Beta), I set out to build a new system that would use iPods as beacons, iPhones/iPods as handhelds that would range those beacons, and individual computers for each station that would communicate with specific handhelds using Bluetooth. Simple.

Recap

So, the first iteration was done quickly and mostly worked, but had some problems. Mainly, the issue was the complexity of the setup and the reliability of the sensors: it was tough to set up, and tag recognition was spotty.

The original setup had a ton of wires and cables running between the various consoles, the sensors and the central server. There was complicated logic running on the server to associate tags with devices and figure out which device was in front of which sensor. And so, if the server went down, or a cable wasn’t plugged in properly, or a sensor was out of alignment, then the system would break.

Using iBeacons instead of RFID sensors cleaned up a ton of the complexity of the installation. Wham. Much better. Instead of needing to network all the consoles through a server, each one would now stand on its own. The logic from the server shifted in part to the handhelds (i.e. they figure out which beacon they’re closest to) and in part to the consoles (i.e. providing remote control from handhelds).
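To give a feel for that shift, here’s a minimal sketch of the closest-beacon logic, written in modern Swift rather than the Objective-C of the day. The UUID is a placeholder, and treating the beacon’s major value as the console ID is an illustrative convention, not necessarily the shipped one:

    import CoreLocation

    // Sketch of the "which beacon am I closest to?" logic that moved onto
    // the handhelds. The UUID and the major-value-as-console-ID convention
    // are placeholders, not the project's actual values.
    final class NearestConsoleFinder: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()
        private let region = CLBeaconRegion(
            uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
            identifier: "installation")

        func start() {
            manager.delegate = self
            manager.requestWhenInUseAuthorization()
            manager.startRangingBeacons(in: region)
        }

        func locationManager(_ manager: CLLocationManager,
                             didRangeBeacons beacons: [CLBeacon],
                             in region: CLBeaconRegion) {
            // CoreLocation delivers this roughly once per second, with the
            // beacons ordered nearest-first. An accuracy of -1 means "unknown",
            // so skip those readings.
            guard let nearest = beacons.first(where: { $0.accuracy >= 0 }) else { return }
            // In this sketch the beacon's major value doubles as the console ID.
            print("Closest console: \(nearest.major), ~\(nearest.accuracy) m away")
        }
    }

Ranging callbacks arrive about once a second with the beacons ordered nearest-first, which sets a floor on how responsive this kind of system can feel.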

And Now, To The Banging

Going down the iBeacon route brought up a few complicated issues I had to deal with; not too complicated on their own, but all together they made things a bit stressful.

First, there was no real documentation. Apple had just released the iOS 7 Beta, so there were no programming guides and no tutorials. The resources I had at my disposal were a sample application and the raw header files, from which I could read the engineers’ notes.

Second, the tech was under NDA. Because it was a beta release of the software, I wasn’t “allowed” to ask questions outside of the Apple developer forums (which can be iffy at times) so places like Stack Overflow were out of the question.

Third, aside from Apple’s engineers, almost no one had built anything with iBeacons before… So, there weren’t any answers to be had even if the software hadn’t been under NDA and I could have asked questions outside of the forums.

Fourth, I was running on faith. I had a good sense that this was the right approach, but without proof I was under the gun. I had convinced my client that this was definitely the way to go for the installation’s software. I knew there were going to be hitches and potholes along the way, but they gave me a budget to work within and I had to produce the installation as quickly as possible. I had to hope that I was right.

Delivering

I spent about six weeks actually coding the installation. I totally blew through the budget, but in the end the system was slick. Instead of a complicated interdependent system of consoles, sensors and servers, I was able to reduce everything to standalone consoles. This meant that you could have as few or as many consoles as you wanted without having to be concerned about plugging things in.

Final Setup

The final setup was clean and simple. I wrote 3 applications that would allow any number of handhelds and consoles to be installed without having to worry about how many there were, coordinating them, etc.

These three apps were:

  • Handheld app (iOS)
  • Beacon app (iOS)
  • Console app (Mac)

Console

The console application was basically the same as the original application. It would connect directly to a handheld when it was given a request. There was only a small difference in the logic it ran.

The original version would wait for commands to connect to a device from the central server. The new version allowed direct connection requests from a handheld device, so there was just a little bit of a difference in the handshaking code I had written.
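For a rough idea of what that direct handshake can look like on the console side, here’s a hedged sketch: the Mac advertises a Bluetooth LE service and waits for a handheld to connect and write commands. The service / characteristic UUIDs and the “console-N” local-name convention are placeholders for illustration, not the actual protocol:

    import CoreBluetooth

    // Sketch of the console-side handshake: advertise a service and accept
    // writes from a connected handheld. UUIDs and naming are placeholders.
    final class ConsoleLink: NSObject, CBPeripheralManagerDelegate {
        static let serviceUUID = CBUUID(string: "A1E8D001-0000-4000-8000-000000000000") // placeholder
        static let commandUUID = CBUUID(string: "A1E8D002-0000-4000-8000-000000000000") // placeholder

        private var peripheral: CBPeripheralManager!
        private let consoleID: Int

        init(consoleID: Int) {
            self.consoleID = consoleID
            super.init()
            peripheral = CBPeripheralManager(delegate: self, queue: nil)
        }

        func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
            guard peripheral.state == .poweredOn else { return }
            let command = CBMutableCharacteristic(
                type: Self.commandUUID, properties: [.write, .notify],
                value: nil, permissions: [.writeable])
            let service = CBMutableService(type: Self.serviceUUID, primary: true)
            service.characteristics = [command]
            peripheral.add(service)
            // Advertise the console's ID so a handheld can match it to a beacon.
            peripheral.startAdvertising([
                CBAdvertisementDataLocalNameKey: "console-\(consoleID)",
                CBAdvertisementDataServiceUUIDsKey: [Self.serviceUUID]])
        }

        func peripheralManager(_ peripheral: CBPeripheralManager,
                               didReceiveWrite requests: [CBATTRequest]) {
            // Each write from a connected handheld carries a command to run.
            for request in requests {
                if let data = request.value {
                    print("Command from handheld: \(String(decoding: data, as: UTF8.self))")
                }
                peripheral.respond(to: request, withResult: .success)
            }
        }
    }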

A short clip showing how the console is set up by first choosing an ID

Now that the coordination was decentralized, I had to add a menu for specifying the ID of the console. This was pretty easy: I simply added a list of possible IDs to the app, allowing the user to choose one from the menu bar.
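The real menu isn’t documented here, but the shape of it is something like this AppKit sketch (the IDs and titles are illustrative):

    import AppKit

    // Sketch of a menu-bar picker for the console's ID; the actual app's UI
    // details are not documented, so everything here is illustrative.
    final class ConsoleIDMenu: NSObject {
        var onSelect: ((Int) -> Void)?
        private let statusItem = NSStatusBar.system.statusItem(
            withLength: NSStatusItem.variableLength)

        init(ids: [Int]) {
            super.init()
            let menu = NSMenu(title: "Console ID")
            for id in ids {
                let item = NSMenuItem(title: "Console \(id)",
                                      action: #selector(pick(_:)), keyEquivalent: "")
                item.target = self
                item.tag = id   // stash the ID on the menu item itself
                menu.addItem(item)
            }
            statusItem.menu = menu
            statusItem.button?.title = "ID"
        }

        @objc private func pick(_ sender: NSMenuItem) {
            onSelect?(sender.tag)
        }
    }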

Beacon

The beacon app was dead simple to make: all it needed to do was broadcast itself. But, because I was designing it for an installation, there needed to be a bit of UX to handle the following conditions:

  1. Ranging is best if the device has an unobstructed view of the handheld.
  2. The beacon needed UI elements for turning it on / off, setting its sensitivity, and choosing its ID to match that of a specific console.

Given these conditions, I decided it would be best to place the beacon next to the console and have the iPod’s touchscreen facing outwards. This would make it easy for someone to install the device, set it up and check its condition.

A short video showing how to set up a beacon with an ID and a calibrated range

However, I didn’t want random people playing with the settings, so I created the app with a main view and a settings view. A tech could switch to the settings view, set up the beacon, and then switch back to the locked home screen.
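The broadcasting core really is only a few lines. Here’s a minimal sketch using the standard CoreLocation / CoreBluetooth pairing, with a placeholder UUID and the major value again standing in for the console ID:

    import CoreBluetooth
    import CoreLocation

    // Sketch of the beacon app's core: turn the iPod into an iBeacon.
    // The UUID is a placeholder; major stands in for the console ID.
    final class BeaconBroadcaster: NSObject, CBPeripheralManagerDelegate {
        private var peripheral: CBPeripheralManager!
        private let region: CLBeaconRegion

        init(consoleID: CLBeaconMajorValue) {
            region = CLBeaconRegion(
                uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
                major: consoleID, minor: 0, identifier: "installation")
            super.init()
            peripheral = CBPeripheralManager(delegate: self, queue: nil)
        }

        func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
            guard peripheral.state == .poweredOn else { return }
            // peripheralData(withMeasuredPower:) builds the iBeacon advertisement.
            let advertisement = region.peripheralData(withMeasuredPower: nil) as? [String: Any]
            peripheral.startAdvertising(advertisement)
        }
    }

The measuredPower argument is also where a calibrated range comes in; passing nil just uses the device’s default.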

Handheld

Like the console, the handheld app needed only a very small amount of tweaking to get it running properly. First, I had to add beacon ranging code and logic, which was simple because it needed to be initiated only at app launch and didn’t require any interface elements. Second, I had to add some handshaking code to the app so it could connect to a console. Third, once everything was set up properly, I needed to send commands from the app to the console when the user switched personas and chose specific elements in the app (this was pretty easy).
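And here’s the matching handheld side as a sketch. The UUID and the “console-N” convention mirror the console sketch above and are illustrative assumptions, not the actual protocol:

    import CoreBluetooth

    // Sketch of the handheld-side handshake: once ranging has picked the
    // nearest beacon, scan for the console advertising the matching ID.
    final class HandheldLink: NSObject, CBCentralManagerDelegate {
        // Same placeholder service UUID as the console sketch above.
        static let serviceUUID = CBUUID(string: "A1E8D001-0000-4000-8000-000000000000")

        private var central: CBCentralManager!
        private var console: CBPeripheral?
        private var targetID = 0

        override init() {
            super.init()
            central = CBCentralManager(delegate: self, queue: nil)
        }

        // Called by the ranging code whenever the closest beacon changes.
        func connectToConsole(withID id: Int) {
            targetID = id
            guard central.state == .poweredOn else { return }
            central.scanForPeripherals(withServices: [Self.serviceUUID])
        }

        func centralManagerDidUpdateState(_ central: CBCentralManager) {
            // Required by the protocol; scanning starts from connectToConsole(withID:).
        }

        func centralManager(_ central: CBCentralManager,
                            didDiscover peripheral: CBPeripheral,
                            advertisementData: [String: Any], rssi RSSI: NSNumber) {
            // Match the advertised name against the ID the nearest beacon gave us.
            let name = advertisementData[CBAdvertisementDataLocalNameKey] as? String
            guard name == "console-\(targetID)" else { return }
            central.stopScan()
            console = peripheral // keep a strong reference or the connection dies
            central.connect(peripheral)
        }

        func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
            print("Connected to console \(targetID); ready to send persona commands")
        }
    }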

Demos

I wasn’t on-site to see the first live install, but before I submitted the software to my client I took a few documentation vids showing different use-cases. There are two: one with a single console and a single phone, and another with two phones and two consoles.

Single Console

There are 3 things to note: I’m using an iPhone as the handheld (which runs the main app), I’m using an iPod as the iBeacon, and the Mac Mini sitting on the desk is running the console app. I set up and calibrated all the software prior to capturing the vid.

A single-console demo with a single phone

Here’s what happens:

  • (0:04) All three apps are running.
  • (0:09) The user chooses the persona Bobby, and the phone starts ranging.
  • (0:16) The user approaches a beacon and the phone starts to connect.
  • (0:22) The phone connects to the console and triggers a response. At this point there is a solid bluetooth connection between the Mac Mini and the iPhone.
  • (0:31) The user navigates the app and triggers a change on the console.
  • (0:43) The user exits the current persona, the connection drops between the console and the iPhone. The console reverts to its default state.
  • (0:48) The user selects another persona and continues interacting.

This demo shows how a single user can approach a console and interact with it while navigating their app. It’s a good short demo, but the next video will show how the exact same system can be used for multiple consoles.

Dual Console

The setup here is identical to the first demo, but with two phones and two consoles. One beacon and one console are associated by setting each to the same unique ID.

A dual-console demo with two phones, two beacons and two consoles

Here’s what happens:

  • (0:00) Consoles are running in their default state, transitioning images.
  • (0:11) User 1 chooses the persona Marcus.
  • (0:17) User 1 approaches the first console and it connects.
  • (0:25) On a second phone, User 2 chooses the persona Nancy.
  • (0:30) User 2 approaches the second console and connects to it.
  • (1:03) User 2 switches personas to Bobby and reconnects. User 1 remains connected to console 1.
  • (1:15) User 2 leaves console 2 and approaches console 1, their handheld disconnects from console 2. User 1 remains connected to console 1.
  • (1:20) User 1 leaves console 1 and approaches console 2, disconnecting from console 1. User 2's handheld connects immediately to console 1, and User 1's handheld connects to console 2.
  • (1:34) User 2 “walks” back to console 2.
  • (1:40) User 1 exits their persona, disconnecting from console 2.
  • (1:42) User 2 connects to console 2.
  • (1:50) User 1 remains near console 2 but is now “queued” behind User 2.
  • (1:55) User 2 “walks” back to console 1, and both users connect to their closest consoles.

This demo highlights the “queueing” system that is cooked into the consoles and handhelds. When one phone is connected to a console it blocks any others from connecting. It also shows how a phone will disconnect from a console when it gets closer to another beacon.
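The blocking logic itself is small. Here’s one plausible shape for it, a sketch rather than the shipped implementation: one owner per console, with later arrivals queued until the owner leaves or exits their persona:

    import Foundation

    // Sketch of console-side "queueing": one handheld owns the console at a
    // time; later arrivals wait in line. How the real installation enforced
    // this isn't documented, so treat this as an illustration only.
    final class ConnectionQueue {
        private(set) var owner: UUID?      // the handheld currently connected
        private var waiting: [UUID] = []   // handhelds queued behind it

        // A handheld asks to connect; returns true if it now owns the console.
        func request(from handheld: UUID) -> Bool {
            if owner == nil || owner == handheld {
                owner = handheld
                return true
            }
            if !waiting.contains(handheld) { waiting.append(handheld) }
            return false
        }

        // A handheld walks away or exits its persona; promote the next in line.
        func release(from handheld: UUID) {
            if owner == handheld {
                owner = waiting.isEmpty ? nil : waiting.removeFirst()
            } else {
                waiting.removeAll { $0 == handheld }
            }
        }
    }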

Fin

Just under a month before iOS 7’s official release (and the release of real documentation), I had built a pretty sophisticated proximity-based installation using iBeacons — long before anyone else had built this kind of thing. Though it was painful building this from scratch to my client’s needs, I got through it, and we now have a pretty solid product that we can use elsewhere when we need it.

I was able to satisfy the original requirements.

  1. Interaction remained the same — I didn’t change the user-facing apps.
  2. Each user’s phone ranged the beacons to figure out where it was in relation to the consoles — this logic lives better on the handhelds than on the original central server.
  3. A person can connect to only one screen at a time — the connections are strong and persistent until the user leaves or disconnects by choosing another persona.
  4. The setup of the installation had to be way easier than the RFID system — less hardware, along with UI-based setup, made it far easier.
  5. Responsiveness needed to improve — once a connection was established it wouldn’t drop unless there was a clear action by the user.
  6. We needed to use the latest tech — it used bleeding-edge beta technology, hacked together from nothing more than the header files of Apple’s new framework.

Done.

I was contracted for 60 hours, but this took closer to 160 with research, testing, failing, and so on… We own the IP, so I guess that’s okay.