For the past month, Nate Parrott and I have been prototyping an augmented reality table. You can read the technical details here (or down below). I’ll try to tell the story of why.

My most vivid childhood memories come from capitalizing on my friend’s endless supply of Legos, and building.

(Not quite this complex)

We’d create elaborate landscapes, filled with castles, spaceships, trains, and airports. We’d fill these creations with stories, our living rooms playing host to massive battles, plastic metropolises, and gloomy dungeons. The opportunities offered to us by a few plastic bricks were endless and bounded only by our imagination.

As technology has progressed, the way we tell stories has gotten brighter, more dynamic, and significantly more complex. I can cruise around an accurate recreation of the entire galaxy, hunt down Rebel scum, or drive through a photorealistic recreation of California, all from my computer screen. And starting this year, I’ll be able to do the same things in an immersive virtual world.

But even as our storytelling horizons have expanded, they’ve become constrained. In their quest to tell cinematic stories, modern games are heavily scripted and elaborately designed. You might be immersed in a story, but at the end of the day, someone else is telling it. Games are perfect little universes, but as a trade-off, they’re locked up and incredibly difficult to change.

These stories don’t come cheap. The cost of developing a modern AAA game is upwards of $50 million, with a development and art staff numbering in the hundreds. Other platforms, such as movies or books, are just as crowded and expensive.

In the face of all of this, the spirit of exploration is still present. Games like Minecraft and Terraria capture some of the creative spirit of physical play — “open worlds with simple pieces that obey simple rules” — and their unexpected popularity proves that people are still incredibly interested in such a premise. These games take it a step further, allowing interactions that would be difficult or impossible in the real world — like manipulating physics, building logic, and creating and destroying material at will.

But compared to physical objects, these games are inaccessible — trading the natural, tactile nature of plastic bricks and wooden blocks for a hallucination in a box, mediated by mouse and keyboard.

Physical, Meet Digital

It’s in this environment that we present Light Table — a digital table that can understand and interact with physical objects placed on it.

Early, early, prototypes

We threw together a webcam, projector, frosted acrylic, some wood, and an unholy amount of tape into a table that understands and reacts to objects placed onto it.

Our goal is to provide a platform for open storytelling and exploration that combines the best of the physical and digital worlds into a hybrid object. Embracing the spontaneity and tactility of the real, the table understands any old physical object — no tagging or special materials required — and gives life to it, harnessing the easy mutability and complexity of the digital.
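One simple way to recognize arbitrary, untagged objects — not necessarily our exact pipeline, which works on live camera frames — is background subtraction: capture an empty-table frame, diff each new frame against it, and group the changed pixels into blobs. Here’s a toy, dependency-free sketch of that idea (all names are illustrative):

```python
def detect_objects(background, frame, threshold=30):
    """Return bounding boxes (x0, y0, x1, y1) of regions where `frame`
    differs from `background` by more than `threshold`."""
    h, w = len(frame), len(frame[0])
    # 1. Diff against the empty-table frame and threshold into a mask.
    mask = [[abs(frame[y][x] - background[y][x]) > threshold
             for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    boxes = []
    # 2. Flood-fill each unvisited masked pixel into a blob,
    #    tracking the blob's bounding box as we go.
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                stack, x0, y0, x1, y1 = [(x, y)], x, y, x, y
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    x0, y0 = min(x0, cx), min(y0, cy)
                    x1, y1 = max(x1, cx), max(y1, cy)
                    for nx, ny in ((cx+1, cy), (cx-1, cy),
                                   (cx, cy+1), (cx, cy-1)):
                        if 0 <= nx < w and 0 <= ny < h \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                boxes.append((x0, y0, x1, y1))
    return boxes

# Simulate placing one "brick" on an 8x8 grayscale table.
background = [[0] * 8 for _ in range(8)]
frame = [row[:] for row in background]
for y in range(2, 5):
    for x in range(3, 6):
        frame[y][x] = 255
print(detect_objects(background, frame))  # → [(3, 2, 5, 4)]
```

A real build would do this with a CV library on denoised camera input, but the principle — diff, threshold, group — is the same.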

This is rather abstract, and the exciting part of a new interaction space isn’t its raw capabilities or constraints, but the new scenarios it unlocks — so we envisioned and built a couple of demo experiences.

The first was a simple HTML5 canvas debugger that drew outlines and labels around every object you threw on it — a neat tech demo, but not much more.

More complex and useful was our second demo — a functional simulation of Brown University.

We 3D-printed a couple of on-campus buildings and built an agent simulation of students, faculty, and staff running around a campus that players create by placing buildings onto the table.

(A very early build of) Brown University Tycoon — on GitHub here

Agents in the simulation need to satisfy needs — students, for example, have dorms and classes, and they need to eat, sleep, study, socialize, and exercise. Players can see how adjusting the placement and composition of campus buildings changes the campus’s effectiveness, while minimizing disruption to the surrounding city.
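The loop behind agents like these can be sketched in a few lines — a rough approximation of the idea, not the actual Brown University Tycoon code; all names here are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Building:
    name: str
    satisfies: str  # which need this building fulfills

@dataclass
class Agent:
    name: str
    # Each need sits between 0.0 (desperate) and 1.0 (satisfied).
    needs: dict = field(default_factory=lambda: {
        "hunger": 1.0, "sleep": 1.0, "study": 1.0})

    def tick(self, buildings):
        # Needs decay every tick; the agent then heads for a building
        # that satisfies its most pressing (lowest) need.
        for need in self.needs:
            self.needs[need] = max(0.0, self.needs[need] - 0.1)
        worst = min(self.needs, key=self.needs.get)
        target = next((b for b in buildings if b.satisfies == worst), None)
        if target:
            self.needs[worst] = 1.0  # visiting the building refills the need
        return target

# A tiny campus: somewhere to eat and somewhere to sleep, nowhere to study.
campus = [Building("Dining Hall", "hunger"), Building("Dorm", "sleep")]
student = Agent("student")
for _ in range(5):
    student.tick(campus)
```

Because this campus has no study space, the student’s “study” need just keeps decaying — exactly the kind of imbalance a player would notice and fix by dropping a library onto the table.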

Even with a fairly simplistic first build, it’s easy to see how simulations like Brown University Tycoon could be used to actually help people tell stories, brainstorm, and solve problems (not to mention it’s pretty fun to see EMS running around treating drunk students).

Next Steps

We built Brown University Tycoon in the 12 hours that we had, but that’s just the tip of the iceberg — we’ve brainstormed tons of ideas for applications that’d benefit from the table’s unique interaction model:

  • Data-driven sports (reviewing football plays, simulating baseball, etc)
  • Physics simulation — gravity, fluids, solar systems
  • Simulated ecosystems — place bodies of water, trees, bushes
  • Participatory Mapping
  • Collaborative whiteboarding
  • Flood simulation
  • An interactive “sandbox”
  • Tower defense
  • Chess
  • D&D
  • DJ or create music
  • Giant space battles
  • Complex flowcharts

The table’s interface seems ideal for applications enhanced by natural, tactile, and spontaneous interactions, especially in collaborative environments.

Several hardware enhancements are also possible. Most importantly, in our first prototype, the interaction is one-way — digital entities can be influenced by physical objects, but not the other way around. We’re envisioning ways for the table to manipulate physical objects in turn (electromagnets, anyone?).

Light Table started as a class project, but having experienced a taste of its capabilities, Nate and I are interested in pushing it further. Our next steps are to flesh out specific experiences to target, work with actual users in those fields, and further prototype the product itself.

Think Light Table might be useful for you? Drop us a line!


HH Design is a community around design in the context of technology.