I often forget that I have crappy eyesight. I forget because I wear pieces of plastic on my face that correct it, so I very rarely have to confront how bad my eyesight really is. Hearing aids, crutches, contact lenses: augmenting a sense with an extension of that sense feels natural to us. It feels human. This mode of interaction relies less on an interface and more on a stream of data that the user can attend to or ignore. It doesn't demand your attention and focus; it is a gentle input of yet more information about the world around you. The human brain processes this sort of input more like a sense, unconsciously integrating it with other senses to build a picture of the world, than like a discrete-action human-computer interface. It expands what neurobiologists call the "umwelt": the world that can be detected and perceived by an organism. This isn't a new idea; as far back as 1969, Paul Bach-y-Rita showed just how far neuroplasticity lets us expand our worlds. What you want is an ambient awareness that blends into the job of figuring out the world around you, a job you and your senses are already engaged in.

We've been thinking a lot about how augmented senses can improve, or fill in the gaps in, our sensory model of the world to make our lives safer and easier.
There are a few projects that give us super-human senses, pitched explicitly as being more-than-human. One of our favorites is Chris Woebken and Kenichi Okada's Animal Superpowers from 2008, a series of devices that give children the abilities of animals. Their Bat Vision Goggles let users see in the dark by hearing the pings of ultrasonic echolocation, much as a bat does.
Their Ant Apparatus lets you feel like an ant by magnifying your vision 50x through microscope antennae held in your hands.
These augmented senses are meant to give you explicitly other-than-human experiences. Another common use of augmented senses is in assistive technologies: Andrew Spitz, Ruben van der Vleuten, and Markus Schmeiduch created an iPhone peripheral called Blind Maps, which renders a tactile map that sight-impaired users can read with their fingertips.
At Teague we wanted to explore augmented senses in a situation that a lot of us here experience every day: biking in a city. We’ve been experimenting with radar for cyclist safety, working on a device that we (cheekily) call the Reardar.
The goal of Reardar is to provide cyclists with a sixth sense of cars. Cyclists — especially those in cities — are constantly assessing and re-assessing their surroundings, maintaining a dynamic, mental map of their environment. This assessment relies heavily on sight and sound, exacting a tax on these senses. Reardar provides an additional sense of nearby cars to the cyclist through haptic feedback.
This gives the cyclist an increased, or more accurate, awareness and confirmation of nearby vehicles. It also allows the cyclist to offload some of this awareness budget from vision. If the cyclist has an adequately low-fidelity sense of the car(s) behind her, she can dedicate more of her eyesight budget to what is immediately in front of her.
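To make the idea of a deliberately low-fidelity haptic sense concrete, here is a minimal sketch of one way an approach speed reported by the radar could be translated into a vibration level. The function name, thresholds, and scaling are hypothetical illustrations, not Reardar's actual implementation:

```python
# Hypothetical sketch: map a radar-reported approach speed to a haptic
# vibration duty cycle. Thresholds and scaling are illustrative only.

def haptic_level(approach_speed_mps, max_speed_mps=15.0):
    """Map approach speed (m/s) to a vibration duty cycle in [0.0, 1.0].

    Receding or stationary objects (speed <= 0) produce no vibration;
    anything at or above max_speed_mps saturates the motor.
    """
    if approach_speed_mps <= 0:
        return 0.0
    return min(approach_speed_mps / max_speed_mps, 1.0)
```

A coarse mapping like this keeps the signal ambient: the cyclist feels "something is closing fast" rather than a number that has to be read and interpreted.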
Since radar reflects off metal but passes right through fabric, the sensor can be embedded in multiple locations: a bike light, a helmet, or a small device that fits in a back pocket. We experimented with a few different versions of how Reardar's sensing capability might fit into a cyclist's arsenal of tools.
The radar we’ve been testing with provides only the speed of an oncoming metallic object. That might not seem like much, but our thesis is that having a mental model of what’s around you in 360 degrees makes a cyclist safer and more confident.
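For the common continuous-wave Doppler modules, that speed reading falls directly out of the Doppler shift of the reflected signal. A sketch of the math, assuming a 24.125 GHz carrier (a frequency used by many off-the-shelf modules; ours may differ):

```python
# Sketch: recover the relative speed of a reflecting object from the
# Doppler shift of a continuous-wave radar. The 24.125 GHz carrier is
# an assumption; substitute the module's actual transmit frequency.

C = 299_792_458.0  # speed of light, m/s

def doppler_speed(shift_hz, carrier_hz=24.125e9):
    """Relative speed (m/s) toward the radar for a given Doppler shift.

    The factor of 2 accounts for the round trip: the wave is shifted
    once on the way out and once again on reflection.
    """
    return shift_hz * C / (2.0 * carrier_hz)
```

At this carrier frequency a car closing at 10 m/s (about 36 km/h) produces a shift of roughly 1.6 kHz, comfortably in the audio band, which is part of why these modules are cheap and easy to read.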
The current prototype is concerned only with cars approaching from behind, so we’ve designed it to be particularly lightweight, both in form and in behavior. We’ve been working with off-the-shelf radar modules to get a sense of how the technology could work and what it means to feel the world around you as you cycle through a city. The core concept behind Reardar, ultimately, is giving the cyclist a comprehensive sense of cars in all directions and on all trajectories. Of course, that begins to rely on heavy infrastructure, but that infrastructure is coming.
In the meantime, we at TEAGUE are continuing to experiment with small, light radars and radio sensing, as well as with ways of giving cyclists an information feed they can use while they navigate the city safely. More generally, we’re exploring how augmented senses can make us safer and more aware, and give us a richer experience of the world right around us.