I have always worn earphones. Then, about a year ago, I bought some Bose wireless noise-cancelling headphones, which completely changed how I travel on trains and planes.
Then AirPods came along, delivering the wireless experience I got from my Bose headphones but in a far more convenient form (I still use the Bose for travel or concentrating). Being able to wear just one AirPod at a time means I can half-listen while staying engaged with the non-connected world.
I calculated that I now wear an earphone or headphone for 20–30% of my day. When you think about that, it's far more time than I spend on a PC or mobile, beaten only by a pair of glasses or a watch (and clothes) that I wear for longer periods.
Earphones will be a game changer in how we use voice computing to aid our day.
It all started with guided tours
I have only done this once, in Rome, walking around with a £2 pair of headphones on, being guided through the history of an ancient and interesting city. Last week I was at the V&A museum in London and noticed a sign offering a downloadable guided tour of the European history section. Listening to the history of the art and physical objects did add a sense of engagement, though I was still head-down at my phone in parts because of the stop-start audio as I moved between pieces. This time, though, it wasn't on £2 headphones but on wireless AirPods.
Whenever you’re in a new place and need to navigate, out comes the phone and Google Maps: holding the screen up to the sky, rotating around to align the compass and work out where you are in relation to the six interconnecting junctions.
Apple brought a glimpse of hope with the Apple Watch and its walking directions, giving you haptic feedback as you move to indicate a left or right turn. But its precision was not perfect, and there were times you missed the haptic alerts, so you still ended up pulling the phone from your pocket and pointing it at the sky to navigate.
Imagine you are walking down the street and need to turn right: you hear “Turn right in 50 meters” or “take the 3rd exit”. You’re thinking we already have this in our cars, and that’s true; our mapping services even have audible walking directions, but they’re crap.
What we now need is innovation within audible navigation. Introducing EVMC, enhanced visual mapping cues (a made-up name). Imagine walking down the street and hearing:
“Hi Lee, in about 5 shops time you’re going to be turning right, you should see a Costa on your left, go past the Costa towards the bridge ahead. When you get to the traffic lights opposite Tesco, please cross the road. Great, there is an alleyway between Valentinos Italian restaurant and the Red Lion pub, walk down there, and your friends are near the garden entrance of the brewhouse pub.”
Of course, this relies on up-to-date road layouts and business names, and though we aren’t there yet, with the rise of Google Street View cameras and geo-positioned social images, maybe this information is not far from being available to us all.
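To make the idea concrete, here is a toy sketch of how landmark-anchored cues like the ones above could be generated from routing steps. The data format, step kinds, and landmark names are all invented for illustration; a real system would pull landmarks from live map and street-imagery data.

```python
# Toy sketch of the hypothetical "EVMC" idea: turn routing steps into
# spoken-style, landmark-anchored walking cues. Everything here
# (the step schema, the landmarks) is made up for illustration.

def cue(step):
    """Render one routing step as a landmark-anchored spoken cue."""
    templates = {
        "turn":   "In about {distance}, turn {direction}; you should see {landmark} on your {side}.",
        "cross":  "When you reach {landmark}, please cross the road.",
        "arrive": "Your destination is near {landmark}.",
    }
    # str.format ignores unused keys, so we can pass the whole step dict
    return templates[step["kind"]].format(**step)

route = [
    {"kind": "turn", "distance": "5 shops", "direction": "right",
     "landmark": "a Costa", "side": "left"},
    {"kind": "cross", "landmark": "the traffic lights opposite Tesco"},
    {"kind": "arrive", "landmark": "the garden entrance of the pub"},
]

for step in route:
    print(cue(step))
```

The point of the sketch is the template style: cues reference what you can see ("a Costa on your left") rather than abstract distances alone.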
Though earphones are primarily an audio device, their use case has already started to change. First we added the microphone, then batteries to make them wireless, and then the ability to cancel the noise around us.
But the earphone can offer us a lot more, and in a more convenient way: the inner ear can provide data about our bodies such as heart rate, temperature, respiration rate, PPG (photoplethysmography) readings, and even posture or tiredness. The earphone is going to evolve over the coming years, moving us from the inconsistent data of fitness bands to accurate real-time insight from our ears.
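As a rough illustration of how heart rate could fall out of an in-ear PPG signal, here is a minimal sketch that counts pulse peaks in a window. The signal is a clean synthetic waveform standing in for a real sensor stream; an actual in-ear PPG signal would need filtering for motion artefacts before anything this simple worked.

```python
import math

# Minimal sketch: estimate heart rate from a PPG-like pulse waveform by
# counting rising zero-crossings. The "signal" is a clean synthetic
# 1.2 Hz wave (72 beats per minute) standing in for real sensor data.

RATE_HZ = 50       # sample rate of the hypothetical in-ear sensor
DURATION_S = 10    # length of the measurement window in seconds

samples = [math.sin(2 * math.pi * 1.2 * i / RATE_HZ)
           for i in range(RATE_HZ * DURATION_S)]

# One rising zero-crossing per pulse cycle
beats = sum(1 for prev, cur in zip(samples, samples[1:])
            if prev <= 0 < cur)

bpm = beats / DURATION_S * 60
print(bpm)  # 72.0
```

The same peak-counting idea, with proper filtering, is how optical heart-rate sensing generally works, whether on a wrist or in an ear.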
When listening to spoken audio through an earphone rather than a speaker, people can comfortably follow speech at a faster pace, typically around 1.3 times normal speed, enabling much faster consumption of information.
Personally, after several months of incrementally increasing the speed, I have found my optimum is around 1.7 times; in some edge cases I can consume audio at 2.1 times, though that can be tiring to listen to, so it has to be for a worthwhile reason. This isn’t to say you should do this, but if the audio you listen to feels slow, or you want to consume a lot of information in less time, then earphone-based audio may be your answer.
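The time savings are simple arithmetic. Taking a 60-minute episode as an example (the episode length is just an assumption for illustration), the speeds mentioned above work out like this:

```python
# How much listening time different playback speeds take on a
# 60-minute episode: time = length / speed.

def minutes_at(speed, length_min=60):
    return length_min / speed

for speed in (1.0, 1.3, 1.7, 2.1):
    print(f"{speed}x -> {minutes_at(speed):.1f} min")
```

At 1.7x, an hour of audio fits into roughly 35 minutes; at 2.1x, under half an hour.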
Complementary to AR/VR/MR
When we move to an AR/MR world, we are going to finally stop looking down at our phones (hopefully) and start looking at the world again, with the additional overlay of information you see in sci-fi films and Silicon Valley start-up concept videos.
The way we will interact with this vast overlay of information is through speech, via either an earphone-based interface or bone conduction. Either way, voice and audio are going to play a huge role in the future of visual computing.
Communal voice interactions give you freedom of movement in your environment as well as the ability for multiple people to engage. The latest round of smart speakers has added voice identification so personal information can’t be accessed by someone who shouldn’t have it.
Though even with this voice identification, there is some information you don’t want blasted across a room. Move this to earphones and you gain an intimate environment where information remains yours, private from a communal audience.
Not only does this create a level of security beyond smart speakers; it even improves on mobile or web, as there is no screen for anyone to look over your shoulder at.
Will earphones become our next fitness tracker, navigation aid, or private space? What do you think?