Our relationship to technology hasn’t changed over the last 50 years
As the keynote speaker for day three of the Better World by Design ’15 conference, Alexis Lloyd asked us to consider a very important question: what is our deep underlying assumption about our relationship with technology? She opened the keynote with two videos. The first was a “kitchen of the future” film from the 1950s. The audience chuckled at the naïve vision of the future from that period, when it was assumed that anything that could be automated should be automated, simply because automation was believed to be good.
The chuckles grew into laughter when Alexis played the second video, in which a representative from Whirlpool introduces the “fridge of the future.”
The resemblance between the two was uncanny. The two people in the videos were 50 years apart but echoed the same theme, that “technology is autonomous systems that we can offload tasks to.” Even though the state of technology has evolved far beyond what anyone in the 1950s could imagine, this underlying assumption hasn’t really changed in the last 50 years.
The kitchen is just one of many examples of people using technology to offload human tasks indiscriminately. Today, there are things like Lovely, a connected sex toy that tracks everything from calories burned, to number of thrusts, to the intensity of intercourse. The Indiegogo page says:
Based on your sexual activity history, movement data and personal preferences, our easy to use Lovely App provides helpful and fun recommendations to help you have even better sex next time.
And don’t forget Crystal, an add-on to your email service that shows you the “unique personality profile” of the person you’re talking to by scanning their online content, so you can craft your message in that person’s “natural communication style.”
Sex and emotion are two of the things that humans are infinitely better at than machines. When you offload these tasks to machines, what room is left for humans to exercise their volition? While play gives you pleasure and work doesn’t, both require your active input. A century ago, when we had to break our backs working in order to survive, we dreamed of the day when we could lie around all day and not have to do anything. Well, that day has long passed in the developed world, and we are still working harder than ever, because work keeps us from getting bored. But because our mindset is still stuck in the past, we keep designing machines that take work away from us even as we look for opportunities to be actively engaged. What we really need are machines that empower us to discover new ways to interact with the world and interpret new kinds of meaning from our environment.
How the NYT R&D Lab is changing our relationship to technology
The NYT R&D Lab attempts to blend emerging technologies with emerging behaviors in hopes of discovering new ways for people to collaborate with technology to create meaningful experiences.
Alexis proposes that in developing new technology, we should follow these principles:
- Have respect for users — people can do many things better than machines can; give people agency where appropriate.
- Listen in the right ways.
- Assist people in asking better questions to interpret new meaning from the situation.
Following these principles, the NYT R&D Lab developed a visualization tool that enables journalists at the NYT to have a “conversation” with tens of thousands of readers through the meaningful interpretation of big data, aided by machines.
The Lab then went a step further and created an open-source software package called Streamtools, a graphical toolkit that enables people to make sense of and make use of streams of real-time data, like earthquake feeds from around the world, the sensors on trains in the New York subway, and even text messages from mobile phones.
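Streamtools itself is a graphical, block-based tool, so the sketch below is not its actual API. It is only meant to illustrate the underlying idea of filtering and aggregating a live feed; the event shape, field names, and magnitude threshold are all assumptions invented for the example:

```python
# Illustrative sketch of stream filtering and aggregation in the spirit of
# Streamtools. The event format and threshold are invented for this example.
from collections import Counter

def significant_quakes(events, min_magnitude=4.5):
    """Yield only events at or above a magnitude threshold."""
    for event in events:
        if event["magnitude"] >= min_magnitude:
            yield event

def count_by_region(events):
    """Aggregate a stream of events into per-region counts."""
    counts = Counter()
    for event in events:
        counts[event["region"]] += 1
    return counts

# A tiny stand-in for a real-time earthquake feed.
feed = [
    {"region": "Japan", "magnitude": 5.1},
    {"region": "Chile", "magnitude": 3.2},
    {"region": "Japan", "magnitude": 6.0},
]

print(count_by_region(significant_quakes(feed)))  # Counter({'Japan': 2})
```

In a live setting the list would be replaced by a generator reading from a feed, and the same pipeline of small composable blocks would keep working unchanged, which is the design idea Streamtools makes visual.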
Another project at the NYT was a bot that automatically looks through Reddit. Every time someone links to an NYT article and enough activity builds up in the thread, it posts the link to a dedicated Slack channel and sends an email to the original reporter. These technologies didn’t replace the journalist. They “created affordances for journalists to give feedback.”
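The bot’s core decision can be sketched in a few lines. Everything here is a hypothetical reconstruction: the domain list, the activity threshold, and the callback names are assumptions, not the Lab’s actual code, which would have used Reddit’s API and the newsroom’s Slack and email systems:

```python
# Hypothetical sketch of the Reddit-watching bot's decision logic.
# Domains, threshold, and callbacks are assumptions for illustration.

NYT_DOMAINS = ("nytimes.com", "nyti.ms")
ACTIVITY_THRESHOLD = 25  # assumed minimum comment count before notifying

def links_to_nyt(url):
    """Check whether a thread's link points at an NYT article."""
    return any(domain in url for domain in NYT_DOMAINS)

def should_notify(thread):
    """Flag threads that link to an NYT article and have enough activity."""
    return links_to_nyt(thread["url"]) and thread["num_comments"] >= ACTIVITY_THRESHOLD

def handle(thread, notify_slack, email_reporter):
    # In the real system these callbacks would post to a dedicated Slack
    # channel and email the original reporter.
    if should_notify(thread):
        notify_slack(thread["url"])
        email_reporter(thread["url"])

quiet = {"url": "https://nytimes.com/some-story", "num_comments": 3}
busy = {"url": "https://nyti.ms/abc", "num_comments": 120}
print(should_notify(quiet), should_notify(busy))  # False True
```

Keeping the decision separate from the side effects (Slack, email) is what makes this an affordance rather than automation: the bot only surfaces the conversation, and the journalist decides what to do with it.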
One of the recent projects the Lab worked on was the Listening Table, a table with digital technology seamlessly built in that records the conversations happening around it and uploads the audio recordings, along with text from voice recognition, to a web interface where they can be organized and managed.
Rather than hiding the technological components inside the table, the Listening Table intentionally makes the technology visible to show users what affordances are available. When people want to turn it off, they can simply flip a switch on the side.
The table was designed to mimic a human participant in the conversation — it was designed to “forget” after a period of time.
Contrast the Listening Table with LiveLight, a piece of video analysis software that recognizes and deletes “repetitive, erroneous content” from videos so they don’t take up as much space. Heck, footage of babies is some of the most repetitive and seemingly erroneous content there is, but that doesn’t mean it isn’t meaningful to the mother.
Over the last few decades, we have measured the capabilities of machines by how closely they can replace a human. We aspired to create machines that are smarter, faster, and more capable of doing a human’s job.
To create systems that delight the people interacting with them, we need to shift our fundamental assumption about our relationship with technology from offloading to collaborating. We need to move from creating smart systems that prescribe to systems that suggest and give humans the agency to take part in shaping their experience.