Talk presented at the annual UX Conference South Africa on 11 November 2017
Internet of Things (IoT), also called smart technology, has taken the tech world by storm over the last couple of years. Internet-connected devices are being used in innovative ways to improve quality of life for all people. IoT potentially offers a revolutionary, fully accessible and “smart” world by improving interaction between objects, their environment and people. While the main focus of IoT is to make the world smarter, this can only be achieved if the right information is accessible at the right time, through technologies such as Artificial Intelligence (AI), Augmented Reality (AR) and Assistive Technology (AT). The core principles of these futuristic technologies are buried in the history of UX throughout the 19th and 20th centuries, and by going back to the basics, we can create a “smarter” new world.
So, where did UX actually start?
The origins of UX can be traced back centuries, but it was only in the late 19th century that UX started to mold into a distinct discipline. In the early 1900s the American engineer Frederick Winslow Taylor pioneered the Second Industrial Revolution, also called the Technological Revolution, with the concept of Taylorism. This movement was all about efficiency: how workers interacted with their physical tools to get the job done in the most timely manner. The focus of Taylorism was on the tools and actions needed to get the job done, not on the person.
Fast forward a couple of decades to the 1940s, when the first Human Centered Production System was developed by Toyota. This socio-technical system focused on the interaction between humans and technology. One such example is the Andon Cord (“Andon” is a Japanese term that translates as “signal”), which was installed in Toyota’s manufacturing factories. Any employee had the authority to pull the Andon Cord to stop the assembly line immediately and address any issue that arose. Toyota’s philosophy with this new system was to include people as part of the improvement process, thus bringing back the human factor that was lost with Taylorism.
Not many people know of the classic texts written by Henry Dreyfuss in the 1950s. Both Designing for People and The Measure of Man and Woman: Human Factors in Design were profound in that they explored how to design for people of different sizes and abilities, so that anyone could effortlessly use a service or product.
This approach from Dreyfuss highlights the importance of accessibility. Note that he didn’t exclude differently abled people but instead included all people.
During the 1970s the Graphical User Interface (GUI) was developed at Xerox’s Palo Alto Research Center (PARC), which also refined the computer mouse invented by Douglas Engelbart in the 1960s. This kickstarted the development of Human-Computer Interaction (HCI), which was established as a discipline in the 1980s. HCI initially focused on computer science and human factors engineering, but has grown exponentially over the past few decades into a broad discipline covering a variety of specialties, such as ergonomics, sociology, the cognitive processes of human behavior, accessibility, and human interface design, to name just a few.
These milestones are just a high-level overview of how UX evolved into the rich discipline it is today. However, three core principles manifest throughout every phase in the history of UX: efficiency, accessibility and effortlessness.
The “new” era of UX
Within the field of UX, new buzzwords pop up weekly. Anticipation Design is one of them. It might sound like a new buzzword, but it’s not a new concept at all; it goes back to the 1990s, and it’s only the way it has been implemented that has evolved over the past decades. Today the implementation is more sophisticated, with the data and insights of smart technology. Remember Clippy from Microsoft Office? Its main purpose was to assist the user and improve their experience by “predicting” what they wanted to do. Today we have complex chatbots that can hold fairly helpful conversations to help the user reach their goal.
Anticipation Design can only be effective when there’s a perfect balance between Smart Technology (IoT), Machine Learning (ML) and Predictive UX design. Smart technology is used to interact with the user and collect data, which is then interpreted by machine learning algorithms. Predictive UX design makes the user’s experience as seamless as possible.
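The three layers above can be sketched in a few lines of code. This is a minimal, illustrative sketch only, with all class and function names hypothetical: a frequency count stands in for a real machine learning model, and the “predictive UX” layer simply phrases the top prediction as an optional suggestion.

```python
from collections import Counter

class NextActionPredictor:
    """Learns which action a user most often takes after a given action."""

    def __init__(self):
        self.transitions = {}  # action -> Counter of follow-up actions

    def record(self, previous_action, next_action):
        # "Smart technology" layer: each device interaction is logged here.
        self.transitions.setdefault(previous_action, Counter())[next_action] += 1

    def predict(self, current_action):
        # "Machine learning" layer: a frequency model standing in for a real one.
        follow_ups = self.transitions.get(current_action)
        if not follow_ups:
            return None
        return follow_ups.most_common(1)[0][0]

def suggest(predictor, current_action):
    # "Predictive UX" layer: only surface a suggestion when one exists,
    # phrased as an offer the user is free to ignore.
    prediction = predictor.predict(current_action)
    if prediction is None:
        return "No suggestion yet - still learning."
    return f"Would you like to '{prediction}' next?"

predictor = NextActionPredictor()
predictor.record("open_app", "check_weather")
predictor.record("open_app", "check_weather")
predictor.record("open_app", "read_news")
print(suggest(predictor, "open_app"))  # offers the most frequent follow-up
```

The design choice worth noting is that the suggestion layer degrades gracefully: when the model has no data, the user simply gets no prompt rather than a wrong one.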
Smart technology aims to make the world smarter and more accessible. It sounds pretty simple, right? All you really need is a “tech device” and some useful predictions from machine learning algorithms. Not entirely true; it’s a little more complicated than that. Although this article’s focus is not on how to achieve this smarter new world, the following points are worth highlighting:
1. Extended Intelligence, not AI, is more acceptable
Joi Ito, the director of the MIT Media Lab, said in an interview with Joël van Bodegraven that we should focus on Extended Intelligence rather than generalised Artificial Intelligence. As humans, it is in our nature to use technology as an extension of ourselves; using robotics to replace our human actions and functions would be inhuman.
2. Context is everything. Think before you predict
The main drive in e-commerce is to offer the user the same or a similar product to the one they’ve been looking at (or have bought). Think about it…
One example: I bought a pair of heels for a year-end function (I’m more of a sneaker girl). After my purchase I received newsletters in my inbox with promotions on heels, and while browsing the internet I was prompted with advertisements for a similar pair of shoes in a different colour. Another example: I bought a gift online for my mother’s birthday, and now I’m prompted with products related to that gift across all digital channels.
We need to predict smarter, more intuitively and in context. It’s also common for algorithms to misinterpret a user’s actions, especially if the model is still learning from the user’s interactions. It is thus critical to ensure the user doesn’t make unnecessary mistakes which could negatively skew these proposed behavioural patterns.
3. Be transparent, don’t pretend AI is human
This seems to be the focus: to make AI as human as possible. You get extra points if an actual human cannot tell the difference. We’re still far from reaching that point, so in the meantime, don’t pretend AI is human.
Beyond the ethical concerns around this approach, which I’m not going to get into in this article, pretending AI is human can leave the user with higher expectations of the interaction. If the interaction then does not unfold as expected, the user will become unnecessarily frustrated instead of enjoying the experience.
When the interaction with AI reaches a point where the user gets stuck, never make the user feel responsible for the issue. For example, if something goes wrong, apologise to the user and simplify the communication by saying: “I’m sorry, I’m having some difficulty. Would you prefer option A, B or C in this context?” Instead of asking open questions, ask closed questions to simplify the interaction.
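The fallback pattern above can be sketched as follows. This is an illustrative sketch only, with hypothetical names and invented intents: when the bot cannot match the user’s free-text input, it apologises, takes the blame, and narrows the conversation to a closed question with concrete choices.

```python
def handle_message(message, known_intents, options):
    """Return the bot's reply for a free-text message."""
    normalised = message.strip().lower()
    for intent, reply in known_intents.items():
        if intent in normalised:
            return reply
    # Fallback: apologise and offer closed choices instead of an open question.
    listed = ", ".join(f"'{o}'" for o in options[:-1]) + f" or '{options[-1]}'"
    return f"I'm sorry, I'm having some difficulty. Would you prefer {listed}?"

# Invented example intents and options for demonstration.
known_intents = {
    "balance": "Your balance is available under Accounts.",
    "statement": "I can email your latest statement.",
}
options = ["check a balance", "get a statement", "talk to a person"]

print(handle_message("what's my balance?", known_intents, options))
print(handle_message("asdf qwerty", known_intents, options))
```

Note that the fallback never says “I didn’t understand you”; the bot owns the difficulty and hands the user an easy, bounded choice.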
Smart Technology & Accessibility
When we think of Artificial Intelligence (AI), mental images of I, Robot and Ex Machina come to mind: creepy robots that could turn against humankind at any moment. Assistive Technology (AT), on the other hand, is associated with clunky 1980s-looking personal computers. These mental images are distorted and do not represent reality.
We’re moving into an age where these two “polar” technologies, combined, not only change our mindset that Assistive Technology is only for differently abled persons, but also offer groundbreaking changes in the way we all live, irrespective of our abilities.
There are some very innovative tech projects that focus on combining Smart Technology & Assistive Technology to improve the lives of all people.
SuitX’s robotic exoskeleton
SuitX, a company in California, has developed a robotic exoskeleton called Phoenix that’s powered by a battery pack lasting 8 hours. Phoenix can help with the rehabilitation of patients with spinal injuries and strokes, and it can also be used to reduce the physical impact of daily tasks for elderly people.
Accessible smart cities
A “smart city” is an urban development that integrates Information and Communication Technology (ICT) and the Internet of Things (IoT) to make a city more accessible and efficient. One such initiative is to expand a city’s infrastructure and install fibre connections to make facilities such as distance learning and tele-medicine more accessible, and ultimately improve quality of life.
In Singapore, older patients are assigned their own personal tele-nurse, who checks in with them on a regular basis through conference calls. Each of these patients also has a personal health tracker that reports back on their health. Another interesting example is the urban project called the Array of Things (AoT), a network of interactive, modular sensor boxes installed on lamp poles across a specific area in Chicago. These sensors collect real-time data on the city’s environment, infrastructure and activity by monitoring air quality, sound and temperature, and the data can be used for research purposes, such as studying the relationship between diseases and the urban environment.
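To make the sensor-network idea concrete, here is an illustrative sketch (all node names and readings invented, not actual AoT data) of aggregating the kind of per-node environmental readings such a network might report into per-node averages.

```python
from statistics import mean

# Invented sample readings from two hypothetical lamp-pole nodes.
readings = [
    {"node": "lamp-01", "temperature_c": 21.4, "noise_db": 62},
    {"node": "lamp-01", "temperature_c": 22.0, "noise_db": 65},
    {"node": "lamp-02", "temperature_c": 19.8, "noise_db": 58},
]

def average_by_node(readings, field):
    """Average one sensor field per node."""
    grouped = {}
    for reading in readings:
        grouped.setdefault(reading["node"], []).append(reading[field])
    return {node: round(mean(values), 1) for node, values in grouped.items()}

print(average_by_node(readings, "temperature_c"))
# {'lamp-01': 21.7, 'lamp-02': 19.8}
```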
Biohacking
This is a unique, and for some controversial, approach to combining Smart Technology and Accessibility. Biohack grinders apply the hacker ethic to improve their own bodies with cybernetic devices. We’re no strangers to using smart technology to monitor our steps, heart rate and daily calories burned with fitness trackers. It’s become quite the norm, but biohack grinders take it one step further by surgically modifying their bodies with smart technology to make the world around them more accessible. Biohack grinders are basically real-life cyborgs.
The Human Antenna
Artist Neil Harbisson was born with achromatopsia (ACHM), which means he’s totally colour blind. He started the Human Antenna experiment to extend his perception of colour beyond his grayscale-only sight, using software capable of translating colours into vibrations. He had an antenna surgically implanted into his occipital bone. The colours he experiences vibrate through bone conduction: the conduction of sound to the inner ear through the bones of the skull.
RFID and NFC implants
These implants are becoming more popular by the day and are used for basic tracking and accessibility. A chip is inserted in the back of the hand, directly between the thumb and index finger, or along the wrist. It serves as embedded identification for the host and can be used to replace keys and passwords: the person can start a car, access a building or log into a personal smart device using this chip.
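The keys-and-passwords idea above boils down to an access-control lookup. The sketch below is illustrative only (all chip IDs and resource names invented): a reader scans the implanted chip’s unique ID and checks it against a registry of permissions. Real RFID/NFC systems add challenge-response authentication on top; a bare ID check like this would be trivially cloneable.

```python
# Hypothetical registry mapping chip IDs to the resources they may unlock.
AUTHORISED_CHIPS = {
    "04:A3:2B:19": {"car", "front_door"},
    "04:7F:0C:55": {"front_door", "laptop"},
}

def scan(chip_uid, resource):
    """Return True if the scanned chip may access the given resource."""
    permissions = AUTHORISED_CHIPS.get(chip_uid, set())
    return resource in permissions

print(scan("04:A3:2B:19", "car"))         # chip registered for the car
print(scan("04:7F:0C:55", "car"))         # no car permission
print(scan("00:00:00:00", "front_door"))  # unknown chip, denied
```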
Back to basics to build a “smarter” new world
It’s easy to get carried away with all the new ways technology is changing how we interact with the world around us. By focusing too much on the hype, it’s easy to neglect the core principles of UX: efficiency, accessibility and effortlessness. For most teams involved in building a product, efficiency and effortlessness are a given, but accessibility takes the backseat for many reasons. Our mindset that accessibility is only for differently abled persons needs to change.
Accessibility is about making the right information accessible at the right time to all people, regardless of their abilities or limitations.
Hopefully this article helped shift your mindset: accessibility is not only for differently abled persons and should in fact be non-negotiable when building products. If you found this article useful, please share it to help educate and raise awareness about accessibility and inclusive design!