Voice UI Can Reshape Your Workplace to Be More Accessible
An area that often gets short shrift in the design and development of new software is accessibility. It’s set apart as niche work and seen as an added, arduous expense. But it shouldn’t be. By its very nature, accessible software is user-centred software, and user-centred software tends to mean great products overall.
As technology transforms our lives — both at work and at home — it’s become apparent that enormous gains can be made when we harness that technology for the betterment of all. Myplanet partnered with CNIB to do just that.
The project created a smarter interface that showcases how the latest developments in conversational user experiences and machine learning can be applied to existing applications to make them more accessible and, ultimately, more usable. In doing so, Myplanet and CNIB are expanding our understanding not only of how the latest conversational user interfaces work, but of how they can be reimagined to work even better.
“CNIB’s goal is to ensure that people who are blind and partially sighted have the support they need to lead full and active lives,” says Rob Hindley, VP Marketing and Social Enterprise for the CNIB. And with Myplanet’s mission of improving everyday experiences, working with CNIB on this project was a natural fit.
“We wanted to work with them on some of their office use cases,” says Jason Cottrell, Myplanet CEO. “To create something that would actually allow someone to complete a common, high-utility task using some of the modern voice assistants.”
Right now, people have to rely on the often clunky experience of opening a program on a desktop and then using a screen reader to access the information in a piece of enterprise software. By streamlining that path of access to the information they need and the ways they can engage with it, both Myplanet and CNIB were hoping to increase access and opportunities for users.
Our two organizations worked together to create a voice-controlled calendar application — an app that can read and interact with a calendar through just the power of voice control. No tabbing through endless fields, no reams of meaningless information being shared ad nauseam. Just a simple, direct-action interface to connect people to the information they need.
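To make the "simple, direct-action interface" idea concrete, here is a minimal sketch of how a voice assistant might map one spoken request directly to a calendar answer, skipping the field-by-field navigation a screen reader would require. This is purely illustrative and not the actual Myplanet/CNIB implementation; the function and event names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    # Hypothetical minimal calendar entry used for this sketch.
    title: str
    start: datetime

def next_meeting(events, now):
    """Return the earliest event after `now`, or None if there is none."""
    upcoming = [e for e in events if e.start > now]
    return min(upcoming, key=lambda e: e.start) if upcoming else None

def handle_utterance(utterance, events, now):
    """Map a recognized spoken phrase to a single spoken-style response.

    A real assistant would use proper intent classification; simple
    keyword matching stands in for it here.
    """
    if "next meeting" in utterance.lower():
        event = next_meeting(events, now)
        if event is None:
            return "You have no more meetings scheduled."
        return f"Your next meeting is {event.title} at {event.start:%I:%M %p}."
    return "Sorry, I didn't catch that."
```

The point of the design is that the user asks one question and receives one direct answer, rather than tabbing through every field of a calendar view:

```python
events = [
    Event("Standup", datetime(2024, 1, 1, 9, 30)),
    Event("Design review", datetime(2024, 1, 1, 14, 0)),
]
now = datetime(2024, 1, 1, 10, 0)
handle_utterance("What's my next meeting?", events, now)
# One sentence back: the design review, with its start time.
```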
For CNIB, this initiative was an opportunity to live their mission as well. Hindley notes that this project touches on all three of CNIB’s strategic imperatives: unleashing technology to empower people, increasing the opportunities available for people within employment and education spheres, and putting everyone on more level footing.
And as CNIB seeks to drive society forward in building and creating accessible products and services, this project serves as both a step on the road to increased understanding of how to achieve those goals and a potentially usable product that can begin to meet those aims right now.
Whenever we take on a project like this, we know the end result won’t simply be the product we set out to build. The research at the outset often shifts our ideas of what we’re going to create, and even when it doesn’t, the learnings we take on at every step bring us to a more comprehensive and holistic understanding not just of the problem we’re seeking to solve, but of the ways in which we approach all problems.
It’s one of the things our teams enjoy most about this kind of new, leading-edge technical work, and after more than a decade of innovation-based projects, even Jason’s enthusiasm doesn’t wane.
“What’s fun about going through this exercise is it forces us to make very intelligent choices about exactly what we want and need the user to do in a system. And in doing so, we often find we can go back and drastically simplify the existing experience on web and mobile as well,” he adds.
It’s clear that Rob and the rest of the folks at CNIB feel the same way.
“Projects like these are things that we need to be involved in and want to be engaged in to help push forward in breaking through barriers that may limit somebody’s access or ability and to move society forward,” says Rob.
By simplifying this one task with this specific technology application, we make the work of doing that for other emerging technologies easier as well. As Jason notes, “Basically, if we make it simple here, there’s all sorts of other ways we can manage that moving forward.”
Having achieved what we set out to do in creating a proof of concept for a voice-controlled calendar app, we’re now examining what’s possible for future projects with CNIB.
“Gesture-based technology, haptic feedback, even VR — we’re looking at all of those and how they can be applied and integrated into existing technologies so they can provide a high level of accessibility for people,” says Rob.
As we start work on the next iteration of this capability, check out the project in the video below: