Doing user research for the Digital Service Standard
Last week, we released the alpha of Ontario’s digital service standard, and we’re hoping to get your feedback and input on how to make it better.
As Honey mentioned, Ontario’s new digital service standard outlines a process to follow for designing good digital services. In designing the standard, we’re following the same process that we’re recommending.
As a user researcher with the Ontario Digital Service, and through my previous projects in the public service, I’ve worked on three major government standards: web standards, the online design program, and the integrated accessibility standards.
We’ve always consulted with the rest of government when developing standards. However, this is our most ambitious consultation to date. Before releasing the alpha, we met with key experts in the fields of accessibility, privacy and open government.
We’ve also consulted with all the teams that work on the Ontario.ca website. For each principle in the standard, we asked four important questions:
- Is anything here not feasible? Why?
- What are the barriers for you to following this principle? How can we remove them?
- How do you feel about this principle? Do you think it will improve the quality of websites and applications?
- Is there anything in this principle that is unclear?
Asking these and other questions helped us uncover many instructions in the standard that were unclear. We also received recommendations for changes we could make, either in the standard itself or in the supporting tools and guidance documents.
We’re planning to address as many of these unclear points as possible in our next update to the standard later this summer.
One interesting thing that we noticed in our user research: we didn’t get much feedback about the feasibility or barriers to each principle.
It’s hard for many people to guess what challenges they’ll face until they implement a process. Once we move into alpha testing the standard on real projects, we expect to get more information about what works and what doesn’t.
Now that the alpha is up, we’ve got more user research planned. Next, we’ll be interviewing potential users of the standard across the government who work in the fields of policy, programs, legal, digital and information technology. Using customized interview questions, we want to figure out how to best support these users in following the standard.
To design interview questions, we think about how each group would be affected by the standard and then create questions that can help identify barriers to meeting it, such as:
- What are your feelings about working directly with users of your policy/program/law?
- How do you ensure you have resources for accessibility testing?
- Have you ever built a prototype of a digital product?
During our communications and outreach, we asked for volunteers to participate in user research and the response has been great. The more people who hear about the standard, the more who want to help us make it better.
Our plan is to update the standard regularly: at one-month intervals initially, as we go through the user research sessions, and then on an as-needed basis. We’ll look to digital trends, other governments, changes within our organization and the state of our own digital services to decide what changes to make and how often.
This standard will be more flexible than any I’ve worked on before. It’s a huge step forward for the government, where approvals and processes are often updated on an annual or even longer cycle. While there are reasons for that pace of process in certain areas, the creation of good digital services requires a more agile, user-driven, and responsive approach. We’re working to achieve that, starting with this very standard.