Shifting from compliance to creativity

Building Digital Service Reviews around learning through peer review

Published in Good Trouble, Sep 21, 2022

This post was written by Sarah Ingle, Jane Lu, and Ashley Evans.

In our last post, we talked about shifting from planning to learning in pursuit of more agile governance.

Today, we want to share how we are designing digital service reviews to help teams meet digital standards and their users’ needs.

Multiple drop lights in front of a dark grey wall
Photo credit: https://unsplash.com/photos/NDLLFxTELrU

Compliance exercises are not the only tool for anticipating and responding to risks, being accountable to feedback, or encouraging responsible design practices.

Yet, many service design teams’ experiences show that compliance and project gating are often the main tools government organizations use to perform governance, strategy, or oversight functions.

Framing these functions solely as compliance exercises reduces them to a risk to be resolved or a box to be checked. We are often rushed into trying to resolve the risks or problems in order to alleviate the weight and discomfort of the process, instead of being given the time, space, or resources to understand their root causes.

It’s the difference between interrogation and curiosity.

As public servants, we serve people first and foremost. People are complex, messy, chaotic, and wonderful. Our lives take unexpected twists and turns, we change (a lot), and we need help along the way. Services that are people-centred embrace and respond to our changing needs.

We want to build services like this, so we need to explore alternatives — processes, practices, and relationships — outside of compliance-based governance.

Alternative ways of building services that are trustworthy, responsibly-designed, and responsive to people’s needs; ways that also support learning and continuous improvement as people and their needs grow and change.

What does the policy actually say?

This approach is generally supported, in both spirit and requirements, by Treasury Board Secretariat's (TBS) Policy on Service and Digital and Canada's Digital Ambition.

The Policy’s Section 4.2 on client-centric service design and delivery requires regular review of digital standards, as well as related targets and performance information for all services and service delivery channels in use. This includes reviewing the service with clients, partners, and stakeholders, as well as the departmental Chief Information Officer, at least once every 5 years to continuously identify opportunities for improvement.

The Digital Ambition also recommends the systematic review of services in line with the Government of Canada’s 10 Digital Standards to identify gaps and improve people’s experience.

The Digital Standards Playbook also outlines the purpose of the standards:

“Our goal is to provide public services to Canadians which are simple to use and trustworthy. The Standards form the foundation of the government’s shift to becoming more agile, open, and user-focused. They will guide teams in designing digital services in a way that best serves Canadians.”

While our digital standards and policies lay out a vision for being flexible and people-centred, the challenge for our team and many others is operationalizing these ambitions.

As a new-ish team, we had no established way for our service teams to ensure quality when building a digital product.

To start building out governance processes based on the digital standards, we looked at how it’s been done by other government organizations.

Learning from other organizations

Many other jurisdictions evaluate their services against a similar set of digital standards and have developed digital assessment processes.

This usually involves inviting a panel of expert practitioners to evaluate a product at each phase of the service design lifecycle to ensure it’s meeting the standards.

Visual representations of the Discovery, Alpha, Beta, and Live service design phases: you explore the problem space in Discovery, test options with hypotheses in Alpha, build and refine options in Beta, and continuously improve in Live. Created by: Sarah Ingle

Some examples include the Ontario Digital Service's (ODS) Digital First Assessments and GOV.UK's Service Standard assessments.

The UK's Government Digital Service (GDS) introduced its Service Standard in 2014, and Ontario introduced its Digital Service Standard in 2017.

We had informal chats with our peers at GDS and the ODS to learn more about their experiences designing and participating in assessments.

Through these conversations, we recognized a few themes:

  • Systemic barriers restricting service teams’ ability to meet the standards
  • Lack of investment in internal capacity, especially for user research and testing
  • Conflict between people, politics, or organizations leading to non-constructive criticism or escalation

We felt, and our peers suggested, that building these processes and relationships around assessments was part of the problem. It reinforced the framing of governance solely around compliance.

Shifting the frame

Framing these digital standards evaluations as assessments reinforces some of the structural, behavioural, and cultural patterns we wanted to change: top-down, unilateral decision-making, reductive criticism, and so on.

In practice, within ESDC, the GC, and elsewhere, ensuring digital standards are met through assessments has already led to some outcomes we wanted to avoid:

  • Teams feeling interrogated or inundated with feedback
  • Adding more compliance artifacts to teams’ workloads
  • Adding another layer of governance or more gates to pass
  • Providing one-off or end-of-phase feedback and support rather than resources and relationships for continuous improvement

It was clear that if we wanted to shift the culture and its outcomes, we needed to shift the design and framing of our processes and relationships.

In her 1989 CBC Massey Lectures, The Real World of Technology, Canadian public technologist and peace activist Ursula Franklin spoke at length about the relationship between process and culture.

Many of Franklin's reflections and predictions still hold true, particularly those related to prescriptive technologies, which organize work as a sequence of steps requiring supervision by bosses or managers.

Franklin argues that “when working within such designs, a workforce becomes acculturated into a milieu in which external control and internal compliance are seen as normal and necessary.” This leads to social environments where “eventually there is only one way of doing something.”¹

In other words, we create a culture that values compliance over creativity; one where we choose conformity to process or meeting technical requirements over meeting people’s needs.

Our governance processes at ESDC

We realized that in order to change our culture, we needed to reframe our processes away from compliance-oriented assessments and toward peer review processes that encourage collaboration, creativity, learning, and responsible design.

We started by changing our values and outlining what we did and didn’t want the process to be:

What it isn’t

  • Criticizing or poking holes in each other’s work (e.g. Yeah, but you can’t do that because X)
  • A one-size-fits-all, fixed compliance exercise
  • Unilateral, where assessors give directions and teams listen and implement

What it is

  • Constructive feedback that helps grow each other’s work (e.g. Yeah, and have you thought about X? How could we address that together?)
  • An ongoing process designed with product teams and adapted to meet their varying needs or contexts as they evolve
  • A reciprocal relationship where teams work with guides to identify and make changes

Then we worked on shifting our language to match. We changed:

  • Digital Standards Assessment to Digital Service Review
  • Assessors to Guides
  • Assessment process to Peer review
  • Team being assessed to Product/service team

We also started thinking more critically about the roles, responsibilities, and experience of different participants in a Digital Service Review.

Instead of reinforcing structures for one-way assessment, where experts assess work and give feedback while teams listen, we would build structures for reciprocal exchange through peer review and conversations.

Practitioners would help guide service teams through reviewing their work and providing feedback rooted in the digital standards.

Service teams would receive this feedback, and also be given space to connect with guides, ask questions, and request help with specific problems or gaps they are experiencing.

Although focused around the end-of-phase reviews to start, these guide-team relationships could also grow into a community that provides ongoing feedback in support of continuous improvement throughout the service design lifecycle.

Testing a new approach: our first Alpha review

We hosted our first Digital Service Review with one of our digital service teams, the Secure Client Hub, who were nearing the end of Alpha. They built a prototype to make it easier for people to see what employment or pension benefits they are receiving.

We invited guides from the Canadian Digital Service and the Canada School of Public Service who have experience working with and applying the digital standards.

You can read the Secure Client Hub team’s Alpha Review Report on our GitHub, where we’ll share our process work and the results of future Digital Service Reviews.

What we did

We plan to invite guides to review the service at the end of each service design phase and decide whether the team is ready to proceed with the next phase of development.

We used Miro to host the first Digital Service Review and invited the Secure Client Hub team to add relevant artifacts and prototypes to the board. Guides explored these materials so they could understand the context around the product being built and start to evaluate whether it meets the digital standards.

The team also used the board for self-reflection on the ways the product may or may not be meeting each of the standards. We included background material on DECD and the digital standards in case anyone wanted a refresher.

We drafted a Service Standards Reflection (a rubric) with criteria against which the service is measured to demonstrate whether it meets each digital standard. We took inspiration from similar rubrics created by other government organizations.

The live session on June 21 included a Q&A section for guides and the team to work through questions that came up before and during the review.

The guides then met to decide on the outcome based on the team’s self-reflection and the live demo. In a regular review, the guides would recommend an overall outcome of:

  • Pass: Ready for the next phase. May include conditions or gaps to fill before the next review.
  • Pivot: Re-enter an earlier phase.
  • Pause: Consider stopping development of the service.

If the team disagreed or a decision couldn’t be made, we would do another review.

Since this was the first Review and the main goal was to learn, ask questions, and test out the process, the team would proceed to Beta Build regardless of the outcome.

Guides were impressed that the team tested with users with diverse needs and backgrounds, built partnerships across the organization, and had automated testing in place to catch issues with the system.

Some key recommendations that the Guides suggested for the team were to explore user needs related to privacy and security, how to measure the success of the service, and ways to shorten feedback loops.

Learning from our first Review

We heard generally positive feedback about the structure of the Review, along with lots of ideas for how to improve it.

What did the team like?

  • How the Digital Service Review was structured
  • Asking questions to the guides and getting their feedback
  • Building a network with experts
  • Friendly and inclusive tone

What did the guides like?

  • Artifacts and recorded demos provided by the team ahead of time
  • Digital standards rubrics and scales
  • Well-organized Miro board
  • Opportunity to contribute to the digital government community

What did we learn?

  • Guides want a better understanding of the product story and the discovery work so that they can more meaningfully support the team
  • The team wants their service reviewed more frequently — not at the end of a service design phase — so that they can iteratively improve their service against the digital standards
  • Demos need more structure so that guides have enough information to accurately evaluate the product against each standard

Taking our own medicine on iteration

We've made improvements to the Alpha Review and will start sharing our templates on our GitHub when they're ready. Up next, we are focusing on redesigning and integrating go-live protocols with Digital Service Reviews. The next product we are testing the process with will go through a Beta Build Review.

After our last blog post, on the need for agile governance, there was some conversation on Twitter about the difference between gating and continuous review.

We recognize that meeting the digital standards should be a continuous process. As much as possible, we want teams to be able to reuse the artifacts and materials they've already prepared for showcases or reporting when they go through Digital Service Reviews. We are testing out collaborative check-ins with partners at CDS, where teams can request support at any point in the service design lifecycle.

The chapters model at DECD also gives us an opportunity to run code or design reviews with our peers internally. We expect this process to evolve, and we're working with product teams to co-design an approach that adds value and alleviates their burden.

If you’d like to chat about this, ask questions, or give feedback, we’d love to hear from you. Please get in touch with Sarah Ingle, Jane Lu, or Daphnée Nostrome.

Thanks to our peers at GDS and the ODS for sharing their experiences with service standards and for providing early feedback on our framing and on this article.

¹ Ursula Franklin, The Real World of Technology (1989), CBC Massey Lectures. Page 23.
