Weeknotes: Test and learn…then improve!

David Scurr
Published in CAST Writers
4 min read · Aug 8, 2022


Thinking time

I’ve recently emerged from a pretty intense period co-delivering the first edition of Deloitte Digital Connect.

With delivery complete comes the fun stuff: reviewing feedback and impact, and sharing learnings with the team and partners. Allowing time for reflection (solo and in a group) ensures we don’t just dive in and repeat ourselves; bad habits can be hard to shake off unless you confront them head on. It’s an essential part of programme management, and one that can often be overlooked. How many times have you moved straight on to the next project before you’ve even had the chance to catch your breath?

At CAST we’ve been thinking about a number of things to improve our ways of working including:

  • How might we capture learnings more systematically while delivering a busy programme?
  • How might we create a collaborative space to review how things are going as a delivery team, and address challenges while delivering?

Collective learning is more than just another Miro board

Lots of learning occurs all the time when delivering programmes. A participant getting stuck in a Jamboard, the delivery team being stretched, realising the cadence for workshop training is wrong: these are all useful learnings which need some attention, preferably while still fresh, before they’re long forgotten. The temptation is to park all this until the end of the programme and let it resurface in a colourful Miro board. But you’ll probably only end up scratching the surface, run the risk of missing some really useful detail, and have less opportunity to improve next time as a result. So here’s what we tried for this programme…

  1. We set up a Learnings Board

Purpose: Create a ‘source of truth’ for all programme learnings. Any learnings shared by the team are logged there, and anyone can log a learning.

Format:

  • Simple Asana board — different columns for different programme areas (e.g. onboarding, peer support, programme management, comms)
  • Each card represents a different learning
  • Each card is ‘assigned’ to the team member who logged it, so you can see who raised it

2. We ran three mini-retros to review and discuss what came up on the Learnings Board

Purpose:

  • Review things collectively while they’re still fresh in our minds (not just at the end)
  • Have dedicated time to review what’s going well and not so well
  • Have an opportunity to address issues or pivot based on the mini-retros

Format:

  • We focus on what’s happened recently or key themes that have come up in stand-ups
  • We use the Learnings Board as our learning source of truth: to log learnings, to run the retros and to document discussions
  • We review and add to the Learnings Board before we meet. This allows those who can’t make it to share their reflections too, and leaves more time for discussion in the meeting
  • We tag each card with a label: ‘KEEP — liked this’, ‘PAUSE — needs reviewing’ or ‘START — try next time?’
  • We review the board together and upvote cards that we’d like to discuss in the retro.
  • We do rounds of discussions for the cards that got the most votes
A part of the Learnings Board showing four thematic columns, each containing different learnings with different tags (‘Keep — liked this’, ‘Start — try next time?’, ‘Pause — needs reviewing’). Some of the cards have been upvoted.
Part of the Learnings Board used for Deloitte Digital Connect

3. We played back a summary of the learnings at programme end

Purpose:

  • Be action-focused: highlight the aspects of the programme that need improving and start allocating follow-up actions
  • Determine the process for ensuring recommendations can be acted on by team members

Format: short presentation using Google Slides shared in a team meeting

Three things that worked well and not so well

A final part of this exercise is highlighting what’s worked and how we can improve the process.

3 things that worked well:

  • Actions and learnings mentioned ‘in passing’ are not lost or forgotten — there’s a source of truth for these
  • More time to discuss and review successes and challenges in detail
  • Easier to then feed in key learnings that need actioning either through future stand-ups or as part of the final playback (leads to some actions/solutions)

3 things that didn’t work so well:

  • It was a challenge to know how to group or cluster themes for each mini-retro (e.g. sequentially or thematically)
  • Running the retros in Asana was less visual and interactive than Miro, and the upvoting functionality is more limited; Miro is better suited to this exercise
  • Finding time to follow up on the various actions that came out of the retros

We’re currently refining this approach on the Sport England Innovation and Digital Accelerator, building on what’s worked well for Deloitte Digital Connect and improving those bits that didn’t quite work so well. We’ll share another update soon!



Passionate about tech for good & community building / Programme Lead at CAST / Founder, Tech for Good Brighton / Founding Member, Tech for Good UK / @david_scurr