I’ve learned that “best practices” usually means copying what everybody else does — Patty McCord, 8 lessons on building a company people enjoy working for, February 2019
I’m not convinced by the term ‘best practice’, however flattering the label is. So much so that I’ve stopped using it. Because how do we know it’s best if we don’t measure it? So when I was asked to share the design of the impact playbook as an example of best practice, it got me thinking about how to honour this request earnestly but also openly.
After some internal wrangling (perhaps I was being too pedantic in refusing to bask under the umbrella of ‘best practice’?) I decided to look at the parts of the playbook that we’re really proud of and that have worked well, as well as the parts that we still need to work on. The good practice, and the bits that need to be improved.
However, in preparation, I’ve found this to be a bit more challenging to write about than I had imagined. For one thing, it feels very personal, sharing reflections that up to now have been shared only within a relatively small group of colleagues and partners from our network. At the same time I’m preoccupied with developing its next phase (together with Sinzer we’re in the final stages of taking the work of the Europeana Impact Task Force 2018 and shaping it into Phases II & III).
I find I can’t focus only on the past without also addressing how that experience relates to the developments in Phases II & III that you’ll be reading about in the coming months.
So let me step back a little to set the scene, and then let’s see if I can take you on a journey touching upon particular aspects of our choices, decisions and the lessons we’ve learned along the way, giving you a taste of what is coming next.
Facilitators of digital transformation in the cultural heritage sector
At Europeana we know a thing or two about working at scale across the sector, about using models of cooperation, advocacy and consensus building to achieve that. In ten years we’ve developed partnerships with over 3,500 Cultural Heritage Institutions (CHI), worked with national and thematic aggregators (of cultural heritage data) and built a Network Association of 2,000 professionals.
Together with my colleagues, we publish 58 million digital cultural heritage objects from our data partners, each object shared using a standardised method.
Through our data model, licensing and publishing frameworks we have enabled interoperable data, where each object is labelled with standardised rights information, and through the publishing framework we advocate for higher standards of data to be developed and shared.
Behind these public-facing achievements is an innovative organisation, building and testing products and services that help us keep moving towards our mission: to transform the world with culture. We’re also an open organisation — take almost any area of our work and you’ll find the story of how it evolved, the stakeholder engagement method we used and documentation of the results.
Almost every single thing that we do can be traced to a collective decision making process or cooperation with our data and network partners.
Since 2013 we’ve been working with a wide range of our partners to explore what impact means to our sector — in our role as facilitators of the digital transformation. Working from the research of Professor Simon Tanner of King’s College London — and of course with Simon himself — we adopted a broad impact framework to help shape our discussions with our partners, and throughout the sector.
Fast forward four years to 2017 and we’re still talking… Despite our best efforts, the conversation had not gotten very far. We felt that we needed to shake things up. We had learned that impact was a really interesting topic, but it was too complex and distant for our partners to allocate resources to explore. And that was a problem we felt we could solve.
We did this by working together with a task force and the impact experts Sinzer to operationalise the framework, making it possible for our partners to follow our guide to talking about impact in their organisations.
In October 2017 we launched the impact playbook — a Cultural Heritage professional’s guide to exploring impact. We talked about it being like a cookbook — the kind that you can dip in and out of: following recipes where you need to, reading it to inspire your next dish, or simply starting a conversation about what you want for dinner tonight.
The playbook: part of a bigger toolkit
And whilst we focused our efforts on talking about the playbook, I want to point out that it’s actually part of a bigger concept: the impact toolkit. Everyone involved in the development of the playbook knew — we all agreed on this — that the playbook would only be successful if it was supported with tools and resources that went beyond what was possible with the playbook alone. When we launched it, we hadn’t had the chance to properly develop the other elements in the toolkit, so we decided to focus on the playbook. And up to now, that focus has remained.
So before I start, I want to clarify what I mean when I say toolkit.
The impact toolkit is a set of resources that support the implementation of impact assessment.
Centred around the playbook as a how-to guide, the toolkit is where you find resources — slide decks, canvases, explainer videos — that help you, as a user, implement the workshops, methods and processes described in the playbook.
It’s also where you find inspiration and references, in the form of case studies and articles written by fellow professionals in the sector, as well as by impact experts. And finally it’s where you find a community of like-minded professionals, all curious to explore what impact means to them, in their work and organisation.
Building a good model: Looking back at what went well with Playbook v1
I’m going to use a retrospective-style process — adapted from Scrum — to share my thoughts on what has worked well in the playbook and toolkit, what presents an opportunity to improve, and how these insights are reflected in the next phase of the playbook, due to be rolled out gradually over the summer.
Looking at the playbook as a whole felt too much like skimming the surface, so in each part of the retro I look at the design of the playbook, the playbook as a product, and lastly its implementation.
Design/ We didn’t reinvent the wheel
It’s possible that in the early days we all thought that through the guidelines — our very first understanding of what we would build — we would develop brand new ways of working, new tools and canvases. Strongly inspired by contemporary models like the Business Model Canvas, we took a new approach (for us) to the design of a set of guidelines; it needed to be engaging, accessible and relatable.
However, in the end the playbook formed around refined but established tools — the logic model, refined to become our change pathway — and some taken from completely different disciplines, such as the empathy map, borrowed from user experience and design practice. Together, the decisions to use familiar tools helped lower the learning curve for an admittedly challenging topic.
Product/ We were really confident in every single element of the playbook
Because we tested it. Over and over again. Nothing that went into the playbook was untested, and so we were really confident that the workshops and canvases worked well. This sounds like it should be obvious, but the temptation to add things that seemed like a good idea at the time but were untested was great.
We had a great team supporting us along the way, and through testing our workshops with them, explaining ourselves and our techniques, we became very aware of the need to present the playbook in a way that was thoughtful of a range of learning styles. So you might find you relate more to Sanne’s experience of running an impact design workshop, or to our explainer, or to the step-by-step walkthrough of how to run the design workshop.
Implementation/ It sparks conversations, expected and unexpected
Our experience with co-developing frameworks and guidelines tells us that it takes time for them to be adopted. So our early target of 250 downloads seemed uncomfortably ambitious (especially when we realised we didn’t actually track downloads in the first few weeks… oops). In practice it’s proved to be a really engaging tool to deploy — it’s a conversation starter in itself, and the workshops are popular and dynamic both to run and to participate in.
In the months that followed (and some 2,000 downloads later) we’ve followed some early adopters, heard about how it is used as a provocation, set as homework for students, and, my favourite anecdote of all, how it gets hacked (more on this another time).
When we designed the playbook we wanted you to dip in and out, and to use it in ways that suit you. So it’s great to hear that some of you follow it faithfully, and some of you hack it. Great stuff!
Building a good model: Where are the opportunities to improve in the next version of the playbook
This is my favourite part of a retro — the bit where you look at what you can improve upon next time around; where there is an opportunity to tweak or refine your work so it works that little bit better for everyone else. Or perhaps, so that it works a lot better…
Design/A sustainable way of working
During the design of the playbook, we worked through how we thought it would be used — and with the toolkit we thought about what resources were needed to really support its implementation. We’d shared slide decks and guidelines before at Europeana, but not for such a new topic, and not by using such a step-by-step approach. So we put in place a mechanism to gather feedback from users, but it only told us half the story.
What we learned through trial and error was that users needed more: channels to give feedback, ways to connect with more experienced practitioners, and more case studies. From our side, being the primary driver of this was not sustainable, so we have a real opportunity here to improve how we address these needs in a sustainable and effective way, working more closely with users and practitioners.
Product/ Make it more accessible
We took an early decision to deliver the playbook electronically, as a pdf book you could download from our website, conceding that a very limited run of printed playbooks would be a useful advocacy tool. As a pdf book the playbook is great to read — and I have to admit I’m very proud to have my hands on one of those rare printed copies — but as a web resource, there is much more we can do to make it accessible.
In its current form it’s not very easy to search through — you have to know it exists and have downloaded it first. And if you want to translate it — as our Polish partners did, to enable them to explore it through a series of workshops — then you first need an editable copy of the file. Discovering, downloading and translating: these are three barriers to accessing and using the content of the playbook which feel too high to sustain.
Implementation/ Break down the components
The playbook is a robust product on its own, but the tools and resources within it, and the elements of the toolkit, are stories in themselves. Through a series of articles we’ve shared how users have applied the playbook as a whole, but we also hear about how the individual components are used (and of course hacked!) and that users want to know more, and learn more about these examples.
So we have learned that the components — although explained in the playbook — have an identity in themselves. We can do more to shine a light on each component, to help users understand more about them, and inspire new ways of working with them.
Building a good model: How these insights will be reflected in v2 of the playbook
Having gotten this far into my Scrum-style retro, it already feels useful to have developed these insights. But it’s not quite enough to have them written down and shared; I want to go one step further and clearly identify what we’re doing with them in the work we deliver over the coming months.
It’s all very well identifying how you can improve through a retrospective, but being transparent about where you are making those improvements is fundamental to making the most of this process.
So, now we have these insights, I want to take a moment to share how these shape our approach for the next version of the playbook & toolkit.
In the interests of being realistic, and of not changing too much at once, I’m focusing for now on the most significant and realistic changes that can be made.
Design/ build a sustainable test-user community
Through the Impact Community, we’re going to explore how we can work with a broad group of users to build a more open user test group that is sustainable and less reliant on a small cluster of users, with the aim that we can test and refine different elements of the playbook and toolkit, making the feedback process more transparent and easier to participate in.
Product/ make it easier to access and work with the playbook
The next version of the playbook will be released in beta form to the impact community as a Google document with comment functionality enabled. Using this approach we want to see whether it helps lower the barriers we’ve identified, whilst at the same time retaining the effective design format of the original playbook. It will also be more integrated within the toolkit, which in the coming months will be redesigned from its current format.
Implementation/ provide dynamic resources through the toolkit
We have learned from the implementation process so far that users feel much more confident once they have participated in a workshop. So we will explore ways to share this experience efficiently through more dynamic and user-driven resources that can be reused, updated and built upon. And we’ll do our best to respond to the most popular request we get: a webinar or two, so you can hear in person from users and practitioners about their experiences.
As part of updating how the toolkit is delivered, we will continue our recent efforts to highlight the ongoing journey of fellow impact explorers — such as Merete Sanderhoff’s journey using the impact playbook at the Statens Museum for Kunst — and reorganise the toolkit in response to the feedback we get from you and, in time, the user test group.
That’s a lot, right?! So what next? From here, we’ll be working on assigning success criteria so we know whether these refinements are effective, and making them more transparent by publishing them online, along with our progress. But for now, I hope you have found this retrospective an interesting insight into our work developing the impact playbook and toolkit, and a little about what is coming next. It’s been really interesting to write up, and I’d love to hear your feedback!
(You can check in on our progress yourself on the impact toolkit page, where you can also download the playbook, browse the resources and join our growing impact community to receive a preview of the next phase of the playbook, as well as find out how you can contribute to the activities I have described in this article.)