Hierarchies of knowledge: why we don’t put what we learn into practice

Dyfrig Williams
Doing better things
7 min read · Sep 12, 2017

I dread to think of the amount that organisations spend on learning and development. But how much of that learning actually gets transferred into practice?

I’ve been exposed to a lot of academic literature in my new job that’s made me look at my experiences anew. My background in citizen engagement means that I’ve long been an advocate of involving people in the planning and delivery of services. I’ve witnessed firsthand what happens when citizens are seen as passive recipients of public services: they end up as voiceless consumers of one-size-fits-all services.

However, I’d never considered how this hierarchical model applies to public services themselves when it comes to learning and development. But the same pattern holds: training is delivered by experts in their field, and attendees are expected to soak up that knowledge, go forth and make it happen!

So when it comes to putting complex information into practice, such as research into public service provision, that type of model won’t really work. Dez Holmes introduced me to The Fallacy of the Pipeline, which looks at why so little knowledge from research ends up being implemented on the frontline.

Hierarchy

Both my posts since starting to work for Research in Practice and Research in Practice for Adults have touched on hierarchy. Whilst these were about formal networks within organisations, The Fallacy of the Pipeline clearly shows the effect of hierarchies of perceived expertise between organisations.

“The blame for gaps between science and practice falls variously on the stubbornness of the practitioners insisting on doing it their way, their hubris in believing they know their patients best and the smugness of scientists believing that if they publish it, practitioners will use it. None of these characterizations of the culprits is entirely fair, but they and yet others share in the blame.”

This attitude also echoes how public services perceive people and communities. I’ve heard so many organisations talk about how ‘communities need to realise that there isn’t enough money to go around so they’re going to have to do more for themselves’. The problem is that often organisations want to have their cake and eat it. They want to retain power and control whilst communities deliver services for them. This seems at its clearest when co-production is misappropriated — communities are given the organisation’s problems to solve.

So how do we move beyond hierarchical structures? There are lots of useful examples of organisations that have given up hierarchical power in this post by Paul Taylor; Zappos’ use of holacracy is particularly fascinating. Surely public services should work as part of networks within communities, rather than ominously looking down upon them. This Harvard Business Review post takes a good look at what a networked organisation looks like, and I couldn’t help but apply some of this thinking to how we might work with communities:

“I am talking about a whole new system that is much bigger, more powerful and involves far more people. Over the past few years, I’ve started to see many high-performing organizations use a network-like structure outside their traditional hierarchy to lead change, and with great results.”

Esko Reinikainen introduced me to Amy C. Edmondson’s concept of Teaming at the Housing Festival (which I blogged about for the Wales Audit Office’s Good Practice Exchange). Teaming is about identifying people to work flexibly with in order to manage complicated interdependencies. One of the things that I really like is that Edmondson puts empathy at the heart of her methodology, whereas hierarchy distances people from the consequences of their decisions.

Failure

Teaming isn’t the only useful model that Edmondson has devised. I came across the Spectrum of Blameworthy Failure in my previous job, and it’s particularly useful in the context of this paper.

“The attrition of some 17% of original research that never gets submitted, usually because the investigator assumed negative results were unpublishable, is particularly disturbing from the standpoint of what practitioners might consider most helpful in their attempts to adapt guidelines for patient or community interventions to their practice circumstances. Negative results of interventions are of interest because they often tell the practitioner about the intervention’s misfit with patients or conditions other than those in which the original research leading to guidelines was conducted.”

By failing to share failure, we doom ourselves to repeat the same mistakes. In my first few days in my new job, I was told that it was important to share mistakes and to avoid green shifting. I’m fortunate enough to be working in an organisation that is open to the learning that comes from failure, but if you’re unlucky enough to work in one that casts blame for mistakes that can’t be avoided, you’re unlikely to be able to take the risks required to innovate.

Qualitative data

The next step of the pipeline model strikes a chord with me because of how difficult public services make it for the public to communicate with them. We constrict the flow of information into organisations in just the same way.

“The studies that would fail to survive this leg of the journey will increase because randomized methods are more likely to face ethical and logistical challenges.”

This post by Dan Slee neatly encapsulates a lot of the issues: organisations ignore useful learning because it isn’t in the format that they want. A council refusing to accept comments on a consultation via social media is a textbook case of public services working to their own requirements rather than to what works for communities.

The ‘empty vessel’ fallacy of pushing information to the practitioner

“The recipient is full of prior knowledge, attitudes, beliefs, values and, above all, contextual constraints at any given point in practice time. Each of these influences the practitioner’s receptivity to new guidelines, their perception of the guidelines’ utility and their eventual use of them.”

We’re not blank slates. The learning that we take on board relies heavily on our beliefs and experiences. Confirmation Bias means that we’re drawn to ideas that confirm our existing beliefs, and because of Projection Bias we tend to falsely project current preferences onto future events.

It’s nearly impossible to remember every bias, but fortunately Helen Reynolds and Ben Proctor introduced me to the Cognitive Bias Cheat Sheet when I took part in the Natteron Podcast. This is a really useful document that can help you to double-check your assumptions and the motivations behind your actions.

Participatory research, practice-based research networks and continuous quality improvement

“The most promising lines of remedy have been in bringing the research (or even better, producing the research) closer to the actual circumstances of practice… The promise inherent in these is that the research results are made more relevant, more actionable, more tailored, more particular to their patients or populations and to their circumstances of practice and with more immediate feedback to the practitioners themselves. The promise of this ‘pull’ approach has led to the suggestion that if we want more evidence-based practice, we need more practice-based evidence.”

This really shows the value of being a part of, or at the very least close to, the change. When I shared Professor Roz Searle’s work on trust at Bara Brith Camp, the part that really hit home was that larger organisations are likely to have lower levels of trust. This is because increased levels of hierarchy dilute the positive actions of those at the top of the organisation. And to me this is why we keep failing to implement positive change within communities: we’re too remote, and the public often has so little trust in us. Just to prove that this isn’t pure conjecture, the 2017 Edelman Trust Barometer reveals the largest-ever drop in trust across government and non-governmental organisations.

Incentives and penalties to create practitioner pull

“What practitioners in clinical, community and policy making roles crave, it appears, is more evidence from practices or populations like their own, more evidence based in real time, real jurisdictions, typical patients, without all the screening and control and with staff like their own. The ideal setting in which to conduct such studies would be their own, which takes us back to the participatory research strategy.”

I’m really excited about Research in Practice’s role here, as we provide a link between academic research and public services. We can work as a conduit to help apply research in service provision.

I also love the idea of providing evidence in real time. The paper notes that “an alarming and frequently quoted statement about the total attrition in the funnel and the lapse between research and practice is that ‘It takes 17 years to turn 14 per cent of original research to the benefit of patient care.’” This seems absurd to me in the digital world that we live in. Most of this is of course practice-related, but there are also problems around data sharing. Research is so often published behind paywalls and controlled by funders. But are we doing as much as we could? At the Wales Audit Office we released our first dataset under an Open Government Licence and used Power BI to display data in real time. There are some good materials online that could be applied to research, including resources from Creative Commons on Open Science, and a great piece by the Open Access Scholarly Publishers Association on the benefits of Creative Commons.
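As a rough illustration of what ‘open and in real time’ can mean in practice (this is my own sketch rather than the Wales Audit Office’s actual setup, and the dataset URL is hypothetical): once data is released under an open licence in a machine-readable format, anyone can pull the latest version and build their own view of it, rather than waiting for a published report.

```python
# Sketch: polling an openly licensed dataset and printing a quick summary.
# The URL is hypothetical; any CSV released under an open licence would do.
import time

import pandas as pd

DATASET_URL = "https://example.org/open-data/service-spend.csv"  # hypothetical


def summarise(url: str) -> None:
    """Fetch the latest published data and print a simple overview."""
    df = pd.read_csv(url)
    print(f"Fetched {len(df)} rows across {len(df.columns)} columns")
    print(df.describe(include="all"))


if __name__ == "__main__":
    # Re-fetching on a schedule is one crude way to keep a 'real time' view,
    # instead of waiting years for findings to trickle down the pipeline.
    while True:
        summarise(DATASET_URL)
        time.sleep(3600)  # refresh hourly
```

The point isn’t the tooling; it’s that openly licensed, machine-readable data lets practitioners pull what they need, when they need it.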

Getting more attention to external validity in the peer review and editorial policies of journals

“One might reasonably conclude that science will always have a gap to bridge to reach practice as long as it is generated in academic circumstances that put such a high premium on scientific control for internal validity that it squeezes out the needed attention to external validity.”

This links back to my previous post on complexity, in that public services aren’t delivered in lab conditions. Instead of pretending that we can apply research in a blanket fashion in idealised and controlled environments, we need to embrace the complexity of the environments that we’re working in and think critically about how that knowledge can be applied. To me, that can only happen if public service staff and communities are given the power and the opportunity to analyse their environment. This means that we all have to let go of the reins. Are we brave enough to do that?
