Business as Unusual: How to train when expertise becomes outdated
How can we develop training programs under conditions of rapid change? In this essay I use the example of COVID-19, and lessons we have learned from the pandemic, to re-think expertise, re-think how we identify experts, and re-think training.
Expertise is becoming outdated more and more quickly, and this is a growing problem. We see the loss of expertise — the rapid pace of progress that makes a great deal of prior knowledge irrelevant — in many different domains: Artificial Intelligence (AI), Information Technology, and biotechnology. Other, slower-moving domains are also vulnerable. Child Welfare agencies are continually scrambling to keep up with changes in regulations. Law Enforcement departments are continually scrambling to keep up with new laws and restrictions. Firefighters have to keep up with new materials and construction methods. Technicians maintaining new equipment and software cannot possibly have the hours of practice needed to achieve expertise — and many sources of failure are not predictable.
The typical training approach cannot keep up with these rapid changes.
The conventional training approach is to:
o Identify the training requirements in advance
o Clarify the standards for performance
o Specify the procedures that need to be trained
Each of these steps falls out of synch in the face of rapid change.
o Identify the training requirements in advance. But in the future, training requirements may change so rapidly that efforts in this regard will rapidly be overtaken by events. Hoffman et al. (2010) have referred to this problem as the Fundamental Disconnect: By the time you carry out these steps, the task and context and requirements have already shifted.
o Clarify the standards for performance. But in the future, standards may similarly evolve too rapidly.
o Specify the procedures that need to be trained. But for cognitively complex skills, procedural approaches will always be inadequate, regardless of how quickly the domain changes.
Actually, this traditional method was never the best way to train expertise. It was a way to bring people to basic competence, but experts have to go far beyond that.
One approach to the problem of outdated expertise is to engage in continuous learning and discovery and Just-in-Time training. However, while “continuous learning and discovery” sounds gripping and romantic and revolutionary, it isn’t realistic.
We will still need some sort of validation. In high-risk industries such as aviation, healthcare, petrochemical operations, and law enforcement, we simply cannot have a training free-for-all. These industries have a low tolerance for errors and therefore need some scrutiny of training recommendations to filter out those that are impractical or unacceptably risky. It could be argued that Wikipedia does very well relying on community self-policing, but Wikipedia's content consists of general-knowledge topics, whereas industries like aviation, nuclear power, and law enforcement require stronger curation.
At the same time, the stronger curation cannot strangle the rate of adaptation. The verification and validation process cannot continue on its current sclerotic path. For example, V&V (Verification & Validation) for Artificial Intelligence systems in the government can take years, and by the time the V&V evaluation is complete, the AI system being evaluated is already outdated. This is another example of the Fundamental Disconnect mentioned earlier. So, who does the validating? The keepers of the current orthodoxy? The people currently in charge usually got to be that way because they were very good at the current way of doing things. They may not be skillful at anticipating future perturbations.
We will still need coordination. We can't have everyone learning different lessons. Teams have to coordinate. Each team member has to predict the actions of the others. Common lessons and routines are needed for coordination. These aren't part of expertise, but experts need to master these routines in order to coordinate and to judge when to depart from the standard procedures.
The idea of continuous learning is not new. The concept of expertise includes within it continuous learning and discovery as opposed to achievement of a fixed state. Many professions arrange for professional development courses. What is different here is the speed of change and the rapidity with which expertise becomes outdated — the rapidity with which professional development courses become outdated.
That is the challenge that this essay addresses. Assuming that the pace of change in many domains continues or accelerates, and that current practices are unable to keep up, what can be done?
COVID-19 as an example. This essay is not about COVID-19, but I use the pandemic as a convenient and vivid case study to explore what can happen during a period of rapid change. COVID-19 was a largely unexpected event that rendered best practices obsolete and set off a scramble for the healthcare community to re-tool. When COVID-19 hit the U.S. in February/March 2020, Evidence-Based Medicine was irrelevant. Best practices weren't so much useless as irrelevant — many of them were never even tried.
Expertise also seemed to become irrelevant. Who were the experts? Within the community of expertise researchers (e.g., Ericsson, 1996), the gold standard for assessing experts involves gathering data, which can take years. With COVID-19, the healthcare community had weeks, not years.
As so-called experts stumbled and reversed themselves, many people became skeptical of any advice issued by the traditional authorities and relied on rumors and on popular social media sites. For example, at one point around April 2020 I was told by a well-meaning friend that I could test myself for COVID-19 infection by seeing if I could hold my breath for at least 10 seconds. It made perfect sense — after all, COVID-19 attacked the lungs of many victims — and I performed this self-test for several days until I discovered that the test had no scientific basis. There was no evidence for its validity.
The pandemic offers us an example of what can happen when expertise becomes outdated. Of course, a deeper analysis suggests that even with COVID-19, expertise never became obsolete. We still wanted skilled medical personnel helping us. We needed experienced clinicians to make these rapid changes — to make sense of the reports and the new data. We happily relied on the expertise of the microbiologists who were able to decode the structure of the virus and to create vaccines with astonishing speed.
Nevertheless, COVID-19 did render a lot of clinical expertise obsolete. Healthcare professionals were unsure of the symptoms, the causes of those symptoms, and the treatments to propose.
But there were also stable features that did not change much, if at all, during the pandemic. Experts could still use the diagnostic equipment and read the CT scans. They could establish infection control methods using PPE, positive air pressure, and other resources, drawing on prior experience with Ebola as an analog. Experts were able to alleviate symptoms using ventilators — and ventilators require skill and expertise to manage.
So, what can we learn from COVID-19? Obsolescence was not total; it was only partial. Only a portion of the work required change. Therefore, wholesale revision is not necessary — it would ignore the aspects of the work that continue.
The next sections use the COVID-19 event to suggest ways to re-think expertise, re-think how we identify experts, and re-think training.
Re-thinking expertise: The Skill Portfolio account of experts
I suggest that we abandon our current unitary concept of experts. Too often, when we think of experts we think of an undifferentiated quality — someone is an expert or isn’t. Or we postulate stages of moving from novice to expert. That’s not good enough.
In contrast to a unitary concept of experts, a Skill Portfolio account identifies separate skills that experts possess. Experts blend these skills as needed, but the skills themselves are fairly independent. Not all experts will have each of these skills or need them. The skills describe what experts can do and what they know, as opposed to the outcomes of applying the skills.
Kahneman & Klein (2009) discussed the concept of fractionated expertise, drawing on previous work by James Shanteau: "For example, auditors who have expertise in 'hard' data such as accounts receivable may do much less well with 'soft' data such as indications of fraud" (p. 522). In this essay, I am expanding on this idea of fractionated expertise.
We can distinguish five general types of skills that experts may have: Perceptual-motor skills, Conceptual skills, Management skills, Communication skills, and Adaptation skills. Note that these are not components of expertise. Some skills may be relevant in one domain but not another. And they are reasonably independent. That is why this is a differentiated account of expertise as opposed to a unitary account.
First, Perceptual-motor skills. Some aspects of perceptual-motor skill constitute tacit knowledge: pattern recognition, perceptual discrimination, motor skills, and the use of tools. Think of the way dentists use mirrors to repair cavities. A dentist can drill skillfully because the perceptual-motor skills of mirror handling have been highly automated.
Second, Conceptual skills. These include our mental models. For example, the dentist has a conceptual model of how the various materials in a tooth (enamel, dentine, prior filling materials, nerve roots) behave during the drilling process, along with a mental model of how teeth must be configured to be successfully filled with the epoxy or other materials. Our mental models enable us to see the big picture in a situation, to diagnose the causes of problems, and to anticipate future states. Experts also have mastered the Standard Operating Procedures and Best Practices, but their mental models are rich enough to indicate when the SOPs and Best Practices need to be modified or abandoned. Borders, Klein & Besuijen (2019) have described a Mental Model Matrix that goes beyond a representation of how things work and includes limitations and flaws, as well as workarounds.
Third, Management skills. Continuing the dental example, the dentist knows how to manage the assistant and the patient during the process, including knowing how the assistant's training will shape his/her behavior. Each patient will behave differently, as will different assistants, but the dentist has the management expertise to make the situation graceful regardless of emergent issues like patient anxiety, training gaps in the assistant, or a sudden need for the assistant to work successfully with more PPE. With COVID-19, prominent examples were skills at managing time and effort and at coordinating the team. The management skills of experts included a conceptual understanding of how their organizations worked — this was very important for clinicians in hospitals during the worst phases of the pandemic, when each hospital needed to quickly establish new procedures and these procedures had to be consistent with the hospital's existing structure. Experts also needed information management skills. There was very little information about the disease, but many different channels offered new ideas. The medical community rapidly converged on a few trusted sources, such as EM:RAP (Emergency Medicine Reviews and Perspectives), for getting the latest news, yet healthcare providers were still getting important information about symptoms of COVID-19 from the media.
Fourth, Communication skills. Dentists and their assistants have developed carefully choreographed routines for managing standard procedures and can adapt these routines when they encounter new situations. Communication skills are really tested when the routines break down, such as when encountering a novel configuration during a root canal. The team members have to explain things to each other and direct each other efficiently and unambiguously. With COVID-19, staff members could no longer depend on their standard communication procedures and risked Common Ground breakdowns if the communications were unclear or ambiguous.
Fifth, Adaptation skills. Ward et al. (2018) have asserted that the essence of expertise is the ability to adapt. This concept of Adaptive Expertise, as first described by Hatano & Inagaki (1984, 1986), is that experts are faster to adapt to changing conditions (such as COVID-19) than non-experts. Ward et al. provide evidence that adaptation can be improved by training and offer recommendations for such training. Gorman et al. (2010) have described a perturbation training method for improving performance at the team level. The concept of Adaptive Expertise shows why expertise does not become obsolete — just the opposite, it becomes the basis for improvising and making discoveries.
These five general skills are identified according to several criteria. First, they are acquired through experience and feedback, as opposed to being natural talents. Second, they are relevant to the tasks people perform, and therefore the set of skills will vary by task and domain. Third, superior performance on these skills should differentiate experts from journeymen. Our focus should be on the most important sub-skills for a given domain; otherwise, it is too easy to end up with an ever-expanding set of skills to contend with. And in some domains, one or more of these general skills may not apply at all. As Kahneman and Klein, and Shanteau, note, people may be experts in some aspects of a task but not others.
There will undoubtedly be disagreements about which skills to include in a Skill Portfolio account, what level to use in describing a skill, and how distinct some of the skills are from others. We are not dealing with a Table of Elements in chemistry. I hope that these debates will at least be informative.
Further, we can distinguish the skills that are reasonably enduring from those that need to change quickly, as in the next pandemic. Let's illustrate this distinction in the context of COVID-19.
1. Expert skills that may become OBE (Overtaken by Events): for COVID-19, these included the established best practices, treatment protocols, and symptom knowledge that the pandemic rendered obsolete.
2. Expert skills that are more enduring
a. General mental models — for example, of diseases, of injuries, of physiology, of the way research is conducted and data are collected.
b. Corporate memory — what’s been tried in the past and how it worked out
c. Perceptual skills. How to read a CT scan or EKG, how to assess a patient’s breathing. Does a patient look sick? Assessing a patient’s cognitive abilities.
d. Psycho-motor skills. E.g., a surgeon being able to rapidly tie tiny sutures. Or a dentist’s use of a mirror to fill a cavity.
e. Organizational dynamics. The workings of a hospital, a ward, nearby specialty clinics. How to make things happen.
f. Communication skills. Reading other people. Explaining things to others and listening to what others, including patients, have to say.
g. Adaptation skills. Seeing the potential implications of new findings about diagnosis and treatment.
What is important about the Skill Portfolio account of expertise is not the specific skills and sub-skills, but the concept of expertise as an assemblage of different skills that are reasonably independent. Some experts will have more or less of each of these skills. Experts who have superior perceptual-motor skills may lack (or not even need) communication skills or management skills.
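To make the Skill Portfolio account concrete, here is a minimal sketch of one way the portfolio idea could be represented: an expert as a collection of reasonably independent skills, each tagged as enduring or perishable (OBE). The class names, skill names, and proficiency scale are illustrative assumptions for this sketch, not part of any established model.

```python
from dataclasses import dataclass, field
from enum import Enum

# The five general skill categories of the Skill Portfolio account.
class SkillType(Enum):
    PERCEPTUAL_MOTOR = "perceptual-motor"
    CONCEPTUAL = "conceptual"
    MANAGEMENT = "management"
    COMMUNICATION = "communication"
    ADAPTATION = "adaptation"

@dataclass
class Skill:
    name: str            # e.g., "reading CT scans" (illustrative)
    category: SkillType
    proficiency: float   # 0.0 (none) to 1.0 (expert); an assumed scale
    enduring: bool       # False means the skill may be Overtaken by Events

@dataclass
class ExpertPortfolio:
    """An expert modeled as an assemblage of reasonably independent skills."""
    skills: list[Skill] = field(default_factory=list)

    def retraining_targets(self) -> list[Skill]:
        # After a disruption, only the perishable (OBE) portion of the
        # portfolio needs rapid revision; the enduring skills carry over.
        return [s for s in self.skills if not s.enduring]

# Illustrative portfolio for a clinician at the start of the pandemic.
clinician = ExpertPortfolio(skills=[
    Skill("reading CT scans", SkillType.PERCEPTUAL_MOTOR, 0.9, enduring=True),
    Skill("pre-COVID treatment protocols", SkillType.CONCEPTUAL, 0.9, enduring=False),
    Skill("ward coordination", SkillType.MANAGEMENT, 0.8, enduring=True),
])
print([s.name for s in clinician.retraining_targets()])
# -> ['pre-COVID treatment protocols']
```

The point of the sketch is the structural claim: obsolescence attaches to individual skills, not to the expert as a whole.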
The Skill Portfolio account illustrates how shallow it is to claim that new developments will make expertise obsolete. Such claims rely on the unitary model of experts instead of a differentiated model.
What is becoming obsolete is the description of expertise that is grounded in the study of domains that are relatively stable. We need new methods of training and knowledge sharing, and we need them to be institutionalized.
Re-thinking training. Just-in-Time (JIT) Training.
Here is what didn’t happen with COVID. No one compiled a COVID curriculum. No one established training programs. No one tried to differentiate training modules for different situations and contexts. There were no train-the-trainer courses. There still aren’t, as far as I can tell. It would be a massive job. And I don’t think it is necessary.
Here is what did happen: Lessons were swapped in a semi-informal way, just in time.
The use of ventilators was tricky. So people, on their own initiative, made YouTube videos on how to use ventilators and disseminated them freely online.
New symptoms of COVID kept being uncovered and were described on web sites.
The web sites were semi-curated. They were by invitation only and were for the healthcare community. I think this started with a few web sites but quickly converged to one dominant medical web site that clinicians trusted: EM:RAP (covid.emrap.org). Posts were not anonymous, so reputations were on the line.
Some clinicians monitored this web site and others all the time (e.g., each night when they got home from work). Others didn’t and relied on colleagues to alert them to significant developments. It wasn’t perfect, but it was fast.
Hospitals could have gotten ahead of the curve by appointing stewards to keep track of new developments and send the relevant ones to the staff members. Some hospitals did this well; others didn't even try — they were neither adaptive nor resilient in the face of the rapid learning curve, and they failed to anticipate how they should have organized this effort. Maybe hospitals will do better with the next pandemic. I think that is unlikely.
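The steward role described above amounts to a simple filter-and-forward pattern. Here is a minimal sketch of that pattern; the incoming items, relevance test, and function names are all illustrative assumptions, not a description of any hospital's actual system.

```python
from typing import Callable

def steward(updates: list[dict],
            relevant: Callable[[dict], bool],
            notify: Callable[[dict], None]) -> None:
    """Forward only the relevant new developments to the staff."""
    for update in updates:
        if relevant(update):
            notify(update)

# Hypothetical items a steward might collect in a day of monitoring.
updates = [
    {"source": "EM:RAP", "topic": "ventilator settings", "unit": "ICU"},
    {"source": "media", "topic": "regional case counts", "unit": None},
]

# In this example, only ICU-relevant items get pushed to ICU staff.
steward(updates,
        relevant=lambda u: u["unit"] == "ICU",
        notify=lambda u: print(f"Alert ICU staff: {u['topic']} ({u['source']})"))
```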
What seems to have worked for managing COVID is a collective intelligence structure (see https://www.linkedin.com/pulse/tackling-covid-19-augmented-collective-intelligence-gianni-giacomelli/). Training the individual actors in a collective intelligence may be more effectively accomplished using the Just-in-Time model, along with the Skill Portfolio approach, than using traditional training models.
The bottom line is that no training courses were stood up, but the community rallied. Individuals, on their own initiative, prepared and distributed educational materials. So training was happening on a very large scale, but without oversight or management. The training responded to needs and took advantage of recent discoveries (such as the observation about the loss of smell and taste). This Just-in-Time approach to training is very different from traditional training development; the traditionalists might not even call it training. And it had inefficiencies. But Just-in-Time training seems far more efficient than standing up a national training center to provide what was needed.
Just-in-Time training illustrates, at a social level, the type of dexterity that Bernstein (1947?/1996) has described. (See also Klein & Pierce, 2001, on Adaptive Teams, based on Bernstein’s ideas.)
Just-in-Time training is not a new idea; it has been around for a while. It stands in contrast to deliberately establishing training objectives and programs. The COVID-19 experience demonstrates the speed with which Just-in-Time training can be mustered, and it suggests opportunities for improving Just-in-Time training for the future. Just-in-Time training complements Adaptive Expertise, operating at an organizational level rather than an individual level.
Back to the issue of handling training requirements when expertise becomes outdated.
The challenge of developing skills while the pace of change keeps accelerating suggests some re-conceptualizations: We can re-think our very understanding of experts, shifting to the Skill Portfolio account instead of a unitary account. And we can re-think our concept of training, looking for ways to improve Just-in-Time training capabilities.
The Skill Portfolio account of expertise, and the greater use of Just-in-Time training, suggest some changes in practice.
Urgency. COVID-speed requires very rapid adaptation. When you are dealing with COVID-speed, you need to quickly ramp up resilience tactics to muster the organizational resources.
Organizational resources for rapid learning. For COVID, the organizational resources were variable, with most hospitals doing a mediocre job because they weren’t prepared for a rapid learning cycle.
Need for anticipatory thinking, so organizations can rapidly adapt. What does the organization seem to be facing? If it is an epidemic like COVID-19 or AIDS, the organization needs to quickly depart from normal operations.
Administrator support. You need the top decision makers in the organization to call out the shift from normal to emergency conditions. It is no longer business as usual. Instead, you are in a new regime: Business as unusual. The administrators should announce to the organization that it has entered a new space — and the nature of the organization may pivot to an emergent new form. Of course, the lead administrators have to judge when the dynamics have shifted and will continue to shift sufficiently to warrant these types of disruptions.
Pressure on experts. Under COVID-speed, organizations might try to place greater pressure on the official experts to adapt their mental models instead of resting on eroding laurels. (The actual experts will have taken the initiative of making these adaptations.) This is easy to say, but the reality is that many in authority may press back against those questioning their credibility. And we should not underestimate the effort needed to share information within and across industries and to establish responsibilities for different subtasks such as monitoring information channels and synthesizing new approaches.
Instructional materials. With COVID-speed you won't have the time to prepare new materials. Instead, you can use easily re-configurable exercises and experiences if you have developed them in advance for other purposes. Examples of such exercises are Virtual Worlds, the Tactical Decision Games in use by the US Marine Corps, and various types of design exercises that serve both as a means of exploration and as a way to quickly modify training to reflect newly discovered problems.
For purposes of illustration, consider the ShadowBox approach to building expertise. ShadowBox is a means of helping people see the world through the eyes of experts — skilled practitioners — without requiring expert presence during training. The online version doesn’t even require facilitators. One form of ShadowBox is the scenario version in which trainees work through a challenging scenario and at pre-defined decision points are queried about their rankings of different options, different goals, different types of information to acquire, and so forth. The trainees also report the rationale for their choices. Then they see the responses of a panel of skilled practitioners who have gone through the same scenario. The trainees want their rankings to match those of the skilled practitioners, but the real learning comes from comparing the trainee’s rationale statement to that of the skilled practitioners, to discover the inferences the (relative) experts were able to make given the same scenario description.
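To illustrate the mechanics of the scenario version, here is a minimal sketch of a single decision point: the trainee ranks the options, the ranking is scored against the expert panel's, and the expert rationale is displayed for comparison. The data model, the scoring rule (a normalized Spearman footrule), and the scenario content are illustrative assumptions, not the actual ShadowBox implementation.

```python
from dataclasses import dataclass

@dataclass
class DecisionPoint:
    prompt: str
    options: list[str]
    expert_ranking: list[str]   # options as ordered by the expert panel
    expert_rationale: str

def agreement(trainee: list[str], expert: list[str]) -> float:
    """Normalized Spearman footrule: 1.0 means identical rankings."""
    n = len(expert)
    displacement = sum(abs(trainee.index(o) - expert.index(o)) for o in expert)
    return 1.0 - displacement / (n * n // 2)   # n*n//2 is the worst case

# A made-up decision point, loosely inspired by the COVID-19 discussion above.
dp = DecisionPoint(
    prompt="A patient's oxygen saturation is dropping. What do you try first?",
    options=["intubate", "prone positioning", "order a CT scan"],
    expert_ranking=["prone positioning", "intubate", "order a CT scan"],
    expert_rationale="Proning buys time and avoids the risks of ventilation.",
)
trainee_ranking = ["intubate", "prone positioning", "order a CT scan"]

print(f"Ranking agreement: {agreement(trainee_ranking, dp.expert_ranking):.2f}")
# The real learning step: compare the trainee's rationale with the panel's.
print("Expert rationale:", dp.expert_rationale)
```

The score only flags disagreement; as the essay notes, the real learning comes from the rationale comparison that follows it.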
For COVID-19, this scenario version of ShadowBox might have been useful in responding to the need to modify procedures and best practices, to build flexibility in taking histories and sharing information, and to revise task management routines. For modifying mental models, you could use a ShadowBox Lite approach that relies on assertions rather than scenarios. And for helping people rapidly learn the use of tools, such as physical tools and IT tools, you might use a third version of ShadowBox, the Cue-Detect version that presents photographs and videos and contrasts the cues a trainee notices to those flagged by skilled practitioners.
Conclusion. This essay addressed the general challenge of preparing a workforce for rapidly changing conditions, using the COVID-19 experience as an analog. The essay suggests some ways to re-conceptualize expertise, using a Skill Portfolio account of experts, and a greater reliance on Just-in-Time training. As the pandemic illustrated, change was not instantaneous, but it happened much more rapidly than anticipated. Expertise is not going to become obsolete — just the contrary. The viewpoint of Adaptive Expertise is that the defining feature of experts is their ability to adapt more rapidly and effectively than others. From this perspective, we can see some ways to do a better job of preparing for the turbulence of the future.
Acknowledgements:
I want to thank my friends and colleagues for their helpful comments and suggestions on a draft of this essay: Joseph Borders, Robert Hoffman, Reza Jalaiean, Devorah Klein, Laura Militello, John Schmitt, and Adam Zaremsky. I also appreciate the guidance from the editors of this series, Alan Lesgold and Lia DiBello.
References:
Bernstein, N. A. (1996). On dexterity and its development. In M. L. Latash & M. T. Turvey (Eds.), Dexterity and its development. Mahwah, NJ: Lawrence Erlbaum Associates.
Borders, J., Klein, G., & Besuijen, R. (2019). An operational account of mental models: A pilot study. International Conference on Naturalistic Decision Making, San Francisco, CA.
Ericsson, K.A. (Ed.) (1996). The road to excellence: The acquisition of expert performance in the arts and sciences, sports and games. Psychology Press.
Gorman, J.C., Cooke, N.J., & Amazeen, P.G. (2010). Training adaptive teams. Human Factors, 52(2), 295–307.
Hatano, G., & Inagaki, K. (1984). Two courses of expertise. Research and Clinical Center for Child Development Annual Report, 6, 27–36. http://hdl.handle.net/2115/25206
Hatano, G., & Inagaki, K. (1986). Two courses of expertise. In H. Stevenson, H. Azuma, and K. Hakuta (Eds.), Child development and education in Japan (pp. 262–272). New York: W. H. Freeman.
Hoffman, R.R., Hancock, P.A., & Bradshaw, J.M. (2010, November/December). Metrics, metrics, metrics, Part 2: Universal metrics? IEEE Intelligent Systems, pp. 93–97.
Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515–526.
Klein, G., & Pierce, L. (2001). Adaptive teams (No. ADA467743). Klein Associates Inc., Fairborn, OH.
Ward, P., Gore, J., Hutton, R., Conway, G., & Hoffman, R. (2018). Adaptive skill as the conditio sine qua non of expertise. Journal of Applied Research in Memory and Cognition, 7(1), 35–50. https://doi.org/10.1016/j.jarmac.2018.01.009