Avoiding Diagnostic Error in the ED

Rick Bukata, MD
16 min read · Apr 1, 2016

--

By Diane Birnbaumer, MD

While to err is human, the toll errors take when delivering medical care can be profound. Patient morbidity and mortality are clearly the foremost concern, but the effects trickle down to medicolegal consequences and the personal toll on health care workers when mistakes are made. This chapter details the scope of the problem, explains some of the more common errors and offers some potential solutions.

QUESTION: WHAT IS THE SCOPE OF MEDICAL ERROR IN MEDICINE AND, SPECIFICALLY, IN THE EMERGENCY DEPARTMENT?
According to the Harvard Medical Practice Study in 1990, medical errors affect about 3.6% of hospital admissions, and roughly 13% of these errors result in death. Extrapolated nationally, that translates into an estimated 98,000 deaths annually in the U.S. due to medical error.

While these numbers are appalling at face value, research demonstrates that medical error is rarely due to a single event; rather, it results from a series of problems that often reflect issues with the system of health care delivery. Shifting from a “blame society” to a culture of patient safety moves the emphasis from fault-finding to improvement. These tenets of patient safety assume competence and good intentions and deemphasize personal blame. The goal is to improve the system and identify the pitfalls and system “holes” that cause medical error. In fact, while incompetence exists, it accounts for only a tiny fraction of medical error; well over 95% of medical error results from people with good intentions who make simple mistakes.

How much medical error occurs in the emergency setting? The good news: only 1.7% of adverse events occur in our environment. The bad news: 88% of those were diagnostic errors, and 94% of them were classified as negligent when reviewed in the medicolegal setting. So, if we know that so many of our errors are “diagnostic errors,” is there a way to identify them BEFORE committing them and improve our practice? The answer is yes, and let’s start with how we think.

QUESTION: HOW DO MEDICAL PRACTITIONERS THINK?
Recent books, including Jerome Groopman’s How Doctors Think and Daniel Kahneman’s Thinking, Fast and Slow, detail how the human mind makes decisions. Ninety-five percent of thinking is what is called Type 1 or “fast” thinking. This type of thinking is fast, intuitive, efficient and mentally easy. While this sounds good, particularly in the emergency setting, it is also more vulnerable to thought error because it is founded on mental short-cuts and pattern recognition and depends on experience. It is this type of thinking that leads to “doorway diagnoses” that, while often correct, may overlook subtle clues that reveal the true diagnosis in an individual patient. As an example of how Type 1 thinking can fail, can you tell which of the following lines is longest?

Type 1 thinking would call the middle line longest, when, in reality, all the lines are the same length.

Type 2 thinking, on the other hand, is slow and mentally taxing. It requires focus and is exhaustive in nature. It uses greater resources, requires concentration and involves looking for the optimal strategy using a decision analytic approach. Overall, Type 2 thinking is more reliable and less prone to error.

Clearly, the environment of the emergency department lends itself to Type 1 thinking, but it is Type 2 thinking that may be necessary in high-risk or difficult cases. Great thinkers can toggle back and forth between the two types of thinking, using each optimally, but even those of us who are not “great thinkers” can learn skills that help minimize mistakes.

QUESTION: WHAT ARE THE GENERAL AREAS OF COGNITIVE ERRORS THAT OCCUR IN THE EMERGENCY DEPARTMENT?
Most medical errors do not occur because of a lack of knowledge, but rather they are due to what are called “cognitive errors.” Being aware of common types of cognitive errors can help clinicians recognize and avoid them. Cognitive errors are unconscious mistakes that fall into several broad classifications, and errors in these areas lead to over- and undertesting, delayed diagnoses and missed diagnoses. In general, the categories of error fall into two major areas: 1) faulty assessment of pre-test probability (over- or underestimating the likelihood of a disease), and 2) failure to consider all relevant possibilities.

QUESTION: WHAT ARE THE TYPES OF COGNITIVE ERRORS IN MEDICAL DECISION MAKING?
While the list of types of cognitive errors is long, many apply to the practice of emergency medicine.

Attribution errors involve negative stereotypes that lead clinicians to ignore or minimize the possibility of serious disease. For example, clinicians might assume that an unconscious patient with an odor of alcohol is “just another drunk” and miss hypoglycemia or intracranial injury, or they might assume that a known drug abuser with back pain is simply seeking drugs and miss an epidural abscess caused by use of dirty needles. Psychiatric patients who develop a physical disorder are particularly likely to be subject to attribution errors because not only may they be subject to negative stereotyping but they often describe their symptoms in unclear, inconsistent, or confusing ways, leading unwary clinicians to assume their complaints are of mental origin.

Anchoring: This is the tendency to perceptually lock onto salient features in the patient’s initial presentation too early in the diagnostic process and then fail to adjust this initial impression in the light of later information. This error may be severely compounded by confirmation bias. It occurs when clinicians steadfastly cling to an initial impression even as conflicting and contradictory data accumulate. For example, a working diagnosis of acute pancreatitis is quite reasonable in a 60-yr-old man who has epigastric pain and nausea, who is sitting forward clutching his abdomen, and who has a history of several bouts of alcoholic pancreatitis that he states have felt similar to what he is currently feeling. However, if the patient states that he has had no alcohol in many years and has normal blood levels of pancreatic enzymes, clinicians who simply dismiss or excuse these conflicting data (e.g., the patient is lying, his pancreas is burned out, the laboratory made a mistake) are committing an anchoring error. Clinicians should regard conflicting data as evidence of the need to continue to seek the true diagnosis (here, an acute MI) rather than as anomalies to be disregarded. In some cases in which anchoring errors are committed, there may be no supporting evidence for the misdiagnosis at all.

Affective error involves avoiding unpleasant but necessary tests or examinations because of fondness or sympathy for the patient (e.g., avoiding a pelvic examination on a modest patient or blood cultures on a seriously ill patient who has poor veins).

Availability: The disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time (is less available), it may be underdiagnosed. For example, a clinician who recently missed the diagnosis of pulmonary embolism in a healthy young woman who had vague chest discomfort but no other findings or apparent risk factors might then overestimate the risk in similar patients and become more likely to do chest CT angiography for similar patients despite the very small probability of disease. Experience can also lead to underestimation. For example, a junior resident who has seen only a few patients with chest pain, all of whom turned out to have benign causes, may begin to do cursory evaluations of that complaint even among populations in which disease prevalence is high.

Cognitive Dissonance: When experiences contradict existing attitudes, feelings, or knowledge, mental distress is produced. People tend to alleviate this discord by reinterpreting (distorting) the offending information. If no relief occurs after committing time, money and “face” to a course of treatment, internal disharmony can result. Rather than admit to themselves or to others that their efforts have been a waste, many people find some redeeming value in the treatment.

Confirmation Bias: Another common reason our impressions and memories fail to represent reality accurately. Practitioners and their patients are prone to misinterpret cues and to remember things as they wish they had happened. They may be selective in what they recall, overestimating their apparent successes while ignoring, downplaying or explaining away their failures. Or they may notice the signs consistent with their favored diagnosis and ignore or downplay aspects of the case inconsistent with it. Clinicians selectively accept clinical data that support a desired hypothesis and ignore data that do not (cherry-picking). Confirmation bias often compounds an anchoring error when the clinician uses confirmatory data to support the anchored hypothesis even when clearly contradictory evidence is also available. For example, a clinician may steadfastly cling to patient history elements suggesting acute coronary syndrome (ACS) to confirm the original suspicion of ACS even when serial ECGs and cardiac enzymes are normal.

Commission Bias: This stems from the obligation toward beneficence, the sense that harm to the patient can be prevented only by active intervention. It is the tendency toward action rather than inaction, and it is more likely in overconfident clinicians. Commission bias is less common than omission bias.

Diagnosis Momentum: Once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries, what might have started as a possibility gathers increasing momentum until it becomes definite, and all other possibilities are excluded.

Feedback Sanction: Making a diagnostic error may carry no immediate consequences, because considerable time may elapse before the error is discovered (if it ever is), or because poor system feedback processes prevent important information about decisions from getting back to the decision maker.

Framing: Framing is drawing different conclusions from the same information, depending on how or by whom that information is presented. For instance, you may decide that a taciturn 50-year-old man with substernal chest pain has ACS, while the same information from a tearful 45-year-old woman may be interpreted as anxiety.

Gambler’s Fallacy: Attributed to gamblers, this fallacy is the belief that if a coin is tossed ten times and comes up heads each time, the 11th toss has a greater chance of being tails (even though a fair coin has no memory). An example would be a physician who sees a series of patients with dyspnea, diagnoses CHF in all of them, and then assumes the streak cannot continue, discounting CHF in the next dyspneic patient even though each presentation is independent. In this way, the pretest probability assigned to a patient’s diagnosis is influenced by preceding but independent events.
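To see why the gambler’s fallacy fails, consider a quick simulation (a minimal illustrative sketch, not part of the original article; the trial count and seed are arbitrary). It estimates the chance of tails on the 11th toss, looking only at sequences that happened to open with ten heads:

```python
import random

# Minimal sketch: a fair coin has no memory, so the 11th toss is still 50/50
# even after a run of ten heads in a row.
random.seed(42)

TRIALS = 2_000_000
streaks = 0      # sequences whose first ten tosses were all heads
tails_next = 0   # of those, how many came up tails on toss 11

for _ in range(TRIALS):
    if all(random.random() < 0.5 for _ in range(10)):  # ten heads in a row
        streaks += 1
        if random.random() >= 0.5:                     # the independent 11th toss
            tails_next += 1

if streaks:
    print(f"streaks of ten heads: {streaks}")
    print(f"P(tails on toss 11 | ten heads) ~ {tails_next / streaks:.3f}")  # ~0.5, not higher
```

The conditional probability comes out near 0.5: the preceding run tells you nothing about the next, independent toss, just as a run of CHF diagnoses tells you nothing about the next dyspneic patient.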

Hindsight Bias: Knowing the outcome may profoundly influence the perception of past events and prevent a realistic appraisal of what actually occurred. In the context of diagnostic error, it may compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision maker’s abilities.

Omission Bias: The tendency toward inaction, rooted in the principle of non-maleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that may be attributed directly to the action of the physician. The bias may be sustained by the reinforcement often associated with not doing anything, but it may prove disastrous.

Overconfidence Bias: A universal tendency to believe we know more than we do. Overconfidence reflects a tendency to act on incomplete information, intuitions or hunches. Too much faith is placed in opinion instead of carefully gathered evidence. The bias may be augmented by both anchoring and availability, and catastrophic outcomes may result when there is a prevailing commission bias.

Posterior Probability Error: Occurs when a physician’s estimate for the likelihood of disease is unduly influenced by what has gone on before for a particular patient. It is the opposite of the gambler’s fallacy in that the doctor is gambling on the sequence continuing.

Premature Closure: A powerful error, accounting for a high proportion of missed diagnoses. It is the tendency to close the decision-making process too early, accepting a diagnosis before it has been fully verified. Clinicians make a quick diagnosis (often based on pattern recognition), fail to consider other possible diagnoses, and stop collecting data (jump to conclusions); often, even the suspected diagnosis is not confirmed by appropriate testing. Premature closure errors may occur in any case but are particularly common when patients seem to be having an exacerbation of a known disorder — e.g., if a woman with a long history of migraine presents with a severe headache (and actually has a new subarachnoid hemorrhage), the headache may be mistakenly assumed to be another attack of migraine. A variation of premature closure occurs when subsequent clinicians (e.g., consultants on a complicated case) unquestioningly accept a previous working diagnosis without independently collecting and reviewing relevant data. The consequences of the bias are reflected in the maxim: “When the diagnosis is made, the thinking stops.”

Representation error occurs when clinicians judge the probability of disease based on how closely the patient’s findings fit classic manifestations of a disease without taking into account disease prevalence. For example, although several hours of vague chest discomfort in a thin, athletic, healthy-appearing 60-yr-old man who has no known medical problems and who now looks and feels well does not match the typical profile of an MI, it would be unwise to dismiss that possibility because MI is common among men of that age and has highly variable manifestations. Conversely, a 20-yr-old healthy man with sudden onset of severe, sharp chest pain and back pain may be suspected of having a dissecting thoracic aortic aneurysm because those clinical features are common in aortic dissection. The cognitive error is not taking into account the fact that aortic dissections are exceptionally rare in a 20-yr-old, otherwise healthy patient; that disorder can be dismissed out of hand and other, more likely causes (e.g., pneumothorax, pleuritis) should be considered. Representation error is also involved when clinicians fail to recognize that positive test results in a population in which the tested disease is rare are more likely to be false-positive than true-positive.
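The last point, that a positive result in a low-prevalence population is usually a false positive, follows directly from Bayes’ theorem. The sketch below is purely illustrative; the sensitivity, specificity and prevalence figures are invented for the example, not drawn from the article.

```python
def positive_predictive_value(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Probability that disease is truly present given a positive test (Bayes' theorem)."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical test: 90% sensitive, 95% specific.
ppv_rare = positive_predictive_value(prevalence=0.01, sensitivity=0.90, specificity=0.95)
ppv_common = positive_predictive_value(prevalence=0.30, sensitivity=0.90, specificity=0.95)

print(f"PPV at 1% prevalence:  {ppv_rare:.0%}")    # ~15%: most positives are false positives
print(f"PPV at 30% prevalence: {ppv_common:.0%}")  # ~89%: the same positive means much more
```

With the same test, the meaning of a positive result is dominated by how common the disease is in the population being tested, which is exactly the base-rate information that representation errors ignore.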

Search Satisfying: Reflects the universal tendency to call off a search once something is found. Comorbidities, second foreign bodies, other fractures, and coingestants in poisoning may all be missed. Also, if the search yields nothing, diagnosticians should satisfy themselves that they have been looking in the right place.

Yin-Yang Out: When patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been worked up the Yin-Yang. The Yin-Yang Out is the tendency to believe that nothing further can be done to throw light on the dark place where, and if, any definitive diagnosis resides for the patient, i.e., the physician is let out of further diagnostic effort. This may prove ultimately to be true, but to adopt the strategy at the outset is fraught with the chance of a variety of errors.

QUESTION: IS THERE A WAY TO MINIMIZE COGNITIVE ERRORS?
A basic tenet in medicine is that things change, so minimizing medical errors requires an ongoing effort to acquire new knowledge, to learn better techniques and to practice in as safe an environment as possible. Beyond that, and arguably more important, is minimizing cognitive error. To that end, you need to develop strategies to help you THINK better, and doing so requires accepting certain truths about yourself.

First, you must be aware that you are vulnerable to bias. Once you accept that fact, you can develop techniques to detect these biases and the situations in which they might arise, and learn techniques to minimize them or their effects.

There are some specific strategies that can help to minimize cognitive errors. Typically, after the history and physical examination are done, clinicians form a working diagnosis based on heuristics. At this point, it is relatively easy to insert a formal pause for reflection, asking several questions:

If it is not the working diagnosis, what else could it be?
What are the most dangerous things it could be?
Is there any evidence that is at odds with the working diagnosis?

These questions can help expand the differential diagnosis to include things that may have been left out because of cognitive errors and thus trigger clinicians to obtain further necessary information.

In an article published in Emergency Physicians Monthly, Dr. Robert Wears proposes a more formalized approach that integrates both Type 1 and Type 2 thinking. This system allows Type 1 thinking to predominate when appropriate, but assures that time is set aside for pausing and reflecting on cases that are out of the norm or high risk.

After initial evaluation of the patient, Dr. Wears suggests using a series of three “Stops.”

FIRST STOP. The first phase of evaluating a patient focuses on maintaining diagnostic efficiency while assuring that “outliers” are noted and not overlooked. Symptoms that do not fit are sought out, noted and carefully considered. After the initial evaluation, the patient should be categorized as one of the following:

Straightforward — Few possibilities, all of which are low risk, and the path to diagnosis is clear cut. For example, an ankle sprain or a simple laceration. In these patients, you should move on with the workup and treatment and end with Stop 3 (see below).

Complicated — Few possibilities, but some are potentially serious and the path to diagnosis is clear cut. Examples include appendicitis and clear cut acute coronary syndrome. In these patients, you should PAUSE and consider:

Are there any outlier symptoms? If no, continue your evaluation and end with Stop 3.

If yes, continue with Stop 2.

Complex — Many possibilities, some potentially dangerous, the path to diagnosis is not clear cut and even figuring out how to get started takes some thought. Examples include the dizzy patient or the patient with shortness of breath without obvious cause or undifferentiated abdominal pain. For these patients, before moving on, you should go to Stop 2.

SECOND STOP, or the cognitive pause. For complicated patients with outlier symptoms and for complex patients, take some time to focus on these symptoms or findings. To organize them, consider writing them all down and arranging them into pairs to jog the memory and broaden the possible diagnoses. This exercise may reveal patterns that were not obvious on the first pass through the patient’s evaluation. It is also helpful to keep a list of can’t-miss diagnoses as another memory jog. Dr. Wears suggests the following list:

Noninfectious: Subarachnoid hemorrhage, subdural hematoma, cerebellar CVA, aortic dissection, acute myocardial infarction, pulmonary embolism, abdominal aortic aneurysm, mesenteric ischemia, perforated viscus, small bowel obstruction/hernia, and appendicitis

Infectious: Sepsis, meningococcemia, Rocky Mountain spotted fever, meningitis, encephalitis, endocarditis, myocarditis, epidural abscess, necrotizing fasciitis

THIRD STOP. After evaluation is done and you have a working diagnosis, stop again. Ask yourself several questions — note that all are framed in the negative, to challenge your assumptions.

Why is my working diagnosis NOT correct? What doesn’t fit?

Is there a reason this can’t be an “embryonic illness” (i.e., an illness too early in its course to be diagnosable)?

What if it IS an embryonic illness? That is, what will you say to the patient and his/her family after discharge to make sure you have informed them about the possibility that something is brewing that may develop after discharge?

The strength of this approach is that it allows the clinician to maximize appropriate use of “fast” thinking while building in cognitive “stops” to emphasize the need for “slow” thinking in cases in which the risk is high.
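Purely as an illustration of the branching logic described above (the categories and stops come from Dr. Wears’ description; the function and its inputs are hypothetical, not a clinical tool), the flow might be sketched like this:

```python
def stops_for_case(category: str, has_outlier_findings: bool) -> list[str]:
    """Sketch of the three-stop flow: category is 'straightforward', 'complicated'
    or 'complex' (the Stop 1 triage); has_outlier_findings flags symptoms that
    do not fit the working diagnosis."""
    stops = ["Stop 1: categorize the case and actively look for findings that do not fit"]

    # Stop 2 (the cognitive pause) applies to complex cases and to complicated
    # cases with outlier findings.
    if category == "complex" or (category == "complicated" and has_outlier_findings):
        stops.append("Stop 2: cognitive pause - list and pair outlier findings, "
                     "review the can't-miss diagnoses")

    # Every case ends with Stop 3: challenge the working diagnosis.
    stops.append("Stop 3: ask why the working diagnosis is NOT correct and "
                 "whether this could be an embryonic illness")
    return stops

# Example: an undifferentiated dizzy patient is a complex case, so all three stops apply.
for step in stops_for_case("complex", has_outlier_findings=True):
    print(step)
```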

Below are some additional suggestions that are collected from various resources to help minimize medical decision making errors.

  • Avoid the biggest obstacle to the correct diagnosis — a previous diagnosis.
  • Avoid inheriting someone else’s thinking whether it is related to diagnostic or personal bias.
  • Check for critical past medical history and risk factors for serious disease or poor outcome.
  • Pay attention to vital signs and nurses’ and Emergency Medical Service (EMS) notes.
  • Avoid premature closure if the diagnosis is not certain — enlist the patient as a partner in that uncertainty, arrange for appropriate follow-up, and give specific precautions in written form.
  • Beware of high-risk times: patient sign-out (see and touch all patients), high-volume or high-acuity times, and times of personal fatigue.
  • Beware of high-risk patients — hostile, violent, or abusive patients, patients with alcohol or drug abuse, psychiatric patients, and patients who elicit a negative visceral response.
  • Beware of the return visit — this is an opportunity to correct what was missed during the previous visit.
  • Beware of high-risk diagnoses — myocardial infarction (MI), pulmonary embolus (PE), subarachnoid hemorrhage (SAH), tendon and nerve injuries, retained foreign bodies, intracranial hemorrhage (ICH) in intoxicated patients, vascular catastrophes in elderly patients, appendicitis, meningitis, ectopic pregnancy, and testicular torsion. Rule out the worst-case scenario or high-risk diagnoses.
  • Beware of the nonfit — when the presumptive diagnosis does not match the symptoms, signs, or diagnostic tests — recognize the nonfit and reevaluate and refine diagnostic hypotheses.
  • Sit at the patient’s bedside to collect a thorough history.
  • Perform an uninterrupted physical examination.
  • Generate life-threatening and most likely diagnostic hypotheses.
  • Use information databases and expert systems to broaden diagnostic hypotheses.
  • Collect data to confirm or exclude life threats first, then most likely diagnoses.
  • Avoid diagnostic testing whenever possible by using readily available decision-making algorithms (e.g., the Ottawa ankle rules; a simplified sketch follows this list).
  • Order only those tests that will affect disposition or that will confirm or exclude diagnostic hypotheses.
  • Include decision rules on diagnostic testing order forms.
  • Use guidelines and protocols for specific therapeutic decisions to conserve mental energies while on duty.
  • Allow 2 to 3 minutes of uninterrupted time to mentally process each patient.
  • Mentally process one patient at a time to disposition.
  • Avoid decision making when overly stressed or angry. Take 1 to 2 minutes out, regroup, then make the decision.
  • Carry a maximum of 4 or 5 “undecided” category patients. Stop — make some dispositions.
  • Use evidence-based medicine techniques to substantiate decisions with evidence, understand the limitations of the evidence, and to answer specific questions, such as usefulness of diagnostic testing, management plans and disease prognosis.
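As an example of the kind of decision rule mentioned in the list above, here is a rough sketch of the ankle component of the Ottawa ankle rules. It is illustrative only and simplifies the published rule; consult the validated version before using it clinically.

```python
def ottawa_ankle_xray_indicated(
    malleolar_zone_pain: bool,
    bony_tenderness_lateral_malleolus: bool,   # posterior edge or tip
    bony_tenderness_medial_malleolus: bool,    # posterior edge or tip
    unable_to_bear_weight_four_steps: bool,    # both immediately and in the ED
) -> bool:
    """Simplified sketch of the ankle-series portion of the Ottawa ankle rules."""
    if not malleolar_zone_pain:
        return False
    return (
        bony_tenderness_lateral_malleolus
        or bony_tenderness_medial_malleolus
        or unable_to_bear_weight_four_steps
    )

# Example: malleolar pain, no bony tenderness, able to walk four steps -> no x-ray under the rule.
print(ottawa_ankle_xray_indicated(True, False, False, False))  # False
```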

KEY POINTS AND RECOMMENDATIONS

  1. Medical errors account for nearly 100,000 deaths annually in the United States.
  2. Only 1.7% of adverse events occur in the emergency department.
  3. Most medical errors are not due to a lack of knowledge, but rather faulty decision making.
  4. Type 1 or “fast” thinking is intuitive, efficient, mentally easy and accounts for 95% of thinking.
  5. Type 2 or “slow” thinking is mentally taxing, exhaustive, and requires focus and resources.
  6. Great thinkers toggle back and forth between fast and slow thinking, and this toggling is crucial to minimizing medical error in the emergency department.
  7. There are well over a dozen types of cognitive errors, many of which apply in the practice of emergency medicine.
  8. Accepting that you are vulnerable to bias is a vital step toward minimizing cognitive errors.
  9. Practitioners must develop strategies to help minimize bias and cognitive errors, and there are several options to formalize these strategies into daily medical practice.

Author:

Diane M. Birnbaumer, MD

Emeritus Professor of Medicine,
David Geffen School of Medicine at UCLA
Senior Clinical Educator, Department of Emergency Medicine
Harbor-UCLA Medical Center
Past Recipient, ACEP’s Outstanding Contributions to Education Award
