The Doctor’s Advocate | Third Quarter 2023

Thinking About How We Think: How Implicit Bias Creeps Into Diagnosis

David L. Feldman, MD, MBA, FACS

When patients experience adverse events, it is difficult to look back and determine exactly what medical professionals were thinking, but we know that cognitive and decision-making processes have been identified as loci for diagnostic errors—which contribute to 10 percent of patient deaths.1 Therefore, it is worth attempting to illuminate certain otherwise invisible errors in our thought processes.

With collaborators at Candello—which, like The Doctors Company, keeps a national database of medical professional liability claims—I have investigated contributing factors in malpractice claims that might leave clues implicating implicit bias, a common but often unrecognized form of cognitive error. My collaborators and I have gathered some insights, and our goal is to help clinicians cultivate greater awareness of their thought processes during diagnostic discovery. Developing awareness of how implicit biases and other cognitive errors can creep in—even among clinicians who actively endeavor to keep their thinking clear—is the first step to overcoming them.2

Two Ways of Thinking

Daniel Kahneman, PhD, psychologist and author of Thinking, Fast and Slow, has given us a two-track model for discussing types of thought:

System 1, Fast and Intuitive

  • Autonomous
  • Context dependent
  • Qualitative
  • Error prone

System 2, Slow and Rational

  • Deliberate
  • Objective
  • Scientific
  • More accurate

Each system has its merits. Most adults make thousands of decisions per day, so without mental shortcuts, life would grind to a halt. Yet fast thinking is prone to error, so we need deliberate methods to overcome gaps in our perception, processing, or reasoning. In other words, System 1 gives us intuition, but also cognitive and personal biases, so System 2 monitors for and overrides those biases as needed.

Spotting Biases in Diagnostic Error

Biases may be cognitive or personal, explicit or implicit. Here are some common examples:

Cognitive Biases: Many cognitive biases derive from a desire to help the patient or from simple haste or fatigue. These errors are easy to understand but important to catch. Examples include:

  • Anchoring: When a clinician “anchors” on one diagnosis early in the patient encounter—and then fails to take in subsequent disconfirming information. Related terms include premature closure, triage cueing, and diagnosis momentum.
  • Affective error: When we like the patient, we would rather convey the less severe diagnosis. Dislike of the patient can skew our thinking also, as can other forms of emotional attachment to outcome.

Personal Biases: These include biases for or against people based on their perceived gender, race/ethnicity, weight, or socioeconomic status. Personal biases may be:

  • Explicit biases: The possessor will proclaim these loudly and proudly.
  • Implicit biases: We may like to think we don’t have these. Implicit biases derive from our cultural surroundings, and they invade our thinking so quickly that we have to work to notice and counteract them.2

Case Example: A female patient in her 40s with a history of migraines arrived at the emergency department (ED). The paramedics, who said they had been called to a child’s birthday party, described the patient as anxious and stressed. Her EKG, labs, and CT were normal. The ED physician observed mild to moderate expressive aphasia and right arm drift. The patient reported a throbbing left-side headache, along with numbness and tingling in her right arm, so the ED physician initiated a stroke alert. After examination, however, the neurologist recommended against tPA and noted, “more likely migraine w/panic attack than stroke.” The patient received Reglan, Benadryl, and Toradol and was discharged home.

Less than two hours later, the patient was brought back to the ED after she fell over backward and was unable to speak. CT angiography revealed a complete bilateral carotid artery occlusion/dissection. She had suffered a large middle cerebral artery stroke on the left side and a small stroke on the right side. She underwent right carotid artery stenting and now requires a percutaneous endoscopic gastrostomy (PEG) tube due to swallowing dysfunction. She experiences continued aphasia and significant weakness on her right side. The case settled.

Analysis: Gender stereotyping may have contributed to this patient’s poor outcome, in the form of an assumption that women are more emotional. The patient’s history of migraine, combined with the paramedics’ description that the patient was anxious and stressed, may have created triage cueing, whereby someone’s initial impression is weighted too heavily by those completing subsequent examinations.

We might also look to other well-worn cognitive biases like confirmation bias, diagnosis momentum, or premature closure—any of which would involve some form of standing by an early idea (migraines/anxiety versus stroke) without making sufficient efforts to challenge that idea with disconfirming evidence.

Mitigation Strategies

We need a strong attack strategy for improving patient safety and mitigating provider liability risks. Fortunately, becoming aware of how quickly cognitive and personal biases, both explicit and implicit, can affect us really is the first step: Those who know these traps are more likely to realize when they are falling into them. We can overcome System 1 flaws with the more reflective skills of System 2.2 Strategies include the following:

  • Individuate: Make a conscious effort to focus on specific information about this individual rather than letting preset expectations hijack your attention.
  • Consider alternatives: Deliberately ask, “What else could this be?”
  • Disconfirm: Look for aspects of the current situation that do not fit.
  • Complete a differential diagnosis, redux: Consider three or more possible differentials, such as most severe, most probable, most interesting, and most treatable.
  • Think fast: Try taking one of Harvard’s implicit bias tests.

High-priority goals for many healthcare practices and institutions include improving diagnostic safety, mitigating liability risks, and combating healthcare disparities. We can move toward all three with one step, and it’s as easy—and as hard—as thinking about how we think.

Our thanks to Candello for providing the case example. Candello, established as a division of The Risk Management Foundation of the Harvard Medical Institutions Incorporated and CRICO, pools medical malpractice data and expertise from captive and commercial professional liability insurers across the country to provide clinical risk intelligence products and solutions. Copyrighted by and used with permission of The Risk Management Foundation of the Harvard Medical Institutions Incorporated. All rights reserved.


  1. National Academies of Sciences, Engineering, and Medicine. Improving Diagnosis in Health Care. Washington, DC: The National Academies Press; 2015. doi:10.17226/21794
  2. TDC Group. Addressing bias in healthcare: solutions for racial and ethnic disparities. Published March 2023.

The Doctor’s Advocate is published by The Doctors Company to advise and inform its members about loss prevention and insurance issues.

The guidelines suggested in this newsletter are not rules, do not constitute legal advice, and do not ensure a successful outcome. They attempt to define principles of practice for providing appropriate care. The principles are not inclusive of all proper methods of care nor exclusive of other methods reasonably directed at obtaining the same results.

The ultimate decision regarding the appropriateness of any treatment must be made by each healthcare provider considering the circumstances of the individual situation and in accordance with the laws of the jurisdiction in which the care is rendered.

The Doctor’s Advocate is published quarterly by Corporate Communications, The Doctors Company. Letters and articles, to be edited and published at the editor’s discretion, are welcome. The views expressed are those of the letter writer and do not necessarily reflect the opinion or official policy of The Doctors Company. Please sign your letters, and address them to the editor.