5 Cognitive Biases That Cause the Most Diagnostic Errors

Lauren Fine, MD, FAAAAI · Associate Professor of Medical Education, NSU KPCAM · 6 min read · April 2026

Diagnostic errors harm an estimated 12 million Americans each year. Most aren't caused by lack of knowledge. They're caused by predictable, identifiable failures in reasoning. Here are the five you need to know — and how to defend against them.

In 1999, the Institute of Medicine's landmark report To Err Is Human estimated that up to 98,000 Americans died annually from preventable medical errors. Two decades later, diagnostic error has emerged as perhaps the largest single contributor to patient harm — larger than medication errors, larger than surgical complications.

What's striking is that most diagnostic errors aren't caused by rare diseases, complex presentations, or insufficient technology. They're caused by cognitive shortcuts that every clinician's brain takes, every single day. Understanding these shortcuts — and building habits to counter them — is one of the highest-leverage skills you can develop as a clinician.

Bias 1

Anchoring Bias

Anchoring is the tendency to rely too heavily on the first piece of information encountered when making decisions. In clinical medicine, it manifests as fixating on an initial diagnosis and interpreting all subsequent data through that lens — even when the data should be redirecting you.

"The triage note said 'anxiety attack,' and even when her oxygen saturation was 88%, I kept thinking about how anxious she seemed. I didn't order the CT for PE until her third visit."

Anchoring is amplified by the way information is handed off in medicine. The presenting diagnosis on the triage note, the diagnosis written by the referring physician, the label a patient carries from a previous admission — all of these set an anchor that distorts subsequent reasoning.

Antidote: When receiving a patient, deliberately ask: "If I didn't know what the last clinician thought, what would I think based solely on this patient's data?" Re-derive the diagnosis from first principles at least once per encounter.

Bias 2

Premature Closure

Premature closure is stopping the diagnostic process too early — accepting the first plausible diagnosis without adequately considering alternatives. It is the single most common cognitive error identified in studies of diagnostic error, contributing to an estimated 40% of cases.

"He came in with a productive cough and fever. Chest X-ray showed a right lower lobe infiltrate. I treated him for pneumonia. He came back three days later still sick — and then we found the lung mass."

The danger of premature closure is that it feels like good clinical reasoning. You found a diagnosis that fits. The cognitive discomfort of uncertainty goes away. But the diagnostic process should close when the evidence is overwhelming — not when you've found the first satisfying explanation.

Antidote: Before committing to a diagnosis, explicitly ask: "What else could this be? What findings are inconsistent with my diagnosis? If this patient doesn't improve as expected, what would I reconsider?"

Bias 3

Availability Bias

Availability bias causes clinicians to overestimate the probability of diagnoses that come easily to mind — often because they were recently encountered, widely discussed, or particularly memorable. The more "available" a diagnosis is in memory, the more likely we are to consider it first and weight it too heavily.

"I'd just seen three cases of pulmonary embolism in the ICU that week. When the next young woman came in with pleuritic chest pain, I was so focused on PE that I almost missed the pneumothorax."

Availability bias is particularly dangerous after high-profile cases, after reading about rare diseases, and during outbreaks — when the most available diagnosis in memory may not be the most likely diagnosis for this specific patient.

Antidote: Ground your probability estimates in base rates, not memory. Ask: "What is the actual prevalence of this diagnosis in a patient with this presentation?" Use clinical decision rules (Wells score, HEART score) to anchor probability objectively.
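The base-rate discipline behind this antidote can be made concrete with the odds form of Bayes' theorem: post-test odds = pre-test odds × likelihood ratio. The sketch below is illustrative only — the 10% base rate and likelihood ratio of 5 are assumed numbers for demonstration, not validated clinical values.

```python
def post_test_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Update a pre-test probability using a finding's likelihood ratio
    (odds form of Bayes' theorem)."""
    pretest_odds = pretest_prob / (1 - pretest_prob)   # probability -> odds
    post_odds = pretest_odds * likelihood_ratio        # apply the finding
    return post_odds / (1 + post_odds)                 # odds -> probability

# Illustrative example: assume a 10% base rate for PE in this presentation,
# and a positive finding with an assumed likelihood ratio of 5.
p = post_test_probability(0.10, 5.0)
print(f"{p:.2f}")  # prints 0.36 — far from certain, despite the positive finding
```

The point of working the numbers is that even a reasonably strong finding applied to a low base rate leaves substantial uncertainty — the opposite of what an "available" diagnosis feels like from memory.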

Bias 4

Framing Effect

The framing effect refers to how the way information is presented influences clinical judgment — independent of the actual content. The same patient presentation, framed differently, leads to different diagnostic and management decisions.

"The consultant's note said 'anxious young woman with palpitations.' I almost missed the WPW because I was already thinking about this as a psychiatric presentation before I even met her."

Framing effects are embedded in every handoff, every referral note, every triage label. They're particularly powerful when they involve demographic assumptions — age, sex, race, socioeconomic status — that activate implicit biases about who gets which diseases.

Antidote: Practice reframing: after reading any handoff note, rewrite the presenting problem in purely objective, symptom-based terms before seeing the patient. Strip out interpretive language and demographic assumptions.

Bias 5

Confirmation Bias

Confirmation bias is the tendency to seek, interpret, and recall information in a way that confirms a pre-existing belief. Once a diagnosis is formed, clinicians unconsciously favor data that supports it and discount data that contradicts it.

"His troponin was mildly elevated, but I attributed it to demand ischemia from his sepsis. His ECG had subtle changes, but I thought they were baseline. In retrospect, I was explaining away the evidence for ACS because I had already decided he had urosepsis."

Confirmation bias is most dangerous in patients with complex comorbidities, where there is always an alternative explanation available for any discordant finding. The bias provides a ready supply of alternative explanations that protect the working diagnosis from revision.

Antidote: Actively seek disconfirming evidence. After forming a working diagnosis, explicitly ask: "What finding would make me abandon this diagnosis? Have I looked for that finding?" This is the foundation of the Devil's Advocate reasoning framework.

The Common Thread

Notice what all five biases have in common: they feel like good reasoning in the moment. Anchoring feels like efficiency. Premature closure feels like decisiveness. Availability feels like pattern recognition. Framing feels like context-sensitivity. Confirmation feels like thoroughness.

This is what makes cognitive bias so dangerous — and why simply knowing about biases doesn't protect against them. The protection comes from building deliberate reasoning habits that create friction at the right moments, forcing your brain to slow down before it commits.

The Research: Studies of diagnostic error show that most cases involve multiple biases operating simultaneously. Anchoring sets the frame; confirmation bias filters the data; premature closure ends the process. Debiasing requires targeting all three, not just one.

How to Practice Debiasing

Like all clinical skills, debiasing requires deliberate practice — not just awareness. The most effective approach is to practice on simulated cases where you can get feedback on your reasoning process, not just whether you got the right answer.

When you practice with ReasonDx, the AI coaching system is specifically designed to probe for these biases in real time. When you anchor on a diagnosis, it asks: "What findings are inconsistent with that?" When you close prematurely, it asks: "What else could explain this presentation?" When you seek confirming evidence, it asks: "What would change your mind?"

Train Your Brain to Avoid These Biases

ReasonDx's AI coaching system uses Devil's Advocate, FMEA, and other reasoning frameworks specifically designed to counter cognitive bias in real time.

Start Practicing Free →