Iatrogenic adverse events are a major cause of patient harm, even mortality, and their study and subsequent prevention can substantially improve patient safety, quality of care and health outcomes. Diagnostic errors fall within this category of adverse events, but in sharp contrast with treatment errors they have received comparatively little attention,1 partly because they are harder to study. For example, published case reports highlight successes and conceal circuitous routes, blind alleys, untoward delays and interim errors. Research on diagnostic mistakes is necessarily retrospective and its data are mostly selective (such as comparing clinical diagnoses with autopsy results, analyzing malpractice claims or performing surveys).2 Moreover, evaluation is prone to hindsight bias and to frequent disagreement among expert raters. Arguably, studying ‘trigger events’, such as an unexpected hospital admission after an index ambulatory visit or discharge, may uncover errors in diagnosis and better reflect their ‘real-life’ spectrum. Other research may be too narrow (e.g. case reviews limited to a single diagnosis) or indirect and ‘artificial’ by nature, studying the diagnostic process in pre-determined scenarios with standardized patients or constructed case presentations to examine potential pitfalls in the diagnostic reasoning that leads to errors.
As a result of these varied research approaches, we know that the widely held presumption that the astounding advances in imaging, endoscopy and laboratory testing have made diagnosis today almost infallible is untrue. Diagnostic errors (incorrect, missed or unnecessarily delayed diagnoses) continue to occur and are far from rare: their estimated incidence is 10–20%.3 Neither are they innocent: erroneous diagnosis has been associated with preventable mortality in 9% of reviewed autopsy cases2 and is estimated to account for up to 80,000 deaths per year and substantially more patient harm.1 Diagnostic errors are classified into three etiological groups: ‘no fault’, system-related and cognitive (‘physician personal’) mistakes;4 research has rightly focused on the last. Faulty judgment, not defective knowledge, appears to be the predominant mechanism, and the study of errors in diagnosis is therefore inseparable from that of diagnostic reasoning and decision-making. Fascinating functional magnetic resonance imaging studies have confirmed two distinct cognitive approaches used by clinicians. The first is intuitive, rapid (so-called ‘Augenblick’, in the blink of an eye) ‘pattern recognition’, which is very effective, more so with time and experience, although it depends on heuristics (mental shortcuts) that are highly prone to distraction by contextual factors and an array of biases.5,6 These include availability, framing, anchoring, confirmation and many other biases, which are subconscious and hard to avoid. Distracting contextual biases, such as age, ethnicity or an overriding chronic illness, are common too.5 The second, ‘hypothetico-deductive’ method is slower, more laborious and analytical, usually evaluating 3–5 possible diagnostic alternatives, and is less prone to error.3,6 In practice, clinicians frequently alternate between the two approaches to advantage (‘dual processing’),7 but uncertainty is inherent and susceptibility to error remains.
Certainly, more complex diagnostic problems are relegated to the ‘hypothetico-deductive’ approach, whereas simple routine presentations can be dealt with almost automatically. The problem begins when an unusual or multifactorial problem masquerades as a common, simple one. It is here that errors abound, typically from several operative factors acting together (Table 1): on average, 5.9 contributing factors can be identified per case, with cognitive error predominating.4,7
How can diagnostic errors be minimized? Looking at the major prevalent barriers to a correct and timely diagnosis (Table 1), several system-level interventions suggest themselves, readily addressed by ongoing improvements in information technology and by scheduling enough time for each patient encounter. Necessary educational changes include fostering curiosity as an antidote to burnout and a facilitator of patient-based learning and reflection.8 Prevention efforts, however, have focused on the central role of cognitive failures in misdiagnosis and on methods for their improvement. The use of checklists or diagnostic decision support tools,9 as well as the teaching of ‘debiasing’ techniques, has been tried with variable results.6 Meanwhile, adopting several habits in daily practice seems a prudent and expert-supported means3–5,7,9 of addressing the ubiquitous risk of diagnostic error in both hospital and ambulatory patients. First, ‘be systematic’ and patient-oriented in data collection. Listen to the patient, then proceed in an orderly fashion through the time-honoured elements of the history, examination and review of the tests. With experience and self-training this can be accomplished in minutes. Second, keep an open mind to alternatives, avoiding ‘premature closure’, our most common error. ‘Always ask: what else can it be?’ Be attuned to any significant, hard-to-explain deviation from the expected ‘script’ (or ‘gestalt’) for your diagnosis, and once one is identified, slow down! Third, have ‘respect for the individual patient’s pre-test probability’ (alias risk factors; alias susceptibilities): the family history, occupation, past illnesses, drugs and procedures, lifestyle, pets and travel are frequent harbingers of subsequent illness, even when not apparent at first. Fourth, look it up: adopt a habit of ‘reflexive consultation’ of information databases and colleagues.
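The ‘pre-test probability’ habit can be made quantitative with the standard relations of evidence-based diagnosis (not derived in this editorial, but routine in the decision-making literature): a test result updates the pre-test odds by the test’s likelihood ratio. A minimal sketch, with an illustrative worked example:

```latex
% Updating a pre-test probability p with a test's likelihood ratio (LR):
\text{pre-test odds} = \frac{p}{1-p}, \qquad
\text{post-test odds} = \text{pre-test odds}\times LR, \qquad
p_{\text{post}} = \frac{\text{post-test odds}}{1+\text{post-test odds}}
% Illustration (hypothetical numbers): p = 0.10 and LR = 9 give
% pre-test odds 1/9, post-test odds 1, hence a post-test probability of 0.5.
```

The same risk factor that seems negligible at a 1% pre-test probability can dominate the differential at 10%, which is why ignoring the individual patient’s baseline risk invites anchoring on the ‘typical’ case.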
Fifth, make sure to check belated test results and ‘get regular feedback’ on patients’ outcomes, comparing them with your own assessments; when they are discrepant, reflection begets improvement. Sixth, ‘maintain a wide angle’ encompassing all your patient’s problems, not just the most glaring complaint. Seventh, ‘when you do not know, admit it’ to yourself and to your patient.10 Honesty is not only the best policy but a strong motivator for increased effort and thoroughness.
In conclusion, diagnosis is the most critical of a physician’s skills and, in common with other physician tasks, susceptible to doctors’ overestimation of their own performance.7 Overcoming this prevailing overestimation (and the resulting overconfidence) is a mandatory first step. Diagnostic errors are universal, far from uncommon and dangerous to patients. A deeper understanding of the reasoning processes leading to diagnosis, and of the pathogenesis of diagnostic error, is needed. Adopting reflexive habits that may decrease the risk of error will improve patient safety and allow the timely, correct treatment that improves health outcomes.
Conflict of interest: None declared.