Diagnostic Reasoning: Can you trust your gut or are you biased?

(These are the show notes for the Grand Rounds I presented at Lurie Children's Hospital of Chicago and Northwestern University.)

We discussed diagnostic error previously on the ICE blog here and here. –Jonathan (@sherbino)


BACKGROUND… OR WHAT TO READ FOR YOUR INSOMNIA
Diagnostic (or clinical) reasoning is a prominent theme in medical education. The 2015 Institute of Medicine (IOM) report, Improving Diagnosis in Health Care, reaffirmed the importance of this theme, highlighting the personal, professional and system costs associated with misdiagnosis.

The incidence of diagnostic error in medicine is significant. (See here for ambulatory care, here for inpatient care and here for suggestions on how to improve surveillance.)

The psychology and neuroscience of how a physician thinks is complex (and only superficially understood).  A dominant theory of diagnostic reasoning is a dual processing model.  (Variations on this model can be found here and here.  Evidence from neuroanatomy for a dual model can be found here and here. Evidence from neurophysiology can be found here).

Essentially, two systems work together to solve a problem. (Some editorialists suggest that the systems work in series, while others argue from empiric evidence that the systems work in parallel.)

| System 1 | System 2 |
| --- | --- |
| Fast | Slow |
| Intuitive | Rational |
| Inductive | Deductive |
| Acquired through experience | Analytical |
| Unconscious | Conscious |

This model is grounded in the work of Daniel Kahneman and Amos Tversky. (Kahneman would receive the prize in economic sciences in memory of Alfred Nobel for his work; Tversky died before the award, which cannot be given posthumously.) Of note, this research examined general reasoning and knowledge (not technical or professional knowledge).

See below for other contributors to, and reviews of, dual processing theory:
PMID: 18154502
PMID: 14550493
PMID: 23802513
PMID: 19669918
PMID: 19669916
– and my favourite

The implication of the current, dominant model of dual processing is that System 1 is the source of error. This means that intuitive, experience-based gut instinct is the source of diagnostic error in medicine. "…errors of intuitive judgment involve failures of both systems: System 1, which generated the error, and System 2, which failed to detect and correct it." (Kahneman, 2004). The discourse in medicine and medical education has veered towards mitigating the role of System 1 reasoning. But here's the problem. The translation of general problem solving (e.g. Do more words start with the letter K or the letter L?) to the technical, specialized, knowledge-based problem solving required of medicine may be flawed.

Take this classic problem devised by Kahneman and Tversky.

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations.

Which is the most likely statement?

a) Linda is a bank teller
b) Linda is a bank teller and active in the feminist movement

The answer is A. A conjunction can never be more probable than either of its components: P(A and B) ≤ P(A). However, if you ask the same question but substitute the technical knowledge required of a physician, what is the appropriate response?

Rahim is a 55-year-old male who presents to the emergency department with multiple injuries following a car accident. On examination he has diminished breath sounds on the left side and a tender abdomen. His blood pressure is 90/55 and his pulse is 135 beats per minute.

Which is the most likely statement?

a) Rahim has a pneumothorax
b) Rahim has a pneumothorax and a ruptured spleen

The math still holds. But would you want to receive care from a physician who did not explore option B, reflecting experience and professional/technical knowledge?
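To make the arithmetic concrete, here is a minimal sketch in Python; the probabilities are invented for illustration, not taken from any study:

```python
# Conjunction rule: P(A and B) = P(A) * P(B | A), which can never
# exceed P(A). The numbers below are invented for illustration only.
p_pneumothorax = 0.60          # P(A): pneumothorax alone
p_spleen_given_pneumo = 0.40   # P(B | A): ruptured spleen, given A

p_both = p_pneumothorax * p_spleen_given_pneumo

print(f"P(pneumothorax)            = {p_pneumothorax:.2f}")  # 0.60
print(f"P(pneumothorax AND spleen) = {p_both:.2f}")          # 0.24
# Option A is always at least as probable as option B -- yet a good
# clinician still actively rules out the splenic injury.
```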

I am not suggesting that biases don't exist. In a general environment of excessive information, where an individual possesses only novice comprehension of an issue, a heuristic (i.e. a cognitive shortcut) is a very appropriate and efficient way to operate. However, in the specialized practice of medicine, I would argue that experience and technical knowledge count for something. Maybe System 1 isn't only about using shortcuts… maybe it's giving us rapid access to past experience. Maybe you can trust your gut?

A NECESSARY EVOLUTION IN DUAL PROCESSING THEORY? … OR WHAT TO READ IF YOU REALLY CAN’T SLEEP

Here are some of the findings from our research group that suggest that System 1 thinking may not be the boogeyman.

  1. The adjudication of diagnostic error may be a victim of hindsight bias

This study of physician members of the Society to Improve Diagnosis in Medicine used 50:50 cases, in which a common stem was presented with one of two outcomes. When the outcome implied an incorrect diagnosis, the adjudication of a bias doubled and the adjudication of an error increased from 8% to 60%. There was NO agreement between physicians on which bias had been committed.

  2. Going fast (a surrogate for System 1 thinking) is not associated with increased error (see here and here).

In these controlled trials, one cohort was urged to go as fast as possible while another was exhorted to take their time and pay attention; the fast cohort was 23% quicker. (Umm… is that really publishable?) But there was no difference in diagnostic accuracy. Exhortations to "go slow," "be careful" and "avoid error" make you go slow, but they don't make you more accurate.

  3. Interrupting the diagnostic task and overloading working memory (which impairs System 2 thinking) does not change diagnostic accuracy.

Working memory has a finite capacity, and interruptions can saturate the cognitive load of a physician, impairing the analytical, conscious reasoning of System 2. In this study, interruptions (including random, unrelated medical trivia and responding to overhead pages) while diagnosing a clinical case had no effect on accuracy.

  4. Experienced clinicians are more accurate than junior clinicians… duh! (see here and here).

The work of Elstein & Shulman and of Barrows, Neufeld & Norman shows that hypothesis generation does not significantly differ between experts and novices. Rather, it is the rapid testing, stratification and refinement of hypotheses (especially among experts) during the clinical encounter that leads to diagnostic accuracy. Our work shows that more experienced clinicians are more accurate. (Yeah, we got a grant for that…) Perhaps experience allows for the development of more complex exemplars. The rapid, non-analytical retrieval of these diagnostic exemplars (which become more sophisticated with experience) may reflect System 1 thinking. (See the work of Medin and Brooks that advanced Exemplar Theory.)
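Exemplar retrieval is often modelled computationally as something like nearest-neighbour matching over stored cases. Here is a loose, purely illustrative sketch; the features, numbers and stored cases are all invented:

```python
# A loose, illustrative sketch of exemplar-based retrieval: diagnose
# by similarity to previously stored cases, akin to a nearest-neighbour
# model. Features and stored cases are invented for this example.
from math import dist

# Stored exemplars: (feature vector, diagnosis).
# Invented features: [systolic BP, heart rate, unilateral breath sounds (0/1)]
exemplars = [
    ([90, 135, 1], "pneumothorax"),
    ([120, 80, 0], "musculoskeletal chest pain"),
    ([85, 140, 0], "hemorrhagic shock"),
]

def retrieve(case_features):
    """Return the diagnosis of the most similar stored exemplar."""
    return min(exemplars, key=lambda ex: dist(ex[0], case_features))[1]

print(retrieve([92, 130, 1]))  # -> pneumothorax
```

The point of the analogy: the more (and more varied) exemplars stored, the better the match to a new case, without any explicit analytical step.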

All of these findings suggest that System 1 – in the context of technical/specialized knowledge – may not be a cognitive pariah; rather, it may reflect the process used by experienced clinicians. Maybe our goal as medical educators is to promote the development of System 1 thinking in our trainees?

WHAT SHOULD A CLINICIAN EDUCATOR DO?
Much of the editorializing in medical education calls for curricula that emphasize cognitive biases and teaching strategies to mitigate those biases. Implicit in the editorials is a promotion of System 2 thinking and avoidance of System 1 thinking – as if this decoupling of processes could be achieved. Our research group (to my knowledge) has produced the only empiric evidence around teaching cognitive forcing strategies (i.e. metacognition). (See here and here.) The results show that teaching about a bias and suggesting an avoidance strategy doesn't work and, more worrisome, is not generalizable across diagnostic scenarios (e.g. a chest pain case doesn't translate to an extremity trauma case). (However, the test subjects were medical students, perhaps too novice to have developed strong System 1 processes amenable to both bias and bias-reduction strategies.)

However, there is some good news for educators. Our group has shown that asking residents to reflect, in an unstructured manner, on their diagnosis leads to a marginal (at best) increase in diagnostic accuracy. (See here.) The work of Mamede and Schmidt (see here and here) suggests that there is an opportunity here. However, their reflection algorithm, outlined below, presents issues of feasibility in the busy clinical environment:

  1. Write down the most likely diagnosis
  2. Write down alternative diagnoses
  3. List findings
    • Supporting
    • Against
    • Not present
  4. Rank diagnoses in order of likelihood
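
For illustration, here is a minimal sketch of how that four-step checklist might be captured in a teaching tool; the structure and field names are mine, not Mamede and Schmidt's:

```python
from dataclasses import dataclass, field

# A minimal sketch of the four reflection steps as a data structure.
# Field names are illustrative, not from the original papers.
@dataclass
class DiagnosticReflection:
    most_likely: str                                            # step 1
    alternatives: list[str]                                     # step 2
    findings_for: list[str] = field(default_factory=list)      # step 3: supporting
    findings_against: list[str] = field(default_factory=list)  # step 3: against
    findings_absent: list[str] = field(default_factory=list)   # step 3: not present

    def ranked(self) -> list[str]:
        # Step 4 is a clinical judgment; this placeholder simply returns
        # the leading diagnosis first, then the alternatives as given.
        return [self.most_likely, *self.alternatives]

reflection = DiagnosticReflection(
    most_likely="pneumothorax",
    alternatives=["ruptured spleen", "hemothorax"],
    findings_for=["diminished breath sounds on the left", "hypotension"],
    findings_absent=["tracheal deviation"],
)
print(reflection.ranked())  # ['pneumothorax', 'ruptured spleen', 'hemothorax']
```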

Our group is researching a new approach that structures the thinking of a clinician to consider (but not necessarily record) how to “ACT” when considering the final diagnosis.

In the interim, what else can we do?  From here on, it’s speculation. Some recommendations can be found here.

I suggest that enhancing System 1 should be the goal of educators, not developing protocols to reduce or override it (e.g. "Trust your gut… as your experience develops"). After all, what are training (hopefully deliberate and corrective training) and years of clinical work other than developing experience? The evidence suggests that System 1 reasoning is a marker of expert reasoning, giving us rapid access to past experiences to inform the current diagnosis.
