How do doctors / nurses / paramedics / physician assistants etc. think? #clinicalreasoning


(This is a guest post from a co-investigator at the Program for Education Research and Development, McMaster University. – Jonathan (@sherbino))


By Sandra Monteiro (@monteiro_meded)

What can be done to improve clinical reasoning? Cognitive biases are often implicated when things go wrong in clinical reasoning (Croskerry PMID 12915363) but there are many misconceptions about what can be done about them.

Biases are general tendencies to rely on what appears to be minimal new information (Kahneman). For example, confirmation bias occurs when a clinician seeks out only confirming evidence to support a single diagnosis (Croskerry). Premature closure involves making a decision without seeking further confirming or contrary evidence (Croskerry). Of course, these are only problems when the answer or the evidence is incorrect, and we do not really know how often that happens. Still, the popular rhetoric surrounding clinical reasoning suggests that using heuristics (i.e. mental shortcuts) is a lazy, error-prone approach to diagnosis, and that clinicians should instead engage in more cognitive effort, reflective practice and de-biasing strategies.

This is where things get unnecessarily confusing. There is no danger in a confirmation bias, an availability bias or any other kind of bias under the sun, if the diagnosis is correct. And there is little evidence that biases are innately error-prone. For emphasis: no, there really isn’t evidence that biases lead to error… but there is ample cleverly crafted evidence that psychologists can manipulate you into making a mistake.

It’s one thing to look back in hindsight and identify mistakes and oversights that contributed to diagnostic errors, quite another to assume that those oversights are reliable sources of error, and a complete stretch of the imagination to believe that we can stop them from happening. We can learn from a mistake, but can we learn from a mistake we’ve never made before? Maybe, if we read a list of all the mistakes people have ever made … but wouldn’t it be a better use of time to learn about all the ways to get it right the first time? So why is there such a strong push to implicate cognitive biases in diagnostic errors?

Let’s start with the history of biases. They were defined by social psychologists well before Daniel Kahneman (with the help of Amos Tversky) made them his own. Early social psychologists were interested in group dynamics, including interpersonal relationships, responses to authority, development of beliefs and the influence of cognitive mechanisms on the development of racial bias and stereotyping. From social psychology studies, there is evidence that people are capable of making judgements based on limited knowledge … often those judgements are neither incorrect nor correct in the context of those studies. The concern arises when judgements based on limited information lead to the unfair treatment of people.

Social psychologists point to the driving forces behind social biases and misconceptions: minimal attention, minimal knowledge and conflicting survival instincts. Often something new and foreign is treated with caution. This can lead to generalizations of that fear from one individual to an entire group… particularly if we are not paying attention. Of course we know that there are many more complex factors at play, as it takes great effort and the help of widespread media to maintain these fears. Therefore, the information we continue to attend to and trust will determine whether an initial bias grows into a large-scale belief. So can people take advantage of the mechanisms that lead to cognitive biases to further their own cause? Absolutely.

“Reduction of both subtle and blatant bias results from education, economic opportunity, and constructive intergroup contact” (Fiske, 2002).

Do we believe the same factors are at play in diagnostic reasoning? I should hope not, yet the term bias has left its social psychology origins and infiltrated all discussion of decision-making and diagnostic reasoning. It would be nice to put it back in its place.

Daniel Kahneman imported the word bias into his research on economic decision-making, and some authors in medical education have expanded its reach to medicine. In general, the concept is thought to operate in the same manner… an initial response or thought resulting from minimal attention and minimal knowledge. But, in the context of healthcare, I doubt we can include the driving force of survival instincts when discussing how bias influences a physician’s diagnostic accuracy, so we have to drop that one. I’m not a sociologist, but economic opportunity and constructive intergroup contact seem unlikely to influence diagnostic accuracy among health care providers. All we are left with, then, is education. More on that in part two…

 
