#KeyLIMEPodcast 70: Can Google give you a reliable answer for your clinical medical question?


At work last night, I listened to a lengthy presentation from Amira*, a first-year resident, regarding a female patient with a two-week history of a painful, red, raised area on her shin. (Guess the diagnosis? Yep… erythema nodosum.)

So, after letting Amira diligently work through her differential, I asked if she had considered erythema nodosum. “Umm…” Off we went to actually examine the patient. Our walk across the emergency department was punctuated by a rapid back and forth that covered inflammatory bowel disease, malignancy, drug reactions, etc.

During the exam, we learned that the patient was pregnant. “Umm…” This time I was stuck. Recalling that good teachers reveal gaps in their clinical knowledge to trainees and model how to research a point-of-care question, I directed Amira out of the patient’s room and to a nursing station. Plunking down into a chair in front of a computer, I paused briefly before confessing that I had no idea whether pregnancy was an etiology of erythema nodosum.

“Umm…” (Again me. But thankfully, this time internally.) The department was incredibly busy. I was tired. We needed a quick answer to help guide the plan for follow-up. The effort of a PubMed search, followed by critical appraisal of a primary source, was too much. I briefly entertained the idea of making Amira do the work, but she seemed as fatigued as I was, and the patient’s disposition would still be delayed. So, I typed my question into Google. I felt guilty.

0.37 seconds later, I had my answer (from 401,000 options).

The Key Literature in Medical Education abstract below describes whether I should trust what I found.  To listen to the actual podcast, search for KeyLIME on iTunes.

– Jonathan (@sherbino)

PS: Yes, pregnancy does cause erythema nodosum.


 

KeyLIME Session 70 – Article under review:

Listen to the podcast

View/download the abstract here.

Kim S, Noveck H, Galt J, Hogshire L, Willett L, O’Rourke K. Searching for Answers to Clinical Questions Using Google Versus Evidence-Based Summary Resources: A Randomized Controlled Crossover Study. Academic Medicine 2014, 89 (6): 940-3.

Reviewer
Jonathan Sherbino (@sherbino)


Background
Previous research (see the manuscript for the references) suggests that questions arising during clinical care are often left unanswered. Questions that are pursued are answered in less than two minutes. Of course, the pertinent question is whether this is a function of the efficiency of search strategies, the press of clinical responsibilities, or the ultra-rapid search fatigue of the clinician.

Information technologies have changed the rules of the game for clinical search strategies. When I started training, questions were answered by leafing through textbooks (often outdated editions). And then came Wikipedia. When I first started as an attending physician, I would castigate learners who used this resource, cautioning about the veracity of crowd-sourced editing for technical/professional content. But that’s all changed. Wikipedia now uses a hierarchical, crowd-sourced approach that requires editors to demonstrate competence in an area before being permitted to edit material.

With high-quality online information resources using similar safeguards (while acknowledging that obviously inaccurate or fraudulent resources remain), should I now turn to “Dr Google” to help my patients?

Purpose
“To compare the speed and accuracy of answering clinical questions using Google versus summary resources.”

Type of paper
Randomized controlled cross-over trial

Key Points on the Methods
• Randomized controlled cross-over trial
• n=48
o Two consecutive years of internal medicine interns
• Google search v. three commercial second-order expert summary resources
o EBSCO’s DynaMed, Wiley’s Essential Evidence Plus, Elsevier’s FirstConsult
• 10 clinical vignettes
o crossover between search strategies for half of the questions

Key Outcomes
• 393/480 (82%) questions answered (4 min / question time limit)
• No significant difference was found in mean time to correct response (138.5 seconds for Google v. 136.1 seconds for summary resource; P = .72).
• Mean correct response rate was 58.4% for Google v. 61.5% for summary resource (mean difference −3.1%; 95% CI −10.3% to 4.2%; P = .40).
• Google searches used a broader range of resources:
o commercial medical portals (e.g., Medscape or eMedicine) (26%)
o hospital Web sites (13%)
o Wikipedia (12%)
o government Web sites (e.g., CDC.gov) (9%)

Key Conclusions
The authors conclude… “The authors found no significant differences in speed or accuracy between searches initiated using Google versus summary resources. Although summary resources are considered to provide the highest quality of evidence, improvements to allow for better speed and accuracy are needed.”

It should be noted that all of the interns underwent a two-week evidence-based medicine course prior to the study. Therefore, they were sophisticated consumers, unlikely to turn to Jenny McCarthy’s blog on the dangers of vaccines.

Spare Keys – other take-home points for clinician educators
The cost of subscription-based (commercial) resources may be a barrier to accessing expert reviews. It is encouraging to see that the FOAM (free open-access medical education) movement is improving access to high-quality clinical information. Clinician educators should consider how they can contribute to the FOAM movement.

Access KeyLIME podcast archives here
