#KEYLIMEPODCAST 296: Medicine discovers Moneyball


Analytics are all the rage in sports: the collection and analysis of player performance data, using statistical modelling, to develop insights and guide decisions that might not be obvious in a large data set… hmmm, might this sound familiar to those working in CBME? Listen in to hear more about analytics in medicine.

————————————————————————–

KeyLIME Session 296

Listen to the podcast

Reference

Thoma et al. Next Steps in the Implementation of Learning Analytics in Medical Education: Consensus from an International Cohort of Medical Educators. J Grad Med Educ. 2020 Jun;12(3):303-311.

Reviewer

Jon Sherbino (@sherbino)

Background

** A brief word from our sponsors.  We have a policy at KeyLIME never to review a manuscript when one of the co-hosts is the principal or senior investigator.  Very rarely, if one of the co-hosts is a middle author, we will consider reviewing the paper if the findings are of general interest and relevance to the KeyLIME audience.  This is one of those times.  We endeavor to be as critical, yet fair, as possible in these instances, recognizing that unconscious bias may emerge in our critique.  So, we encourage you to form your own judgement and critically appraise the results of this paper, as you would with every episode.**

If you have read Moneyball (or seen the movie) or follow any of the major team sports leagues in North America (think baseball, football, basketball) you’ll know that analytics is all the rage.  General managers with a supposed “eye for talent” have been replaced with analytics – the collection and analysis of player performance data, using statistical modelling, to develop insights and guide decisions that might not be obvious in a large data set.  Does this analogy sound familiar in our current age of competency-based medical education?

Enter analytics and programmatic assessment.  As large data sets are presented to competency committees, how can the data be aggregated, interrogated, and presented to faculty and learners to inform educational prescriptions, detail progression-of-training decisions, and inform statements of competency?  If you’re a general manager, I mean program director, and your residency data just increased from four end-of-rotation reports and one practice exam per resident per year to 300 data points, you need a plan.  This is the problem that our paper tackles today.  Maybe it will be a slam dunk.  We certainly don’t want to strike out.  Ok.  I apologize.  Terrible sports aphorisms will officially stop now.  Unless we get to penalty time?

Purpose

From the authors:

“We sought to characterize barriers to the use of learning analytics techniques in medical education by identifying the questions of educators interested in this field.”

Key Points on the Methods

A two-day, open, international summit of medical educators was held in 2017, consisting of plenary sessions and workshops.  A one-hour data collection session was held prior to the conclusion of the meeting.

A literature review informed an interview guide comprising three themes – learner perspectives, program perspectives, and data stewardship.  Three equal groups each completed a facilitated discussion of two of the three themes, ensuring redundancy (two-thirds of attendees discussed each topic).  Discussion was captured by individuals entering their responses via digital polling software.

After the meeting, three investigators performed an inductive content analysis.  The responses were first reviewed independently to generate a list of codes; all responses were then analyzed independently, with discrepancies resolved by consensus.

Key Outcomes

Sixty-seven participants from six countries (predominantly Canada and the USA) took part.  Across the six sessions, 195 unique responses were contributed.

Key Conclusions

The authors conclude…

“Our analysis highlights themes regarding implementation, data management, and outcomes related to the use of learning analytics in medical education. These results can be used as a framework to guide stakeholder education, research, and policy development that delineates the benefits and challenges of using learning analytics in medical education.”

Spare Keys – other take home points for clinician educators

The authors received ethics board exemption via the American Institutes for Research, an independent, not-for-profit REB without an institutional affiliation.  This is an interesting example for investigators conducting research unconnected with their home academic institution.

Access KeyLIME podcast archives here

The views and opinions expressed in this post and podcast episode are those of the host(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page.
