Digital Photographs, Mosaic Murals and CBME


By: Karen Schultz

The goal of competency-based medical education (CBME) is to graduate physicians assessed to be competent to do the work of their chosen specialty. For programs to have confidence in that competence, they must authentically assess those physicians-in-training doing their work, i.e. workplace-based assessment (WBA).

As a program director involved in implementing CBME in our family medicine program 11 years ago, I found it a challenge to figure out how to do WBA in an authentic, trustworthy way. Entrustable Professional Activities (EPAs) proved helpful in addressing this authenticity challenge on two fronts. First, considering EPAs for family medicine at a time when none had been articulated pushed us to define those “professional activities” (i.e. what exactly was the work of our specialty?) and outlined a way for us to assess them by tapping into supervisors’ sense of trust in the person they were supervising doing that work. Second, EPAs took frameworks that had broken down the work of a physician into components (e.g. the CanMEDS roles of medical expert, communicator, collaborator, advocate, leader, scholar and professional), which were helpful but artificial to measure in isolation, and reconstituted those components back into the actual work that physicians do. As work, this was now something observable and therefore assessable.

Defining EPAs, and with them the work we wanted to observe and assess, was a crucial first step. We then needed a way to assess those EPAs reliably, or perhaps better put, in a trustworthy way, given that WBA is in essence a qualitative exercise in which expert assessors gauge performance. This meant many observations over time and in different contexts, giving us reassurance that the resident’s competency to do that activity was durable: sustained over time and able to withstand the uncertainties of clinical work, where variables change and knowledge and skills need to be adaptable. It also meant having multiple assessors involved, both to embrace those preceptors’ different perspectives and to mitigate the impact of their inherent biases.

So, where do digital photographs fit into this conversation about CBME? Just as a digital photograph is better defined the more pixels make it up, so too the “picture” of the competent resident is more sharply defined by more assessment data bits, over time and contexts and across all the EPAs. Power to the pixels!

This was where we got to with our first iteration of WBA: define the EPAs; provide faculty development (lots and lots of faculty development!) about the importance of constructive, behaviourally based, frequent, documented feedback, reassuring preceptors that one bit of “negative” (better termed constructive) feedback was only one pixel, not a full picture; and set up a system to collect, collate and interpret those pixels/assessment data, looking for patterns and trajectory.

In time, though, and in discussion with colleagues at the College of Family Physicians of Canada, we realized that there were other elements that we as educators needed to deliberately focus on and develop curriculum and assessment standards for, elements that went beyond individual EPAs or professional frameworks. These are the elements that “glue” into place the competencies needed to do those professional activities. They are the things that reassure us that, once out in the self-regulated, ever-changing environment of clinical practice, our newly minted graduates will continue to grow, develop and practice safely when the competencies they demonstrated at graduation are out of date or something new (like a COVID pandemic) has rocked their clinical world. These are things like the ability to be adaptable, manage uncertainty, be self-regulated in their learning, be resilient, and be able to hear, interpret and incorporate data about their performance in order to improve or change it. Many of those things don’t appear in the moment of observing and assessing a single performance but become apparent in the space between those activities. Did the resident incorporate feedback into their next encounter? How did they respond to the uncertainty of a previous patient encounter? How did they cope with adversity? These competencies are like the mortar between the tiles in a mosaic picture. That mortar, usually white or gray in colour, is often overlooked, fading into the background. But it is crucial: without it, as soon as there is a tremor, as there will be in practice, the whole picture distorts and falls apart. Some educators are starting to figure out the ingredients for that mortar.1 Building on this work will be the next very interesting iterative phase of CBME: deliberately articulating the curriculum and assessment standards for… what is the right word? Some colleagues have suggested “interstitial competencies”, the competencies that hold together our developing next generation of physicians.

About the author: Karen Schultz BSc, MD, CCFP is Associate Dean, Postgraduate Medical Education at Queen's University School of Medicine.

Reference:

1. Cooke S, Lemay J-F. Transforming medical assessment: Integrating uncertainty into the evaluation of clinical reasoning in medical education. Acad Med. 2017;92(6):746-751.

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our 'About' page.
