Rekindling Curiosity About the O.S.C.E. for CBME


By: Christopher Feddock and Michael Barone (@BaroneMichael)

Much of the current discourse in competency-based medical education (CBME) centers on assessment. Attention mostly focuses on observational workplace-based assessments (WBAs). The argument for these types of assessments is strong given CBME’s commitment to meeting patient needs and its outcomes orientation. Observational WBAs have challenges, however, among them the contextual variability of clinical settings, which can make it difficult to sample certain content-specific competencies and to standardize assessments when standardization is desired for decision making. There is also the challenge of faculty rater training and the complexity of mitigating bias when increased numbers of faculty participate in assessing learners.

While the CBME community’s work continues to optimize WBAs, how do we address any gaps that even the most optimal WBA program may have?  We’ve been revisiting the use of the formative OSCE (Objective Structured Clinical Examination) as an essential component of an overall program of assessment.  

The rationale for doing so is not new. The OSCE was originally developed to address the limitations noted above, such as variation in faculty scoring, lack of clarity about the constructs being assessed, and variability of patient problems.1 More than 20 years ago, CBME leaders cited OSCEs as complementary to any CBME program.2 Despite this, OSCEs have not consistently been integrated into CBME programmatic assessment, for multiple reasons. First, the creation of a robust OSCE program is operationally complex, requiring space, faculty expertise, staff expertise, and sufficient funding. Second, interactions with standardized patients are often perceived as inauthentic, requiring the learner to suspend disbelief. Last, OSCEs have traditionally been positioned as high-stakes assessments in medical regulatory programs (e.g., USMLE Step 2 CS, MCC QE2, PLAB 2), which seems to have driven negative student perceptions of the format, particularly regarding the accuracy of ratings and the validity of scores.

We’ve recently committed to co-creation work with US medical schools and learners on formative assessment of a specific construct (clinical reasoning) using the OSCE format. As we planned this project, we considered how the OSCE, when focused on assessment for learning, could support CBME’s Five Core Components,3 thereby strengthening overall CBME implementation efforts. The following represents our thoughts on the role of the OSCE in relation to each of CBME’s Five Core Components.

An Outcomes Competency Framework

As a performance assessment, the OSCE is designed through purposeful blueprinting across multiple competencies, settings, and patient presentations. This structure permits both a wide sampling of skills and the opportunity for multiple observations of each skill. Entrustment ratings incorporated into OSCEs have been shown to discriminate reliably between learners at different levels of training.4 Lastly, formative OSCEs can provide a safe venue to introduce the concept of entrustment to early learners, whereas entrustment ratings in clinical settings may overwhelm early learners who are still building foundational skills.

Progressive Sequencing of Competencies

Regular use of the OSCE during training can document progression and milestone achievement on sequential competencies over time.4,5 OSCEs can also identify learners not meeting minimum performance thresholds for an entrustable professional activity (EPA) and thereby direct remediation.6,7 The feedback provided can help educational programs tailor learning experiences to support competency development. OSCEs also offer the advantage of being able to focus on a subset of competencies or specific parts of the clinical encounter, which has been shown to enhance feedback.8 In contrast, assessing the multiple aspects of performance typical of a WBA, while authentic, may impede faculty from providing specific, actionable feedback.

Learning Experiences Tailored to Competencies and Teaching Tailored to Competencies

There is value in learners completing a formative OSCE, receiving feedback, and then generating specific, actionable goals for improvement.5,9,10 A learner-faculty debrief after an OSCE can help learners use individual and aggregate performance data to draw sound conclusions about their strengths and gaps and to set goals appropriately.9 OSCE performance data can also be an important input into coaching programs, enabling coaches to tailor learning experiences to specific individual needs. One study showed that learners who reviewed OSCE performance with a coach tended to improve their performance on subsequent OSCEs, and nearly all learners reported changing their approach because of the coaching.10 Lastly, OSCEs can provide valuable competency-driven learning experiences when clinical settings are limited. The COVID-19 pandemic altered clinical experiences and, by extension, opportunities for WBA. In contrast, many institutions successfully adapted their OSCEs to a virtual format, thus preserving their competency assessments (although with limited physical examination).11

Programmatic Assessment

Programmatic assessment is deliberately structured to inform competency and advancement decisions based on a body of evidence drawn from many assessment methods and observations.12,13 A judgment about whether a learner can gather a patient’s history under reactive supervision, for example, should be based on multiple observations from both workplace-based assessments and structured performance assessments. As Hall and colleagues noted, the complexity of CBME requires careful tracking of outcomes important to both learners and the system itself.14 OSCEs provide valuable data to inform educational outcomes for individuals (micro) and programs (meso) while learners are in training.

In the original description of CBME’s core components, Van Melle and colleagues noted, “we chose a reform perspective in which all curricular elements, not just assessment practices, are described as being integral to the change process.”3 As programs implement CBME, OSCEs can provide an important mechanism to support the progressive sequencing of competencies and the tailoring of teaching and learning experiences. As our OSCE co-creation work continues, we’re committed to helping medical schools increase the value of their evolving OSCE programs to measure and promote important competencies. We see this as an important part of overall CBME implementation. 

———————————————————————————————————-

About the authors:
Christopher A. Feddock MD MS FAAP FACP is Associate Vice President, Competency-Based Assessment at NBME and Executive Director of the Clinical Skills Evaluation Collaboration (CSEC).
Michael A. Barone BS MD MPH is Vice President of Competency-Based Assessment at the National Board of Medical Examiners (NBME) as well as Associate Professor of Pediatrics at Johns Hopkins University.

References

1. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975 Feb 22;1(5955):447-51. doi: 10.1136/bmj.1.5955.447

2. Carraccio C, Englander R. The objective structured clinical examination: a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med. 2000 Jul;154(7):736-41. doi: 10.1001/archpedi.154.7.736

3. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-based Medical Education Collaborators. A Core Components Framework for Evaluating Implementation of Competency-Based Medical Education Programs. Acad Med. 2019 Jul;94(7):1002-1009. doi: 10.1097/ACM.0000000000002743

4. Halman S, Fu AYN, Pugh D. Entrustment within an objective structured clinical examination (OSCE) progress test: Bridging the gap towards competency-based medical education. Med Teach. 2020 Nov;42(11):1283-1288. doi: 10.1080/0142159X.2020.1803251

5. Pugh D, Desjardins I, Eva K. How do formative objective structured clinical examinations drive learning? Analysis of residents’ perceptions. Med Teach. 2018 Jan;40(1):45-52. doi: 10.1080/0142159X.2017.1388502

6. McMurray L, Hall AK, Rich J, Merchant S, Chaplin T. The Nightmares Course: A Longitudinal, Multidisciplinary, Simulation-Based Curriculum to Train and Assess Resident Competence in Resuscitation. J Grad Med Educ. 2017 Aug;9(4):503-508. doi: 10.4300/JGME-D-16-00462.1

7. Curran ML, Martin EE, Thomas EC, Singh R, Armana S, Kauser A, Zaheer EA, Sherry DD. The pediatric rheumatology objective structured clinical examination: progressing from a homegrown effort toward a reliable and valid national formative assessment. Pediatr Rheumatol Online J. 2019 Feb 8;17(1):5. doi: 10.1186/s12969-019-0308-7

8. Tavares W, Sadowski A, Eva KW. Asking for Less and Getting More: The Impact of Broadening a Rater’s Focus in Formative Assessment. Acad Med. 2018 Oct;93(10):1584-1590. doi: 10.1097/ACM.0000000000002294

9. Bernard AW, Ceccolini G, Feinn R, Rockfeld J, Rosenberg I, Thomas L, Cassese T. Medical students review of formative OSCE scores, checklists, and videos improves with student-faculty debriefing meetings. Med Educ Online. 2017;22(1):1324718. doi: 10.1080/10872981.2017.1324718

10. Rosenberg I, Thomas L, Ceccolini G, Feinn R. Early identification of struggling pre-clerkship learners using formative clinical skills OSCEs: an assessment for learning program. Med Educ Online. 2022 Dec;27(1):2028333. doi: 10.1080/10872981.2022.2028333

11. Heal C, D’Souza K, Hall L, Smith J, Jones K; ACCLAiM collaboration. Changes to objective structured clinical examinations (OSCE) at Australian medical schools in response to the COVID-19 pandemic. Med Teach. 2021 Nov 11:1-7. doi: 10.1080/0142159X.2021.1998404

12. Misra S, Iobst WF, Hauer KE, Holmboe ES. The Importance of Competency-Based Programmatic Assessment in Graduate Medical Education. J Grad Med Educ. 2021 Apr;13(2 Suppl):113-119. doi: 10.4300/JGME-D-20-00856.1

13. Ross S, Hauer KE, Wycliffe-Jones K, Hall AK, Molgaard L, Richardson D, Oswald A, Bhanji F; ICBME Collaborators. Key considerations in planning and designing programmatic assessment in competency-based medical education. Med Teach. 2021 Jul;43(7):758-764. doi: 10.1080/0142159X.2021.1925099

14. Hall AK, Schumacher DJ, Thoma B, Caretta-Weyer H, Kinnear B, Gruppen L, Cooke LJ, Frank JR, Van Melle E; ICBME Collaborators. Outcomes of competency-based medical education: A taxonomy for shared language. Med Teach. 2021 Jul;43(7):788-793. doi: 10.1080/0142159X.2021.1925643

 

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page.
