#CBME: Key Principles in Programmatic Assessment? Part 1


(This is a perspective on the current assessment literature from a very early career Clinician Educator – a medical student – written as part of her literature review for a medical education elective.

 -Jonathan (@sherbino))


By Larissa Hatin

Residency programs are poised to enter a new era that emphasizes learner abilities (i.e. outcomes), de-emphasizes time-based training, and promotes learner centredness1. There are four cardinal features of competency-based medical education (CBME): a competency framework, staged progression, tailored learning, and programmatic assessment (PA). The central question is: what abilities must graduates acquire and demonstrate to enter unsupervised practice? Committees of experts have agreed upon various combinations of entrustable professional activities (EPAs), specific to each discipline, that incorporate multiple milestones from across the CanMEDS roles2. These EPAs are sentinel clinical activities (i.e. work) in which residents must demonstrate competence before graduating to unsupervised practice.

Personally, as a medical student on the verge of entering residency, what attracts me to CBME is the concept of programmatic assessment. At the core of programmatic assessment is a focus on assessment for learning rather than a strict focus on assessment of learning. The emphasis is on how assessments can be intertwined with the learning process to promote directed feedback and facilitate a personalized learning plan3.

Van der Vleuten et al.4 define three fundamental purposes that should inform an assessment program. The first is that it maximally facilitates learning. Tests that merely rank students and produce a grade average no longer suffice. Rather, each assessment should further the student’s understanding of the concept and give them direction as to where they need to focus their attention.

Secondly, a program should maximize the robustness of high-stakes decisions. Put more simply, this means that important decisions should not be made on the basis of a single data point or single assessment. Rather, a wealth of information, both quantitative and qualitative, should be collected and reviewed by an expert group (that includes peer representation) before a well-informed decision is made5.

Lastly, the program should provide information for improving instruction and the curriculum as a whole. Feedback should not be reserved solely for learners! The collected information should also serve to inform educators and faculty about how well the program is performing, which components are effective, and which need improvement.

The question becomes: how can we design an assessment program to achieve these goals? How do we make the transition from theory to practice? I think there are key features that make for effective programmatic assessment.6,7,8 Let’s use the McMaster Modular Assessment Program (McMAP), the work-based assessment system of the McMaster Emergency Medicine residency, as an example.9

Key Principle #1: Continuous and frequent assessments

This seems obvious and thus hardly worth mentioning: frequent, longitudinal assessments of residents’ performance yield a more comprehensive and accurate global assessment. In the McMAP system, on every shift a resident is observed by a senior physician who rates their performance on a specific, defined task. These individual assessments are compiled into a summative portfolio for each resident that provides both qualitative and quantitative information used by the competence committee for resident performance reviews.

Key Principle #2: Work-based assessments

Learners need to be directly assessed, in an authentic environment, performing a task. By directly observing a skill, whether it is history taking, charting, or the physical exam, feedback can be given in real time. To complete a McMAP assessment, the faculty member is required to directly observe the skill they are commenting on. This allows technical skills to be corrected immediately, clinical pearls to be given in context, and the learner to leave the encounter feeling better prepared to perform the task next time.

Key Principle #3: Use of quality assessment tools

There is no single assessment tool that is perfect in all areas. However, by using multiple tools and multiple observations, competence in a particular domain can be assessed. Essentially, programs need to combine information from various assessments to achieve a more effective overall picture of competency.

To be continued…Part 2 appears on Tuesday May 9!

 

References

  1. Frank JR, Snell LS, Cate TO, Holmboe ES, Carraccio C, Swing SR et al. (2010). Competency-based medical education: theory to practice. Med Teach 32: 638-645.
  2. Royal College of Physicians and Surgeons of Canada. (2017). About Competence by Design. Retrieved from: http://www.royalcollege.ca/rcsite/cbd/competence-by-design-cbd-e
  3. Schuwirth LW, Van der Vleuten CP. (2011). Programmatic assessment: From assessment of learning to assessment for learning. Med Teach 33: 478-485.
  4. Van der Vleuten CP, Schuwirth LW, Driessen EW, Dijkstra J, Tigelaar D, Baartman LK, van Tartwijk J. (2012). A model for programmatic assessment fit for purpose. Med Teach 34: 205-214.
  5. Dijkstra J, Van der Vleuten CP, Schuwirth LW. (2010). A new framework for designing programmes of assessment. Adv Health Sci Educ Theory Pract 15: 379-393.
  6. Holmboe ES, Sherbino J., Long DM, Swing SR, Frank JR. (2010). The role of assessment in competency-based medical education. Med Teach 32: 676-682.
  7. Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, et al. (2011). Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach 33: 206-214.
  8. Van der Vleuten CP, Schuwirth LW, Driessen EW, Govaerts MJ, Heeneman S. (2015). Twelve tips for programmatic assessment. Med Teach 37: 641-646.
  9. Chan T, Sherbino J, McMAP collaborators. (2015). The McMaster Modular Assessment Program (McMAP): a theoretically grounded work-based assessment system for an Emergency Medicine residency program. Acad Med 90: 900-905.

 
