#KeyLIMEPodcast 71: Using TRUST as a scale for assessment?


In her ICE blog post, Jen Kogan argues that direct observation assessments are critical in CBME.  She then goes further to suggest that the framework that defines competence should be broadened to include patient outcomes – i.e. patient-centred assessments.

A previous KeyLIME podcast  indicates that the frame of reference used by raters is internally idiosyncratic, meaning that each rater personally calibrates their scores in a non-standard fashion.

(In a future post, I will argue that a portion of rater variance is necessary; it reflects different perspectives on trainee performance and contributes to a collective articulation of a trainee’s ability.) However, a significant portion of rater variance (due to differences in interpreting direct observation scales) leads to poor reliability  (and hence threats to validity) of a direct observation assessment.

In this KeyLIME podcast, Linda suggests that adjusting a direct observation scale to include “trust” (e.g. what degree of supervision is required) may improve reliability. Using trust to anchor an assessment scale may improve inter-rater reliability in interpreting the scale.

– Jonathan (@sherbino)


 

KeyLIME Session 71 – Article under review:

Listen to the podcast

View/download the abstract here.

Warm EJ, Mathis BR, Held JD, Pai S, Tolentino J, Ashbrook L, Lee CK, Lee D, Wood S, Fichtenbaum CJ, Schauer D, Munyon R, Mueller C. Entrustment and mapping of observable practice activities for resident assessment. Journal of General Internal Medicine. 2014 Aug;29(8):1177-82.

Reviewer
Linda Snell (@LindaSMedEd)


Background

Competency-based medical education (CBME) organizes education around competencies, emphasizes performance outcomes, promises greater accountability to patients and society, and is flexible and learner-centered. Competencies are multifaceted and integrated.

Assessment of competence should focus on performance in the workplace, inferring competence based on learner performance. Entrustable Professional Activities (EPAs) are attractive as they focus on day-to-day activities and address multiple competencies.

Entrustable Professional Activities (EPAs) and milestones break large competencies into smaller, potentially evaluable parts; EPAs focus on daily activities rather than directly on competencies. The authors define EPAs as ‘broad activities of practice that the public entrusts all physicians with being capable of performing’ and describe them as similar to nesting dolls. Some consider EPAs and milestones still too broad to use directly as assessment tools, and have suggested mapping EPAs to competencies and milestones to measure developmental progression.

Purpose
The authors describe an assessment system that uses OPAs (observable practice activities – collections of learning objectives/activities that must be observed in daily practice in order to form entrustment decisions) over time, and maps these entrustment decisions to milestones and EPAs to measure developmental progression.

Type of paper
Description of an innovation
Program evaluation

Key Points on the Methods

Program description – large university-associated urban IM residency in the USA. The program re-wrote the entire curriculum using OPAs as the basic unit – a long, iterative and very labor-intensive process. Two kinds of OPA: ‘content OPAs’ (COPAs), which were rotation-based, and ‘process OPAs’ (POPAs), which crossed rotations. Two levels – ‘junior’ (R1) and ‘senior’ (R2). There are now 350 rotation-specific OPAs across 75 rotations. OPAs were mapped to the IM milestones and EPAs.

Residents were rated on each OPA using a 5-point entrustment scale:
1. Resident not trusted to perform activity even with supervision
2. Resident trusted to perform activity with direct supervision
3. Resident trusted to perform activity with indirect supervision
4. Resident trusted to perform activity independently
5. Resident trusted to perform activity at aspirational level
A sixth option, ‘not observed’, yields no score.

Extensive faculty development needed.

Resident end-of-rotation assessments (OPA ratings) were ‘automatically assigned’ to the mapped milestones or EPAs.
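To make that “automatic assignment” step concrete, here is a minimal sketch (in Python) of how rotation-level OPA entrustment ratings could be rolled up to the milestones they map to. The OPA names, milestone codes, and simple averaging rule below are hypothetical illustrations, not the authors’ actual implementation.

```python
# Minimal sketch (not the authors' system): rolling up OPA entrustment ratings
# to mapped milestones. All names and the mapping are hypothetical.
from collections import defaultdict
from statistics import mean

# Hypothetical mapping of OPAs to the milestones they inform
OPA_TO_MILESTONES = {
    "manage_diabetic_ketoacidosis": ["PC-1", "MK-1"],
    "lead_family_meeting": ["ICS-1", "PROF-1"],
}

# End-of-rotation entrustment ratings on the 1-5 scale; None = not observed
ratings = {
    "manage_diabetic_ketoacidosis": 3,
    "lead_family_meeting": 4,
}

def roll_up(ratings, mapping):
    """Assign each scored OPA rating to its mapped milestones and average."""
    by_milestone = defaultdict(list)
    for opa, score in ratings.items():
        if score is None:  # 'not observed' contributes no score
            continue
        for milestone in mapping.get(opa, []):
            by_milestone[milestone].append(score)
    return {m: mean(scores) for m, scores in by_milestone.items()}

print(roll_up(ratings, OPA_TO_MILESTONES))
# e.g. {'PC-1': 3, 'MK-1': 3, 'ICS-1': 4, 'PROF-1': 4}
```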

Key Outcomes
Program evaluation – a huge number of data points per resident (approximately 750–1700). Longitudinal assessment curves were generated for the POPAs for each resident, with peer comparison, and used for both summative and formative purposes.
Authors looked at:
-Were the correct OPAs chosen for assessment? Yes, reflecting major faculty engagement, though the set is still evolving.
-Was the mapping correct? Unclear.
-Is there a gold standard for the rate of entrustment progression? None is known, so residents who attained entrustment faster than their peers were not promoted earlier, nor were slower residents held back.
-Differences between scores: communication skills and professionalism scores progressed fastest to each entrustment level, with the other competencies progressing more slowly.
It is not clear what model of program evaluation, if any, is being used.

Key Conclusions

The authors conclude that both direct assessment and demonstration of progressive entrustment of trainee skill over time are important goals for all training programs. Systems that incorporate OPAs mapped to milestones and EPAs over time provide the opportunity to achieve both, but require validation.

Spare Keys – other take home points for clinician educators
The concept of using progressive entrustment as a means of assessment is a good idea – but the implementation using OPAs is challenging.

Unanswered questions about OPAs and EPAs: (see also accompanying editorial)
-Should all be weighted equally?
-What if entrustment is granted but a subsequent error leads to entrustment being withdrawn? How contextual is entrustment?
-How many points of supervision are needed to form an entrustment decision?
-Does the competence of the faculty members factor in?

Access KeyLIME podcast archives here
