The authors of Linda’s selected article hypothesize that residents’ performance on a situational judgment test (SJT) can predict current and future performance on two criteria: ACGME competencies and the MPA (multisource professionalism assessment). Did their predictions prove to be true?
———————————————————————–
KEYLIME SESSION 329
Reference
Cullen MJ, Zhang C, Marcus-Blank B, Braman JP, Tiryaki E, Konia M, Hunt MA, Lee MS, Van Heest A, Englander R, Sackett PR, Andrews JS. Improving Our Ability to Predict Resident Applicant Performance: Validity Evidence for a Situational Judgment Test. Teaching and Learning in Medicine. 2020;32(5):508-521.
Reviewer
Linda Snell (@LindaSMedEd)
Background
Professionalism and interpersonal and communication skills in trainees and practitioners are associated with important patient outcomes, hence their inclusion in most competency frameworks. Wouldn’t it be nice to be able to measure these at entry and see if they predict future performance? These ‘noncognitive’ competencies (the authors’ term… I hate it!) are often not measured in selection processes for medical school and residency. Multiple mini-interviews (MMIs) and structured interviews show some ‘predictive promise’.
A promising method for measuring these competencies is the situational judgment test (SJT), in which respondents are presented with written or video-based scenarios and asked to choose from a set of alternative courses of action. Interpersonally oriented SJTs are commonly used for selection in Europe, Singapore, Canada, and Australia, with evidence suggesting they predict various performance ratings and short- and long-term outcomes in physicians (in-training performance, end-of-training performance, supervisory ratings of performance, licensing OSCEs). However, their use in residency settings in the US has been infrequently investigated.
Purpose
To investigate whether residents’ performance on an SJT (designed to measure professionalism-related competencies, e.g. conscientiousness, integrity, accountability, aspiring to excellence, teamwork, stress tolerance, and patient-centered care) predicts current and future performance on two conceptually distinct criteria: ACGME competencies and the MPA (multisource professionalism assessment).
- Hypothesis 1: The SJT would predict performance on the noncognitive ACGME competencies (communication, professionalism, and practice-based learning and improvement [PBLI]) as well as on the MPA;
- Hypothesis 2: USMLE scores, as cognitively oriented measures of medical knowledge, would predict performance on the ACGME medical knowledge competency;
- Hypothesis 3: The SJT would add incremental validity (above and beyond that provided by USMLE scores) to the prediction of performance on the three noncognitive competencies, patient care, and the MPA.
Key Points on the Methods
Population: 21 volunteer residency/fellowship programs, primarily from a single institution.
The authors developed a model for the SJT: a literature content analysis; sorting behaviors into dimensions* and validating these; developing critical incidents that reflect the dimensions and choosing the best by consensus; and developing scenarios that elaborate the critical incidents based on specific criteria. Responses to the scenarios, and the ‘benchmarking’ of these responses, were developed by consensus with program directors, attendings, residents, and experts. Scoring was based on distance from the benchmark (a rough sketch of this kind of scoring follows below). A final group of 38 scenarios was piloted and the 15 best were chosen.
*conscientiousness, aspiring to excellence, integrity, accountability, teamwork, patient-centered care, stress tolerance.
Respondents viewed each scenario, were asked “What should I do?”, and rated the effectiveness of each of a number of possible responses on a scale from highly ineffective to highly effective.
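The paper’s exact scoring formula isn’t reproduced in this summary, but distance-from-benchmark scoring is simple to illustrate. Below is a minimal sketch; the item names, the 1–7 scale, and the benchmark values are hypothetical, not the authors’ instrument.

```python
# Minimal sketch of distance-from-benchmark SJT scoring.
# Item names, the 1-7 effectiveness scale, and benchmark values are
# hypothetical illustrations, not the authors' actual scoring algorithm.

BENCHMARKS = {
    "item_01": 6.0,  # expert consensus effectiveness rating
    "item_02": 2.0,  # (1 = highly ineffective ... 7 = highly effective)
    "item_03": 5.0,
}

def score_respondent(ratings: dict) -> float:
    """Higher score = respondent's ratings sit closer to the expert benchmark."""
    distances = [abs(ratings[item] - bench) for item, bench in BENCHMARKS.items()]
    max_distance = 6.0  # widest possible gap on a 1-7 scale
    return max_distance - sum(distances) / len(distances)

print(score_respondent({"item_01": 5.0, "item_02": 3.0, "item_03": 5.0}))  # ~5.33
```

In this toy version, a respondent whose effectiveness ratings track the expert consensus earns a higher score; the study’s actual weighting and aggregation may differ.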
USMLE Step 1, Step 2 CK, and Step 3 scores, ACGME milestone performance, and the multisource professionalism assessment (developed by the authors; no comment on its validation) were obtained.
After various adjustments to reduce interprogram biases, the authors performed correlation and regression analyses between the above measures and the SJT; a hedged sketch of the incremental-validity step follows below.
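Incremental validity is commonly checked with hierarchical regression: fit the criterion on USMLE scores alone, then add the SJT and look at the gain in explained variance (ΔR²). The sketch below uses simulated data purely for illustration; the variable names and effect sizes are assumptions, not the study’s model.

```python
# Hedged sketch of an incremental-validity check: does the SJT add
# predictive value over USMLE scores? Data and effect sizes are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 200
usmle = rng.normal(230, 15, size=(n, 2))   # fake Step 1 and Step 2 CK scores
sjt = rng.normal(0, 1, size=n)             # fake standardized SJT score
milestones = 0.02 * usmle[:, 0] + 0.6 * sjt + rng.normal(0, 1, size=n)

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_usmle = r_squared(usmle, milestones)                          # step 1: USMLE only
r2_full = r_squared(np.column_stack([usmle, sjt]), milestones)   # step 2: add the SJT
print(f"Delta R^2 from adding the SJT: {r2_full - r2_usmle:.3f}")
```

A meaningful ΔR² at step 2 is what is meant by the SJT adding predictive value over and above the USMLE series.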
Key Outcomes
The SJT predicted overall ACGME milestone performance and MPA performance; ACGME patient care, systems-based practice, practice-based learning and improvement, interpersonal and communication skills, and professionalism competencies 1 year later; and contributed incremental validity over USMLE scores in predicting overall ACGME milestone performance 1 year later and MPA performance 3 months later.
USMLE Step 1, Step 2 CK, and Step 3 scores were not useful predictors of performance on ACGME competency domains.
Many individual competency domains (e.g., PC, MK, SBP, PBLI, PROF, ICS) were highly correlated with one another, possibly reflecting a halo effect.
Key Conclusions
This is the first study to correlate (‘predict’) milestone performance with an SJT; it adds to the evidence on the importance of measuring all competencies in residency program applicants, and suggests that SJTs show promise as a method for assessing attributes in applicants that may not be captured by other measures.
The authors conclude that the SJT’s incremental validity over the USMLE series in this study underscores the importance of moving beyond these standardized tests to a more holistic review of candidates that includes both cognitive and noncognitive measures.
Spare Keys – other take home points for clinician educators
These are first steps; the tool needs further validation and wider use to see whether the results generalize. It would be interesting to compare the SJT against clinically relevant outcomes (e.g. prescribing, patient outcomes…), not only against other assessments.
Access KeyLIME podcast archives here
The views and opinions expressed in this post and podcast episode are those of the host(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page