# KeyLIME Podcast 221: Not just a report card … or is it?


Linda’s choice this week is a study from JGME that examined how residents receive practice feedback. The authors report on whether it improved the residents’ performance and whether it had any effect on their patients’ outcomes. Listen here to hear the hosts discuss.

————————————————————————–

KeyLIME Session 218

Listen to the podcast.

Reference


Haynes et al., Continuity Clinic Practice Feedback Curriculum for Residents: A Model for Ambulatory Education. Journal of Graduate Medical Education, April 2019

Reviewer

Linda Snell (@LindaSMedEd)

Background

These days we are expected to evaluate our practice to improve our clinical performance [process] and – we hope – clinical outcomes. This is all part of the CanMEDS Leader role or the ACGME Practice-Based Learning and Improvement (PBLI) core competency. It is not just a case of providing data to practitioners or trainees; they must be able to interpret the data and use them to change and improve practice.

Five elements of PBLI: responsibility for a panel of patients, auditing that panel against evidence-based criteria, comparing the audit to benchmarks to explore potential deficiencies (and successes), identifying areas for change, and engaging in a quality improvement intervention.

Until the past few years these types of data had to be obtained manually (chart review, etc.), which was unwieldy, time consuming, and logistically challenging. Electronic medical records (EMRs) have changed all this – we now have access to detailed, extensive, frequent, and personalized data… the report card.

Providing data (‘practice feedback’) alongside activities that help recipients use the data – such as education sessions, self-reflection, and QI activities – has had more success in improving process and outcome measures.

The few studies of feeding data back to residents have been done mainly in inpatient settings, which tend to reflect team rather than individual performance; outcomes such as length of stay can rarely be ascribed to one person. Although ‘attribution’ (who saw the patient) is still needed, doing similar activities in a longitudinal clinic setting can give residents individualized feedback.

Purpose

The authors aimed to ‘use a structured framework and individualized EMR-level data to guide how residents receive practice feedback, interpret data on their patient panels, engage them in quality improvement efforts, and prepare them for practice.’

Key Points on the Methods

Description of an innovation with a program evaluation.

Education intervention over 1 year: 144 Internal Medicine residents (PGY level not reported) in 4 continuity clinic sites (2 hospital-based, 1 VA, 1 community). Key elements:

  • Attribution – identified each resident’s panel of patients
  • Metrics: BP control and colorectal cancer screening – chosen because residents feel they can impact them, they have a large denominator, they relate to disease-based teaching, and they align with institutional QI activities and goals
  • Faculty coaching to help residents with data accuracy, understanding, opportunities for change, and possible interventions. All mentors received faculty development.
  • Comparisons – peer and cross-site, with coaching on data validity and on why outliers occur. Small group discussions and peer teaching aided this.
  • QI focus – self-assessment worksheet to ‘reflect on performance, set personal goals, and identify individual-level and systems-level interventions to improve’

Program evaluation:

Learner outcomes (subjective): non-validated pre-post survey. Residents self-reported how frequently they engaged in practice feedback and whether they thought reviewing data was useful in improving quality of care.

Learner outcomes (objective): frequency of log-in to view data, % who attended educational activities, and self-assessment completion rates;  time spent by residents, faculty, and coordinators.

Patient outcomes: BP control and CRC screening rates for 3 of the 4 clinics were tracked using run charts.

Key Outcomes

Learner outcomes (survey): 88% completed the pre-survey, 59% the post-survey.

Unsurprising improvements in:

  • self-reported ability to receive, interpret, and understand practice data
  • applying the data (e.g. by identifying opportunities for change, adjusting workflow or clinic processes)
  • perceptions of utility and impact

Learner outcomes (performance): 90% attended educational activities, 100% completed the self-assessment and logged in at least once, and log-ins increased over the year in parallel with education activities.

There was no significant additional learner or instructor time required, and high levels of resident and faculty enthusiasm and acceptability.

Patient outcomes: BP control and CRC screening rates remained stable over the year, but there was ‘nonrandom variation in the form of a shift in the data toward higher colorectal cancer screening rates later in the year and in the first few months of postintervention follow-up’.

In sum, residents reported significant improvements in their ability to receive, interpret, and understand practice feedback. They logged in to access their data more frequently and had high levels of participation in curricular activities. Patient outcomes for the chosen metrics did not change.

Key Conclusions

The authors conclude ‘this is the first described longitudinal residency curriculum to use a structured framework and individualized EMR-level data to guide how residents receive practice feedback … and helped residents develop PBLI competencies and identify both individual and large-scale opportunities for quality improvement.’

Spare Keys – other take home points for clinician educators

This is a description of an innovation with an associated program evaluation. The curriculum is innovative, builds on the literature, is multifaceted and uses EMR data.

I would have liked more details about the residents and clinics, and to see more objective findings (you have the EMR, use it), how the residents changed their practice, and patient outcomes over a longer time.

Access KeyLIME podcast archives here
