Alpha Omega Alpha Honor Medical Society

Medical Professionalism: Best Practices


REMEDIATION

Chapter 8. Clinical Skills Remediation: Strategy for Intervention of Professionalism Lapses

Anna Chang, MD

This chapter brings the literature and practice of clinical skills guidance and remediation in medical education to the discussion of best practices in medical professionalism. First, it describes differences, and then similarities, between the approaches to low performance in clinical skills and professionalism. Next, it examines the applicability of a five-step clinical skills remediation and guidance strategy to address professionalism lapses. Finally, it suggests individual and systems approaches to the remediation of learners and colleagues who need guidance in medical professionalism.

Case example from clinical skills

Dear Student,
We regret to inform you that you have failed your clinical skills examination in the areas of history-taking, physical examination, clinical reasoning, and patient communication skills. Your performance is in the lowest 2% of the class and your score does not meet the minimum threshold for passing. You are now required to meet with the course director for remediation . . .

Letters like this notify some medical students each year of unexpected failing performance on clinical skills examinations. At one medical school, this notice at the end of a foundational clinical skills course informs a handful of second-year students about performance that is below expected competence on a multi-station standardized patient objective structured clinical examination (OSCE) final examination. Since the 1990s, most U.S. medical schools have required student participation in standardized patient clinical skills examinations, with a median annual cost of $50,000 per examination in 2005.1 For most medical students, examinations like this are among the first in a series of high-stakes clinical skills examinations to ensure that they achieve minimum expected competence in the clinical skills required to advance to medical school graduation, residency entry, and board certification.1

Studies have reported strategies to guide the steps following a student’s failure to progress to the next stage of training because of below-expected competence performance compared with milestones in the competency domain of patient care.2 In recent years, scholarship in this area has examined various aspects of guidance and remediation in medical education, from tools for early identification of struggling learners, to the effect of performance data on learning goals, to systematic reviews of remediation processes among U.S. medical schools.3–6 Thus, the field of clinical skills guidance and remediation is on the path of building an evidence base of best practices to guide educators and institutions.

Clinical skills versus professionalism: Differences

Many educators would likely point out some important differences between the approach to assessment and remediation of clinical skills and the approach applied to lapses in medical professionalism.

First, structured checklists along with global rating scales are now routinely used in assessment and standard-setting of formative and summative clinical skills examinations.1,7 Faculty members or standardized patients complete checklists after simulated clinical encounters.7,8 Passing performance can be determined by criterion-referenced or normative standard setting methods.1,8 In other examinations, checklists of key history or physical examination items are applied to the written post-encounter clinical note to assess the learners’ clinical reasoning.8 These real or standardized patient examinations with the use of checklists occur multiple times throughout medical school, and students are no longer allowed to advance to licensure without demonstrating clinical skills competence.9 Similar assessment systems may be less consistently applied to the competency domain of professionalism.

Second, identification and remediation of deficits in the technical aspects of clinical skills may be perceived as less emotional, and therefore easier, for both faculty and learners than lapses in professionalism. Faculty members find it challenging to fail learners.10 “Millennial generation” learners thrive with positive feedback.11 One study shows that students are more likely to give constructive feedback about technical deficits (e.g., physical examination technique) when randomly grouped with peers.12 Only after longitudinal peer cohorts have spent years together do students begin to develop the trust and comfort that allows them to give constructive feedback about more personal and interpersonal learning needs (e.g., communication skills).12 Thus, it is possible that low performers in general clinical skills may be identified with more ease than those who demonstrate professionalism lapses.

Third, most schools have additional guidance programs for clinical skills deficits, as well as processes to measure improvement after remediation.2,6 Structured remediation programs for the more technical skills of medicine, such as key history items or physical examination technique, are common in medical schools today.2,6,13 On the other hand, similar pre-existing systems to support learners with professionalism lapses are rarely reported in the literature, may not be as prevalent in practice, and may develop on an ad hoc basis in response to individual issues.14 Faculty members hesitate to point out learner performance deficits for multiple reasons, particularly if they are uncertain about the availability of remediation options.10 The lack of remediation programs in professionalism may affect the identification of those with professionalism lapses. Furthermore, most medical schools repeat the clinical skills examination after remediation programs, and almost all schools report having a process to reassess clinical skills competence.6 Thus, one approach to begin to close the gap of differences is to develop similarly robust identification, remediation, and reassessment processes for learners and colleagues in the domain of medical professionalism.

Clinical skills and professionalism: Similarities

There are also notable similarities between the principles and steps in working with those with additional learning needs in clinical skills and in medical professionalism.

First, learners demonstrate their abilities in clinical skills or professionalism, as well as in other competency domains, in overlapping and integrated ways while participating in many of the same activities in the classroom environment and in the clinical setting.15 The movement towards the use of entrustable professional activities as an educational assessment framework advocates for the unit of measurement of physician skills to be an integrated activity, rather than deconstructed competencies.16 Viewed through this lens, skills in history taking, physical examination, communication, and professionalism are interrelated elements of a single connected whole.

Second, the strategy for remediation in any competency domain begins with identification of those who are performing poorly compared to performance standards using objective measurement tools.2 The low performer then receives performance data and feedback, as well as guidance to develop effective learning plans that target the deficit.6 The plan is put into action for a period of time, and the learner is then retested by objective measures to determine the outcome of remediation.2 With these steps, the approach to low-performing learners in both clinical skills and professionalism can be remarkably similar.

Finally, competence in general clinical skills and competence in medical professionalism are intertwined and essential to the physician’s role on the clinical team, and the physician’s duty to patients.17,18 For example, communication skills as applied to gathering and sharing information are among the most important clinical skills in the patient encounter, and are simultaneously crucial for aspects of professionalism that involve interactions with the patient. As such, the outcomes of remediation for both have meaning for individual patient care as well as health care systems. Thus, the reasons for, and end result of, remediation for both general clinical skills and medical professionalism have significant impact on outcomes such as patient safety, patient satisfaction, and quality of care.

Why remediate?

But if I accept you as you are, I will make you worse; however if I treat you as though you are what you are capable of becoming, I help you become that.

—Goethe

Despite individual and systems challenges inherent to each step of the process—from identification of low performers, to accurately describing the deficit, to designing a remediation program, to measuring outcomes—there are important reasons to pursue this path for learners in need. Medical educators hold a dual responsibility to their learners and to their learners’ future patients. To fulfill the responsibility to their learners, educators must begin with the belief that each person has the ability to improve his or her performance and has the right to receive feedback and guidance that contribute to continued development as a professional. To fulfill the responsibility to their learners’ future patients, educators need to assess learner performance with objectivity, apply skillful communication with courage to describe any deficits, and commit to participate in remediation whenever appropriate.

The following section describes a step-wise strategy for remediation drawn from lessons learned from clinical skills that can be adapted and applied to work in medical professionalism.

A sample remediation strategy in five steps:

Closing the gap between performance and expectations

This five-step strategy, beginning with identification of the deficit and ending with measurement of outcomes after remediation, can be used to frame the approach to helping learners with lapses in medical professionalism.

Step 1: Early Identification

The first step calls for early identification of learner deficits—a challenge for educators. As noted earlier, faculty can be reluctant to point out trainee problems for a number of reasons, including lack of documentation, lack of knowledge of what to document, the anticipation of a negative experience with an appeals process, and lack of remediation options.10 Furthermore, medical educators are invested in the success of their learners, and cognitive psychologists have demonstrated that commitment to a process (e.g., teaching) can result in a higher likelihood of believing in positive results (i.e., learner competence) even if evidence exists to the contrary.19 This belief may tempt educators to search for, or accept, situational reasons for poor performance from their learners. But to consistently achieve optimal learner performance, it is important to keenly differentiate between one-time contextual events and patterns of repeated low performance that point to a need for additional guidance.

The importance of early identification is confirmed by studies demonstrating that performance deficits, if not identified and addressed, tend to persist. Klamen et al. described a statistically robust correlation between low performance in clinical skills examinations in year two and in year four (OR 20, p=0).4 Chang et al. reported that communication and professionalism deficits noted in core clerkship ratings (OR 1.79, p=0.008) and student progress review meetings (OR 2.64, p=0.002) predict failure in year four clinical skills examinations.3 Studies show that early identification allows learners who need additional guidance the time and opportunity to develop and enact targeted learning plans to improve performance.20

When confronted with data of low performance after high-stakes assessments in the later years of school, students often ask: “Why didn’t you tell me this earlier?” Studies confirm that educators do have data to identify learners with a pattern of professionalism lapses, and that sustained difficulties tend to track over time.3,21 While educators may wish to believe that silence is kinder or allows learners more time to improve on their own, this erroneous assumption can actually hurt both learners and their future patients. Thus, early identification is an important first step.

Step 2: Objective data

This step identifies the gap by using objective measurement tools to compare the learner’s performance with expected milestones. For medical knowledge and clinical skills, a number of assessment tools are routinely used, including written examinations, oral examinations, simulated and real patient OSCEs, global ratings, direct observation, portfolios, and 360° evaluations.22 Fifty-five tools have been identified for direct observation of clinical skills with real patients alone.23 Approximately eighty to ninety percent of U.S. medical schools conduct simulated clinical skills examinations in years two, three, or four.1 The majority (80%) use standardized patient checklists, and some (21%) use faculty checklists or global assessment scales to score clinical encounters.2 Most (60%) use normative grading strategies, with the rest using criterion-referenced (21%) or combined (18%) approaches.2

While there are flaws and challenges with any single assessment tool, there are important reasons to apply a combination of assessment tools to determine performance in every competency domain.22 First, multiple groups of assessors (e.g., teachers, peers, standardized patients, real patients, clinical staff) can identify similar deficits in learners using different assessment tools at different times.3 Second, even assessment tools designed primarily for measuring performance in one domain (e.g., clinical reasoning in a clinical skills examination) can identify lapses in performance in other domains (e.g., fabrication of clinical findings pointing to lapses in knowledge and professionalism).15,24 To move successfully to the next step in this remediation strategy, it is important that the educator and the learner use the same data to establish agreement about the gap between performance and expectations.

Step 3: Shared understanding

After objective data establishes a gap between performance and expectations, the learner and the educator begin the process of building a shared understanding. Educators begin with the knowledge that learners need data and guidance—not just data alone—to identify their deficits and learning plans.25 This guidance begins with a one-on-one meeting between the student and the educator in which a conversation about performance is built on a foundation of rapport, trust, and support.13 Important techniques include listening, summarizing, responding to emotions, expressing support, and redirecting towards the learning objective. Some sample words to use include:

  • “We are meeting today to discuss your performance in . . .”
  • “What is your interpretation of . . .”
  • “Here are some additional perspectives on . . .”
  • “May we agree to work on improving . . .”

Some educators assume that learners will be able to correctly identify their learning needs and develop corrective plans on their own if given numerical and narrative evidence of low performance and even comparative cohort data. But in one study, only half of all students who failed a high-stakes clinical skills examination in the area of communication skills were able to develop learning goals in that area without faculty guidance, even after receiving individual and comparative examination results indicating failing performance, in both qualitative and quantitative formats.25 This has potential implications for remediation of professionalism lapses. Reaching agreement on a shared definition of low performance in communication and professionalism skills can become mired in gray areas perceived as subjective, personal, emotional, and challenging. With low-performing learners, educators cannot simply deliver performance data and leave learners alone to determine accurate next steps for improvement without guidance.26,27

Step 4: Learning plans

The strategy continues with a focus on two aspects of the learning plan: writing effective learning plans, and putting them into action.

After the educator and the learner have established a shared understanding of the learning gap as well as explored the need to develop targeted learning plans, the learner should be encouraged to initiate the process of drafting learning plans.25 This step is important in reinforcing learner ownership and commitment, as well as demonstrating to educators where learners are starting from in their understanding and synthesis of the information thus far. One common acronym is SMART, indicating that effective learning plans are specific, measurable, attainable, relevant, and time-bound.28 Sample ineffective learning plans might be: “I will read more” or “I will not be late.” More effective learning plans are specific (e.g., “I will practice X skill”), measurable (e.g., “with the next three patients to achieve Y performance level”), and time-bound (e.g., “over the next two weeks”). One study demonstrated that ninety-six percent of fourth-year students write specific learning goals with minimal written instructions.25 However, without guidance, learners may not choose to write learning plans that address the most important areas of deficit, or may not know how to develop an effective plan to address target deficits.25

Putting learning plans into action may include sequential or multi-pronged approaches of deliberate practice in different formats and settings. Strategies include meetings between the faculty member and the learner for role play and practice, additional or elective clinical experiences in environments that allow skills building, standardized patient cases in simulated clinical skills environments, peer learning, small-group observation and feedback, and others.2,6,27,29–31

Many U.S. medical schools employ group learning activities for deliberate practice in the context of remediation.6 Peer learners, even those who have additional learning needs themselves, are effective teachers and feel safe in small group settings in the context of remediation.29 While educators’ confidence in their own ability to help learners remediate is generally low and is lowest for professionalism (2.96 on a scale of 1 = strongly disagree to 5 = strongly agree), their confidence increases with group practice options for learners.6 In other words, when faculty are able to access group learning activities as a tool for remediation, they feel more confident participating in remediation for learners with lapses in professionalism. Learners also may perceive feedback from peers as being more authentic, less threatening, and more understandable. Observation shows that learners have different strengths and weaknesses, and often complement each other when learning in small groups. Perhaps simply sharing the task of remediation in the form of group activities builds a learning community and decreases the resistance and activation energy needed for identification and remediation.27

Step 5: Measuring outcomes

The final step of this strategy is the measurement of outcomes after remediation. This remains a challenging task in every competency domain. The precise definition of developmentally appropriate performance goals, assessment tools, and standard setting strategies can seem to be elusive moving targets.

Approximately seventy-five percent of schools report retesting after clinical skills remediation with the same or different standardized patient examination cases.6 However, many repeat examinations are less complex or more targeted in an effort to assess for minimum competence. The complexities of different standard-setting strategies likely also affect individual outcomes. With normative standard setting, educators find it challenging to choose the most appropriate cohort for comparison. Since the repeat examination often differs from the original, educators are also challenged to apply criterion-based scoring strategies to a retest taken by only a few learners, because defining appropriate cutoff scores can require an intensive time investment from a group of experts.8 And finally, educators debate which competency-based education frameworks to apply: developmentally progressive milestones, non-overlapping competencies, integrated entrustable professional activities, or combinations of these.32 Thus, while somewhat cleaner measurement tools exist for clinical skills performance, more science is needed in both clinical skills and professionalism in the assessment of remediation outcomes.

Summary: The science, the art, and the unknown of remediation

The science of remediation

The literature of remediation in medical education has been active since 2000, yet the science is still young. In the area of clinical skills remediation, scholarship has included surveys of medical school remediation processes, systematic reviews of remediation programs, books with expert recommendations, studies identifying early predictors of struggling learners, and limited outcomes data. In terms of timeline, the domain of clinical skills may be somewhat ahead of that of medical professionalism in building a robust body of work on assessment tools and remediation strategies. However, the progress made in defining medical professionalism lays the groundwork for next steps in practice and research, which may include development and validation of assessment tools, studies of remediation strategies, and descriptions of learner and systems outcomes.

The art of remediation

Success factors in the practice of remediation are rooted in the human experience of learning and achievement. Early identification of struggling learners is critical because it allows for early intervention, which is often fruitful. The educator and learner begin with a one-on-one meeting to establish trust, safety, and shared goals. They then agree on performance data, performance expectations, and learning plans. One recommendation worth considering is framing the process as guidance for continuous improvement of performance rather than as remediation for failing performance. Educators could describe an invitation-only guidance program aimed at improving the learner’s future performance. This simple reframing can help learners to begin with an open mind for learning rather than dwelling on blame or shame. A second recommendation is to challenge the learner to actively initiate and own the process of learning. One example is having learners write and revise their own learning plans with faculty guidance along the way. Sometimes educators can be so eager to teach that they take over learner tasks in active learning. Setting expectations of active learning prevents the occasional surprising discovery, at the end of intense teaching, of how little the passive learner has retained.

The unknown of remediation: Shared challenges between remediation of clinical skills and professionalism lapses

Finally, there are remarkable parallels between the domains of clinical skills and medical professionalism in what remains to be learned in remediating learners or clinicians whose performance is below expected competence. These questions remain:

  • What is the deficit?
  • Do we aim to change the learner’s attitude, or is changing behavior sufficient?
  • What are effective strategies to guide learning in remediation?
  • How do educators reserve time for remediation in the core curriculum?
  • How do we systematically document performance after deliberate practice?
  • What data contribute to reassessment other than absence of negative reports?
  • What if improvement is not consistent across settings or over time?
  • What is the end point of remediation?
    • For example, is it when the learner demonstrates adequate performance once, or more than once? In one context, or in more than one?
    • Is it when educators have built such a robust scaffold to get these learners barely over the threshold of competence that adequate performance cannot be sustained in a busy health care system?
    • Or is remediation over when we find out that the behavior cannot be changed?

The future of remediation in the field of medicine should include studies of effectiveness of remediation strategies, data on long-term learner and patient outcomes after remediation, and the development of comprehensive systems approaches to professional development that cross silos of competency domains or course structures. In addition, participating in remediation may be an opportunity for learners to gain insight into generalizable ways to improve performance. Ultimately, effective learning programs initially developed for remediation could be expanded to improve everyone’s performance with individual learning plans in all competency domains, and not just those who have already demonstrated lapse or failure. In this way, remediation programs would become one part of a whole system of competency-based learning and assessment in the continuum of lifelong learning, from undergraduate medical education, to graduate medical education, to clinical practice.

Conclusions

Effective practices of guidance and remediation for clinical skills and medical professionalism are important to medical education and clinical medicine. Lessons learned and practical strategies from clinical skills remediation can be adapted and applied to guidance of those with professionalism lapses. Systematic approaches to remediation in the domain of medical professionalism would move the field forward in fulfilling our duty to our colleagues and our patients.

Acknowledgments

Dr. Chang’s work on clinical skills remediation was supported by the University of California San Francisco (UCSF) School of Medicine, the UCSF Haile T. Debas Academy of Medical Educators Innovations Funding Program, and the UCSF Medical Education Research Fellowship. Her work in clinical skills assessment is informed by participation as a member school faculty in the California Consortium for the Assessment of Clinical Competence, and as faculty on the Test Materials Development Committee for the National Board of Medical Examiner’s USMLE Step 2 Clinical Skills Examinations.

References

  1. Hauer KE, Hodgson CS, Kerr KM, et al. A national study of medical student clinical skills assessment. Acad Med 2005; 80 (10 Suppl): S25–29.
  2. Hauer KE, Teherani A, Irby DM, et al. Approaches to medical student remediation after a comprehensive clinical skills examination. Med Educ 2008; 42: 104–12.
  3. Chang A, Boscardin C, Chou CL, et al. Predicting failing performance on a standardized patient clinical performance examination: The importance of communication and professionalism skills deficits. Acad Med 2009; 84 (10 Suppl): S101–104.
  4. Klamen DL, Borgia PT. Can students’ scores on preclerkship clinical performance examinations predict that they will fail a senior clinical performance examination? Acad Med 2011; 86: 516–20.
  5. Chang A, Chou CL, Teherani A, Hauer KE. Clinical skills-related learning goals of senior medical students after performance feedback. Med Educ 2011; 45: 878–85.
  6. Saxena V, O’Sullivan PS, Teherani A, et al. Remediation techniques for student performance problems after a comprehensive clinical skills assessment. Acad Med 2009; 84: 669–76.
  7. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002; 287: 226–35.
  8. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet 2001; 357: 945–49.
  9. Papadakis MA. The Step 2 clinical-skills examination. N Engl J Med 2004; 350: 1703–05.
  10. Dudek NL, Marks MB, Regehr G. Failure to fail: The perspectives of clinical supervisors. Acad Med 2005; 80 (10 Suppl): S84–87.
  11. Bing-You RG, Trowbridge RL. Why medical educators may be failing at feedback. JAMA 2009; 302: 1330–31.
  12. Chou CL, Masters DE, Chang A, et al. Effects of longitudinal small-group learning on delivery and receipt of communication skills feedback. Med Educ 2013; 47: 1073–79.
  13. Chang A, Chou CL, Hauer KE. Clinical skills remedial training for medical students. Med Educ 2008; 42: 1118–19.
  14. Hauer KE, Ciccone A, Henzel TR, et al. Remediation of the deficiencies of physicians across the continuum from medical school to practice: A thematic review of the literature. Acad Med 2009; 84: 1822–32.
  15. Teherani A, O’Sullivan P, Lovett M, Hauer KE. Categorization of unprofessional behaviours identified during administration of and remediation after a comprehensive clinical performance examination using a validated professionalism framework. Med Teach 2009; 31: 1007–12.
  16. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ 2005; 39: 1176–77.
  17. Irvine D. The performance of doctors. I: Professionalism and self regulation in a changing world. BMJ 1997; 314: 1540–42.
  18. Irvine D. The performance of doctors. II: Maintaining good practice, protecting patients from poor performance. BMJ 1997; 314: 1613–15.
  19. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011.
  20. Klamen DL, Williams RG. The efficacy of a targeted remediation process for students who fail standardized patient examinations. Teach Learn Med 2011; 23: 3–11.
  21. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med 2005; 353: 2673–82.
  22. Epstein RM. Assessment in medical education. N Engl J Med 2007; 356: 387–96.
  23. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA 2009; 302: 1316–26.
  24. Friedman MH, Connell KJ, Olthoff AJ, et al. Medical student errors in making a diagnosis. Acad Med 1998; 73 (10 Suppl): S19–21.
  25. Chang A, Chou CL, Teherani A, Hauer KE. Clinical skills-related learning goals of senior medical students after performance feedback. Med Educ 2011; 45: 878–85.
  26. Durning SJ, Cleary TJ, Sandars J, et al. Perspective: Viewing “strugglers” through a different lens: How a self-regulated learning perspective can help medical educators with assessment and remediation. Acad Med 2011; 86: 488–95.
  27. Steinert Y. The “problem” learner: Whose problem is it? AMEE Guide No. 76. Med Teach 2013; 35: e1035–45.
  28. Hamilton M. Putting words in their mouths: The alignment of identities with system goals through the use of Individual Learning Plans. Brit Educ Res J 2009; 35: 221–42.
  29. Chou CL, Chang A, Hauer KE. Remediation workshop for medical students in patient-doctor interaction skills. Med Educ 2008; 42: 537.
  30. Cleland J, Leggett H, Sandars J, et al. The remediation challenge: Theoretical and methodological insights from a systematic review. Med Educ 2013; 47: 242–51.
  31. Kalet A, Chou CL. Remediation in Medical Education: A Mid-Course Correction. New York: Springer; 2013.
  32. Pangaro L, ten Cate O. Frameworks for learner assessment in medicine: AMEE Guide No. 78. Med Teach 2013; 35: e1197–210.

 


Updated on May 15, 2015.


© 2017 Alpha Omega Alpha Honor Medical Society