



© Springer International Publishing AG 2018
Paul J. Dougherty and Barbara L. Joyce (eds.), The Orthopedic Educator. https://doi.org/10.1007/978-3-319-62944-5_7


7. Orthopaedic Resident Assessment: Measuring Skills in Both Knowledge and Technical Skills



S. Elizabeth Ames (1) (Corresponding author), Nathaniel Nelms (2), and Donna Phillips (3)


(1)
Associate Professor and Program Director, Department of Orthopaedics and Rehabilitation, The Robert Larner, MD College of Medicine, University of Vermont, 95 Carrigan Drive, Robert T. Stafford Hall, 4th Floor, Burlington, VT 05405, USA

(2)
Assistant Professor, Department of Orthopaedics and Rehabilitation, The Robert Larner, MD College of Medicine, University of Vermont, 95 Carrigan Drive, Burlington, VT 05405, USA

(3)
Clinical Professor of Orthopaedic Surgery and Pediatrics, Chief of Pediatric Orthopedic Surgery, Bellevue Hospital, 462 1st Avenue CD489, New York, NY 10016, USA

 



 




Keywords
Orthopaedic resident, Competency-based education, Technical skill training, Medical knowledge, Milestones, In-training examination, Board examination, Feedback, Entrustable professional activities, Clinical assessment



Introduction


Graduate medical education has traditionally been steeped in experiential, workplace-based learning. Residents are taught and supervised by faculty with varying levels of expertise, and frequently work with senior residents for only short periods of time. Faculty have always needed to model, teach, assess, and remediate orthopaedic residents, and generally we have done this well. Chapter 3 outlines the shift to competency-based education; this is familiar ground, as our work has always been to create and certify competence in surgical trainees. In terms of assessment, though, this is a whole new world.

Most of us were trained in an environment of summative assessment: pass/fail decisions that determine whether a minimum criterion has been achieved. Educational standards now require a transformational shift in curriculum and assessment, and the movement today is toward formative education as we transition to competency assessment. This evolution must respond to broad national educational goals and accreditation requirements while continuing to meet evolving specialty content. Ultimately, we need a solid program of tools and educational structures designed to assess and inform learners as their skills develop, with the goal of improving both knowledge and technical skill.

Our challenge is to adjust to all of these changes without having encountered them in our own education. Meaningful resident assessment must meet the new metrics. The goal is defined: measuring the educational level of the learner, providing actionable feedback, and ultimately ensuring the learner transitions from novice to “competent.” The challenge for orthopaedic surgery programs is to understand which types of assessment are needed, to establish a real and measurable definition of competence for our specialty, and to choose the right tools to assess it. The purpose of this chapter is to consider the history of resident assessment, to present the current state of the art, and to create a resource that encourages faculty to look ahead to how things can be done differently as orthopaedic training continues to evolve.


Resident Assessment Overview: Competency-Based Assessment


All assessment tools and systems represent some form of compromise. The choice (and use) of any system will vary depending on a program’s needs and staff. Assessment systems can and should include multiple measures or tools that capture faculty opinions and provide feedback to the residents. The choice of the type and number of tools within a resident assessment system is entirely up to the individual program, with two exceptions: the ACGME’s Next Accreditation System mandates reporting milestone outcomes and reporting individual case volumes through the case log system. A review of these two systems and suggestions for other assessments follow.

Milestones are an example of a reporting system that requires direct assessment of residents. Orthopaedics was one of the first specialties involved in the creation of milestones, and many educational leaders within the specialty contributed to the effort. Our milestones are known for being particularly numerous, but are well focused within each area of the specialty. The tools themselves are relevant and useful. The key for each subspecialty is developing a reliable, consistent, and useful process for implementing the milestones to track and report educational outcomes. One specific challenge is to maintain a consistent reporting system in a busy clinical world. The quality of the assessments will vary if the program, faculty, or residents find the reporting tool overly burdensome or not particularly useful [1]. Techniques to avoid needless burden while maintaining utility are being explored in the UK, where a mandatory work-based assessment system exists [2]. The influence of the work environment, with its variation in social factors, competing tasks, and fatigue, also needs to be recognized [3]. Variability is part of any clinical rating scheme, and significant faculty development should be undertaken to reduce rater variability with the milestones.

Education of both faculty and residents is required to appropriately measure milestones. Faculty and resident perceptions significantly influence the utility of any assessment system; milestones provide an opportunity to give constructive feedback, which may be best delivered during a face-to-face conversation. The goal of milestones is to provide actionable feedback to the resident; this is lost if milestones are assigned without direct discussion of the assigned score. Comparative data for the first years of milestone implementation is now available from the ACGME (http://www.acgme.org/What-We-Do/Accreditation/Milestones/Overview), and specialty-specific data will be available in 2017.

The other required ACGME report is the log of operative cases completed by each resident each year. The standards set by the orthopaedic RRC are considered case minimums that describe exposure, and do not necessarily document technical competence. This report can and should be considered a reflection of the educational design of a residency program, and it ultimately allows comparison with other programs. Attention to the details of the educational process is important if these data are to accurately reflect the learning model. Comparative data between programs is available as part of the annual residency review process.

Milestones and case logs demonstrate the benefits of adding formative assessment to the traditional summative model, as they record a resident’s progress towards competence. The milestones depend on direct faculty supervision and link technical skill development to episodes of learning; hence faculty development plays an important role in both of these assessments. It is important to note that evaluators are more likely to use the full range of available scores when the scale is referenced to competence rather than to instructor expectations alone [4]. Competency-based assessment can be referenced to clear final educational goals, which improves validity, particularly when multiple observers use the same scale accompanied by periodic monitoring, assessment, and committee review [5]. This is the model behind the Clinical Competency Committee and other ACGME reporting structures.
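To make the committee-review model concrete, below is a minimal sketch, in Python, of how a program might pool several faculty ratings of a single milestone sub-competency and flag wide rater disagreement for Clinical Competency Committee discussion. The function name, the disagreement threshold, and the review rule are hypothetical illustrations, not part of any ACGME tool.

    # Hypothetical pooling of multi-rater milestone scores; not an ACGME tool.
    from statistics import mean, pstdev

    def aggregate_milestone(scores, disagreement_threshold=1.0):
        """Combine several faculty ratings of one sub-competency on the
        1-5 milestone scale (half levels allowed) and flag the result
        for committee review when raters diverge widely."""
        spread = pstdev(scores)  # population std. dev. across raters
        return {
            "mean": round(mean(scores), 1),
            "spread": round(spread, 2),
            "needs_committee_review": spread > disagreement_threshold,
        }

    # Example: four faculty rate the same resident on one sub-competency.
    print(aggregate_milestone([2.5, 3.0, 2.0, 4.5]))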

Finally, residency programs need to honestly assess issues such as a reluctance to fail trainees. Several strong factors contribute to this reluctance [6]: lack of confidence among faculty in their ability to detect failure, lack of documentation to support failure, lack of direct communication with the resident about the issues, and a real or perceived inability to remediate a failure. These issues can be addressed by including more points of assessment during resident training and avoiding reliance on just a few high-stakes evaluations. Work-based assessment is best done on multiple occasions by multiple evaluators. Developing this sort of structure, with points of view from many angles, creates a robust picture of trainee progress and the opportunity to focus on achieving optimal competency in all residents.


Resident Assessment Overview: Testing and Metrics


There are two traditional testing methods within orthopaedic surgery—the Orthopaedic In-Training Examination (OITE) administered by the AAOS and the two-part written and case-based examination administered by the ABOS. The ABOS and ACGME/RRC for Orthopaedic Surgery also mandated the introduction of simulation training for the PGY-1 in 2013. Fully developed modules are available on the ABOS website and will not be reviewed here.

Objective assessment is a challenge. Standardized written examinations continue to have value for both individual resident assessment and programmatic design. The OITE is a multiple-choice examination of approximately 275 questions across multiple subject areas, including orthopaedic basic science. It is now administered to over 4000 orthopaedic residents worldwide and was the first in-training examination created by any specialty. It is not administered in a standardized way, nor required of residencies, and therefore is not directly related to subsequent certification testing. It is widely used, however, and as the exam has developed, its reporting has become much more robust, with stratification by program type (allopathic or osteopathic) and year in training. It allows the individual resident to compare his or her progress with peers in the same year of training, and a program to track overall performance by subject content. It also has some utility as an approximation of relative performance on future certification exams.

There is literature exploring both the correlation between the OITE and other exams and the methods current orthopaedic residents use to prepare. In 2013, Evaniew et al. surveyed both program directors and residents in North America [7]. Focused preparation, the importance the program placed on the exam, and reported hours spent preparing predicted higher OITE scores relative to others in the same year of training. In recent studies, references cited for test questions came primarily from journal articles (74%) or textbooks (26%) [8, 9]. This may be shifting with the development of electronic resources [10]. Individual program standards and approaches to this test vary.

Studies have explored the relationship between OITE scores and ABOS Part 1 pass rates [11–13]. The exams are designed, administered, and scored differently; ABOS Part 1 is a pass/fail exam, though percentiles are reported to programs. Ponce et al. looked at subsection scores over 10 years at one program and found that the areas with greater content representation did correlate with ABOS performance, most notably in the PGY-3 and PGY-4 years [14]. They and others have concluded that the OITE can be used to help identify residents at risk of failing ABOS Part 1.

The larger question is how a program should respond to a low OITE score. The authors’ opinion is that every program should decide on an overall target metric relevant to historical performance data, and the standard should be clear to the residents. The one clear point is that poor performance often reflects a struggling resident, and early, customized exploration and intervention are useful.
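As one illustration of such a standard, the sketch below flags residents whose OITE percentile falls below a program-defined target so that customized exploration can begin early. The target value and the data layout are hypothetical; each program sets its own standard against its historical performance data.

    # Hypothetical screening rule against a program-defined OITE target.
    def flag_for_early_intervention(residents, target_percentile=30):
        """Return names of residents whose OITE percentile (within their
        PGY cohort) falls below the program's stated target."""
        return [r["name"] for r in residents
                if r["oite_percentile"] < target_percentile]

    roster = [
        {"name": "Resident A", "pgy": 2, "oite_percentile": 62},
        {"name": "Resident B", "pgy": 2, "oite_percentile": 18},
    ]
    print(flag_for_early_intervention(roster))  # -> ['Resident B']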


Nonoperative Skill Assessment


Evaluation of resident skills in the clinic has traditionally followed an apprenticeship model, in which relationships are deep and significant time is spent together. Dedicated teaching rounds are no longer common in orthopaedic programs, one indication that times are changing. Historically, verbal formative feedback was used to correct deficiencies, while formal written evaluation was not very instructive: at best, some verbal feedback reached the learner, and one or two written evaluations went to the program director during the course of a rotation. Optimizing feedback is of great importance for implementing competency-based education.

The current system of medical education can make sustained face-to-face interaction difficult. We still spend significant time face to face with a resident during an operative case, but the clinical environment poses other challenges. An orthopaedic resident’s ability to diagnose and manage patients in a nonoperative setting, such as clinic or hospital rounds, is as important as operative skill but has become difficult to teach and assess. Residents have varying interactions with multiple faculty members, often across different medical systems. Time spent being directly observed is limited by clinical demands on both parties. Tools to assess these skills are beginning to emerge.

Clinical skills such as history taking, physical examination, and informed consent are best assessed through multiple types of evaluations, including oral and written examinations, case-based discussions, work-based assessments, and simulated patient encounters [15, 16]. Using multiple types of evaluation is of particular value because no single task exists independently in the real world. It is important to avoid deconstructing tasks into a checklist of achievements; a checklist approach does not equate to true clinical competency [17]. These challenges are met by direct observation, which is a comfortable assessment tool for surgeons. Both specific tools and systematic approaches apply to direct observation.

Several tools are available to structure the evaluation of clinical skills. The goal is to provide real-time feedback to the learner in a constructive way while simultaneously building the groundwork for a summative assessment. Several tools demonstrate validity and are applicable to the evaluation of orthopaedic residents, including the Mini-Clinical Evaluation Exercise (Mini-CEX), the Clinical Assessment and Management Examination Outpatient (CAMEO), and the Ottawa Clinic Assessment Tool (OCAT). They are also reasonably easy to incorporate into a faculty development program. Each is described and referenced below.

The Mini-CEX is validated and is the most widely used tool for assessment of focused clinical encounters. The evaluation includes four measures: history taking, physical examination, clinical judgment and synthesis, and humanism. Each is scored on a numerical scale from unsatisfactory to superior, with an option to mark any category “insufficient contact to judge.” The original CEX evaluation was based on in-depth faculty observation of a resident’s comprehensive single-patient evaluation [18]. The Mini-CEX has been widely adopted by multiple specialties and has demonstrated validity [18–22].

The CAMEO is a modified version of the Mini-CEX designed to evaluate residents working specifically in a surgical clinic [23]. A 5-point scale is used to evaluate the following domains: test ordering and understanding, diagnostic acumen, history taking, physical examination, communication skills, and overall performance. Evaluators also note the chief complaint, presumptive diagnosis, and difficulty of the case for each assessment. The CAMEO is recognized as a valid form of resident assessment, and the American Board of Surgery (ABS) requires general surgery residents to have documented evaluations of observed patient encounters with either the Mini-CEX or the CAMEO. Both of these tools are available on the ABS website (http://www.absurgery.org/default.jsp?certgsqe_resassess).

The Ottawa Clinic Assessment Tool (OCAT) is similar in format to the CAMEO, but has a few important features that make it an attractive option for orthopaedics [4]. The OCAT is based on an entire day of clinic instead of a single observed encounter, and was specifically designed for a busy surgical clinic where a faculty member may observe a resident in several domains, without necessarily directly observing an entire patient encounter. There are nine areas of global assessment, a procedural skill question, a yes/no professionalism question, a yes/no for the ability to manage a clinic independently at a generalist level, and two open-ended questions regarding specific examples of something that went well and something that could use improvement. The areas of global assessment include history, physical exam, case presentation, differential diagnosis, management plan, communication skills, documentation, collaboration, and time management. Each of these areas is graded on a 1–5 scale relative to the goal of readiness for independent practice.
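For programs moving the OCAT onto an electronic platform, the published form maps naturally onto a simple record. The sketch below is an illustrative Python data model only, assuming the nine global domains and the yes/no items described above; it is not an official implementation of the tool.

    # Illustrative data model of the OCAT form; not an official implementation.
    from dataclasses import dataclass
    from typing import Dict, Optional

    GLOBAL_DOMAINS = [
        "history", "physical_exam", "case_presentation",
        "differential_diagnosis", "management_plan", "communication_skills",
        "documentation", "collaboration", "time_management",
    ]

    @dataclass
    class OCATEvaluation:
        global_ratings: Dict[str, int]           # each domain rated 1-5 against
                                                 # readiness for independent practice
        procedural_skill: Optional[int] = None   # single procedural skill question
        professional: bool = True                # yes/no professionalism question
        can_run_generalist_clinic: bool = False  # yes/no independence question
        went_well: str = ""                      # open-ended: a specific strength
        needs_improvement: str = ""              # open-ended: a specific gap

        def __post_init__(self):
            # Reject unknown domains and scores outside the 1-5 scale.
            for domain, score in self.global_ratings.items():
                if domain not in GLOBAL_DOMAINS or not 1 <= score <= 5:
                    raise ValueError(f"invalid rating: {domain}={score}")

    # Example after a full clinic day:
    form = OCATEvaluation(
        global_ratings={d: 4 for d in GLOBAL_DOMAINS},
        went_well="Efficient, focused histories",
        needs_improvement="Same-day documentation",
    )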

The basic premise behind most of these modern evaluation tools is the concept of Entrustable Professional Activities, described by Ten Cate in 2005 and reviewed recently [24]. This concept allows competencies to be operationalized and measured in a work environment, and reflects the judgment about a resident’s independence that is inherent in orthopaedic education. Assessment systems referenced to a level of competence that allows entrustable activities to occur have distinct relevance to orthopaedic education, as long as the reference points are relatively clear. On an entrustment scale like the OCAT, a resident can be assessed as able to make management decisions with some staff direction (3/5), while a score of 5/5 indicates that attending supervision was not required. This offers advantages over the more traditional poor-to-excellent numerical scales, which different raters can interpret and use differently: one rater might compare a resident to the “usual resident at that training level,” another to current peers, and another to what is expected at graduation. The ultimate goal of a competency-based assessment system is to reflect progressive independence as defined by the vast majority of faculty.


Direct Observation



The ACGME requires that residency programs teach and assess trainees in six core competencies: medical knowledge, patient care, interpersonal and communication skills, professionalism, systems-based practice, and practice-based learning and improvement. It is challenging to provide trainees with specific feedback on these competencies, and the last four are difficult to assess without direct observation. The nontechnical competencies are also crucial to providing patient-centered care (PCC), which depends on excellent communication skills and professionalism [25, 26]. The American Academy of Orthopaedic Surgeons (AAOS) developed a communication skills workshop focused on orthopaedic-specific scenarios [27, 28]. Unfortunately, it cannot be assumed that teaching and modeling communication skills and professionalism in a classroom results in incorporation of those skills into actual practice. Implementing a direct observation program has significant advantages in these areas.

Direct observation programs require tools. A systematic approach with a checklist individualized to a residency’s structure currently exists in orthopaedic and other training programs, and a user-friendly, broad electronic platform assists with reporting to both residents and administration. As with any assessment system, programmatic structure and faculty skill development are required. The scope of skills to be assessed is broad; options include elements of the clinical history, the physical exam, and review of data such as radiographs. The scope can be tailored to the specific rotation or environment, which improves efficiency and relevance. System-based tasks can be added, such as discussion of surgical risks and indications, and patient safety behaviors such as handwashing before and after contact with the patient. Overall impressions and real-time comments can both be incorporated, and trainee self-assessment prior to reviewing the faculty evaluation may also be included.
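Below is a minimal sketch of how such a checklist might be tailored per rotation. The item wording and rotation names are illustrative examples drawn from this section, not a validated instrument; each program would substitute its own.

    # Illustrative rotation-specific observation checklist; not a validated tool.
    BASE_ITEMS = [
        "Focused history obtained",
        "Relevant physical exam performed",
        "Radiographs reviewed and interpreted",
        "Hand hygiene before and after patient contact",
    ]

    ROTATION_ITEMS = {
        "adult_reconstruction": ["Surgical risks and indications discussed"],
        "pediatrics": ["Family included in the care discussion"],
    }

    def build_checklist(rotation):
        """Assemble the checklist for a rotation, pairing each item with
        fields for an observed yes/no, a comment, and a self-assessment."""
        items = BASE_ITEMS + ROTATION_ITEMS.get(rotation, [])
        return [{"item": i, "observed": None, "comment": "",
                 "self_assessment": None} for i in items]

    for entry in build_checklist("pediatrics"):
        print(entry["item"])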
