Autonomy, the ability to independently care for patients and perform procedures, has long been an important theme in surgical education and is an essential element of the transition from supervised training to independent practice. Historically, the surgical apprenticeship model granted trainees a high level of autonomy, and residents functioned with little oversight. Over the last few decades, however, concerns about patient safety and risk management have driven changes in billing regulations and duty-hour restrictions. Consequently, the amount of direct supervision has increased, and the perceived level of autonomy for surgical trainees has declined.
Creating innovative ways to cultivate autonomy within these constraints is imperative. This article provides a framework for promoting autonomy in surgical training. The framework includes structured clinical opportunities, simulation, and feedback, as well as the foundational paradigms of competency-based education and self-directed learning (see Figure 1).
Figure 1. Foundations for resident autonomy
Structured clinical opportunities
Perhaps the most intuitive way to increase autonomy for surgical trainees is to develop a curriculum that includes structured clinical opportunities with increased levels of autonomy. These opportunities can exist both inside and outside the operating room (OR).
Outside the OR, residents must learn to manage patients pre- and postoperatively in both the inpatient and outpatient settings. Structured clinics and resident-run services have proven effective at fostering resident autonomy in a manner that is safe for patients. Wojcik and colleagues reported on a resident-run acute minor surgery clinic in which third-year residents independently assessed patients and performed in-office minor procedures.1 Rates of complications were similar in resident- and attending-run clinics, and residents responded positively to the level of independence that the clinic provided. Similarly, residents reported higher satisfaction and autonomy on a resident-run acute care surgery service in which senior residents managed both primary and consult patients.2 Not only did the residents evaluate and manage the patients, but they ultimately acted as teaching assistants to junior residents in cases that the service generated, bolstering their operative independence and improving their skills as educators.
Restructuring an outpatient clinic or surgical service may be relatively easy to accomplish, but providing formalized opportunities for increased resident autonomy in the OR is more difficult because of hospital policies and diversity in case type, volume, and complexity.3 Carefully implemented pilots, however, have proven successful. Wojcik and colleagues published their institutional experience with a rotation featuring embedded structured operative autonomy for general surgery cases. During a one-month rotation, chief residents began cases alone, assisted by a junior resident, using their own operative block time; the attending supervised the critical portion of the operation.4 Patient outcomes after appendectomy and cholecystectomy, which made up the majority of cases, were comparable to those of conventionally supervised cases, although operative time was, on average, 22 minutes longer. In follow-up surveys, all eight participants agreed that the rotation eased their transition to fellowship or independent practice.
The authors attributed the successful implementation of this endeavor to several factors: multidisciplinary input from stakeholders; voluntary participation by chief residents; identification of appropriate cases; clear expectations about attending involvement; block time for residents that did not interfere with attending efficiency; full disclosure to patients during operative consent; and tracking of patient outcomes.
Simulation
The role of simulation in surgical training as an adjunct to time in the OR is well established. Over the past decade, graduating residents have performed 20–25 percent fewer cases, and fellowship program directors have observed a corresponding decrease in operative confidence.5 A growing number of programs are incorporating simulation-based technical and nontechnical skills training early in surgical residency to help counteract this trend. Furthermore, simulation-based curricula, including Fundamentals of Laparoscopic Surgery and, more recently, Fundamentals of Endoscopic Surgery, are now required by the American Board of Surgery for certification.
Preoperative simulation is an effective adjunct to operative training in the era of restricted work hours. A mastery-based simulation curriculum for robotic pancreatic resections, designed for surgical oncology fellows at the University of Pittsburgh, PA, and comprising three- to eight-hour modules, resulted in significant, measurable improvement in robotic skills. This study highlights the utility of a mandatory simulation-based curriculum, particularly for highly complex, low-volume specialties such as hepatobiliary surgery.6 Shetty and colleagues studied resident perceptions and preferences regarding laparoscopic training across two institutions and found that 95.5 percent of respondents believed simulation improved their skills and that those skills were transferable to the OR; 56.7 percent believed proficiency in a simulation curriculum should be mandatory before performing OR cases.7 Preoperative surgical rehearsal on fresh cadaveric tissue has likewise improved resident confidence and perceptions of speed, safety, technique, and anatomical knowledge.5 Skills gained outside the OR in these ways can facilitate autonomy in the OR.
Simulation in surgical training has been used not only to improve technical performance but also for overall team building to improve patient care. Nontechnical skills, including cognitive, social, and personal skills, have been shown to play a role in the recognition and management of adverse surgical events.8 Rao and colleagues showed that nontechnical and technical scores of postgraduate year one residents performing gastrojejunostomy in simulated settings improved significantly after a live demonstration on a porcine model by an attending surgeon followed by training tasks designed to teach teamwork-related skills.8
Competency-based education
Throughout medical education, there is growing interest in measuring competency, an approach that can help facilitate greater degrees of autonomy for learners. Competency in and of itself, however, is difficult to define, with potential for significant variability between assessors. Although competency-based curricula are not a new concept in surgical education,9 more recent formulations of this approach have been published in the form of entrustable professional activities (EPAs). Defined as specific behaviors that a competent practitioner should be able to perform with an appropriate degree of skill and consistency, EPAs provide a framework for competency and seek to bridge the gap between theory and clinical practice.10 In May 2014, the Association of American Medical Colleges released a set of 13 EPAs that medical students should be able to perform upon entering residency. With domains ranging from patient care to professionalism and interpersonal and communication skills, each EPA has a five-point entrustment and supervision scale with autonomy anchors.11 The principle of such a system is that once a trainee reaches a given level within a specific domain, that skill can then be transferred to other clinical scenarios. EPAs thus allow entrustment decisions to confer designated levels of autonomy without the need for recurrent assessment.
Questions have been raised regarding the implementation of EPA curricula, specifically how assessments of these entrustable activities should be performed. Simulated scenarios, observation of direct patient care, and self-assessment have all been described, but the most efficacious strategy remains unclear.12-14 An additional concern is the potential discrepancy between residents' self-reported EPA performance and the performance observed by supervising physicians.15 These concerns reveal the need for standardized assessment protocols before widespread implementation of EPAs. A five-year pilot study evaluating the implementation of EPAs in 10 U.S. medical schools is under way and is anticipated to provide clarity and future direction for this framework.16 Regardless of the final structure of EPAs, this method of specifying levels of graduated autonomy clearly has gained traction in medical education.
Although EPAs initially were developed for undergraduate medical education, work increasingly is being published describing the creation and implementation of unique EPAs for graduate medical training programs. A series of 45 unique EPAs was created using a modified Delphi method in anesthesiology training programs in the Netherlands.17 Similarly, the Alliance for Academic Internal Medicine has published 16 EPAs that graduating internal medicine residents are expected to fulfill to enter unsupervised clinical practice.18 Other groups also have reported attempts to integrate EPAs with current ACGME milestones, signaling the crucial role EPAs will play in the assessment and advancement of graduate medical trainees.19 A pilot of two general surgery EPAs will be implemented at 26 sites in 2018–2019.
The next logical step in the implementation of EPAs is the use of this framework to grant objectively graduated autonomy during residency training. Dissonance still exists, however, between EPA performance, faculty evaluations, and granted autonomy, suggesting the influence of individual interactions and perception upon structured competency scales such as these.20 Nevertheless, EPAs represent a growing framework that will likely structure future national discussions regarding competency and autonomy during residency training.
Feedback
Feedback is essential to the effectiveness of competency-based training. The American Board of Surgery requires trainees to complete at least six operative and six clinical performance assessments to achieve certification. Informal feedback may come in verbal form, such as telling residents they did a good job or that a procedure will become easier with experience. To be effective, however, feedback must be specific and delivered immediately or soon after the observed interaction.21 For residents, feedback provides insight into what they did well and where they can improve.
Many tools designed to help structure feedback already exist, including the Objective Structured Assessment of Technical Skill (OSATS), the Global Operative Assessment of Laparoscopic Skills (GOALS), the Zwisch scale, and the Ottawa Surgical Competency Operating Room Evaluation (O-SCORE).
OSATS was developed at the University of Toronto, ON, to assess technical skills in a standardized fashion. In the original model, skills were assessed in both bench and live animal models using two instruments: task-specific checklists for specific procedures and a global rating scale (GRS).22 The GRS component of OSATS has been shown to be reliable and valid in multiple studies.23
The GOALS tool was developed specifically for laparoscopic procedures and likewise uses a checklist and a GRS. The developers of GOALS acknowledged that their GRS was superior to their checklist in evaluating technical skills, confirming other studies showing that GRSs outperform checklists for skills assessment.24 This conclusion is logical: checklists only confirm that tasks have been performed; they do not gauge how well the tasks were completed.
The Zwisch scale is a summative feedback tool that provides a valid and reliable method to rate competence and autonomy. The four-level scale, from “show and tell” to “supervision only,” offers residents and faculty a “common vocabulary for describing stages of development and what is required to earn autonomy.”25,26
The O-SCORE is a surgical assessment tool designed for quick evaluation of competence in any procedure and has been studied in both general surgery and orthopaedic surgery residents. The O-SCORE is similar to the Zwisch scale in that it rates the trainee's independence, but it applies that rating to multiple items, including knowledge of steps, technical performance, postoperative plan, efficiency, and flow.27
Although these tools provide assessments of technical skills, they do not provide a platform for documenting feedback and, therefore, may be of limited effectiveness. An electronic system has been designed to overcome the inefficiency of paper-based feedback documentation. SIMPL, the System for Improving and Measuring Procedural Learning, is a smartphone app built on the Zwisch scale. The app automatically prompts one participant in an operation to complete an evaluation as soon as the other participant completes an assessment. The mobile platform is convenient for feedback documentation because data are stored and can be accessed to measure longitudinal progression.26 For faculty, reviewing accumulated feedback on a particular resident allows tailored instruction and perhaps a framework for guiding further autonomy in future cases.
Self-directed learning
Surgical educators are increasingly recognizing the importance of actively fostering self-regulated learning among trainees. The concepts of self-regulated learning form the foundation for much of the ongoing work related to competency-based education and resident autonomy. The principles of self-regulated learning, which are related to adult learning theory, highlight the importance of the surrounding environment, the structure of the material, and the role of the teachers or coaches in maximizing learning outside of a classroom setting.28 A key element is the practice of active reflection by an individual on their own performance and deliberate setting of goals related to improvement of performance.
Although self-regulated learning is an implicit part of training and ongoing clinical development after the transition to independent practice, it has rarely been recognized or taught as a specific skill.29 Recently, certain specialties have adopted tools and techniques for self-regulated learning, recognizing that the transition to independent practice at the end of residency or fellowship also is the moment when the responsibility for lifelong learning is fully transferred to the individual clinician. For example, in pediatrics, residents are required to develop and formally document their progress on individualized learning goals annually. While the use of similar goal-setting exercises is increasing across the medical education continuum, the impact of individualized learning plans on resident or patient outcomes has not been rigorously evaluated, and challenges in meaningful use and implementation have been recognized.30
The concepts of self-regulated learning are particularly applicable to procedural skills and form the theoretical underpinnings for techniques such as simulation. Another technique that has recently attracted attention is surgical coaching, in which surgeons are formally paired with expert mentors who review technical performance and provide feedback in structured, ongoing sessions.31,32 A key element of the coaching model is a facilitated opportunity for self-evaluation, which is crucial to effective and efficient self-directed learning. Although recent work has been performed primarily in the context of continuing medical education, many of the findings are readily translatable to the graduate medical education environment.
Conclusion
Increasing autonomy in the OR cannot begin abruptly with a chief resident serving as the primary surgeon. The strategies described in this article build a foundation that supports increased autonomy throughout training. At the base of the pyramid in Figure 1 is a competency-based curriculum. Structured clinical opportunities, such as resident-run services and clinics, provide real-life opportunities to practice autonomous behaviors. Frequent feedback identifies areas for growth that can be addressed and enhanced through simulated exercises. Finally, self-directed learning allows for fine-tuning of skills. These principles cannot be implemented without institutional buy-in, faculty development, transparency, and iterative assessment of the system.
References

1. Wojcik BM, Fong ZV, Patel MS, et al. The resident-run minor surgery clinic: A pilot study to safely increase operative autonomy. J Surg Educ. 2016;73(6):e142-e149.
2. Kantor O, Schneider AB, Rojnica M, et al. Implementing a resident acute care surgery service: Improving resident education and patient care. Surgery. 2017;161(3):876-883.
3. Sandhu G, Magas CP, Robinson AB, Scally CP, Minter RM. Progressive entrustment to achieve resident autonomy in the operating room: A national qualitative study with general surgery faculty and residents. Ann Surg. 2017;265(6):1134-1140.
4. Wojcik BM, Fong ZV, Patel MS, et al. Structured operative autonomy: An institutional approach to enhancing surgical resident education without impacting patient outcomes. J Am Coll Surg. 2017;225(6):713-724.e2.
5. Weber EL, Leland HA, Azadgoli B, Minneti M, Carey JN. Preoperative surgical rehearsal using cadaveric fresh tissue surgical simulation increases resident operative confidence. Ann Transl Med. 2017;5(15):302.
6. Hogg ME, Tam V, Zenati M, et al. Mastery-based virtual reality robotic simulation curriculum: The first step toward operative robotic proficiency. J Surg Educ. 2017;74(3):477-485.
7. Shetty S, Zevin B, Grantcharov TP, Roberts KE, Duffy AJ. Perceptions, training experiences, and preferences of surgical residents toward laparoscopic simulation training: A resident survey. J Surg Educ. 2014;71(5):727-733.
8. Rao R, Dumon KR, Neylan CJ, et al. Can simulated team tasks be used to improve nontechnical skills in the operating room? J Surg Educ. 2016;73(6):e42-e47.
9. McGaghie W, Miller G, Sajid A, Telder T. Competency-Based Curriculum Development in Medical Education: An Introduction. Geneva, Switzerland: World Health Organization; 1978. Available at: www.who.int/iris/handle/10665/39703. Accessed June 22, 2018.
10. Englander R, Flynn T, Call S, et al. Toward defining the foundation of the MD degree: Core entrustable professional activities for entering residency. Acad Med. 2016;91(10):1352-1358.
11. Association of American Medical Colleges, Chen HC, van den Broek WES, Ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90(4):431-436.
12. Thompson L, Leung C, Green B, et al. Development of an assessment for entrustable professional activity (EPA) 10: Emergent patient management. West J Emerg Med. 2017;18(1):35-42.
13. Hauer KE, Soni K, Cornett P, et al. Developing entrustable professional activities as the basis for assessment of competence in an internal medicine residency: A feasibility study. J Gen Intern Med. 2013;28(8):1110-1114.
14. Klapheke M, Johnson T, Cubero M. Third- and fourth-year medical student self-assessment in entrustable professional activities. Med Sci Educ. 2017;27(2):1-6.
15. Pearlman RE, Pawelczak M, Yacht AC, Akbar S, Farina GA. Program director perceptions of proficiency in the core entrustable professional activities. J Grad Med Educ. 2017;9(5):588-592.
16. Lomis K, Amiel JM, Ryan MS, et al. Implementing an entrustable professional activities framework in undergraduate medical education: Early lessons from the AAMC core entrustable professional activities for entering residency pilot. Acad Med. 2017;92(6):765-770.
17. Wisman-Zwarter N, Van Der Schaaf M, Ten Cate O, Jonker G, Van Klei WA, Hoff RG. Transforming the learning outcomes of anaesthesiology training into entrustable professional activities. Eur J Anaesthesiol. 2016;33(8):559-567.
18. Caverzagie KJ, Cooney TG, Hemmer PA, Berkowitz L. The development of entrustable professional activities for internal medicine residency training. Acad Med. 2015;90(4):479-484.
19. Carraccio C, Englander R, Gilhooly J, et al. Building a framework of entrustable professional activities, supported by competencies and milestones, to bridge the educational continuum. Acad Med. 2017;92(3):324-330.
20. Wagner JP, Lewis CE, Tillou A, et al. Use of entrustable professional activities in the assessment of surgical resident competency. JAMA Surg. 2018;153(4):335-342.
21. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(Supplement):S70-S81.
22. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-278.
23. Ahmed K, Miskovic D, Darzi A, Athanasiou T, Hanna GB. Observational tools for assessment of procedural skills: A systematic review. Am J Surg. 2011;202(4):469-480.
24. Vassiliou MC, Feldman LS, Andrew CG, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190(1):107-113.
25. DaRosa DA, Zwischenberger JB, Meyerson SL, et al. A theory-based model for teaching and assessing residents in the operating room. J Surg Educ. 2013;70(1):24-30.
26. George BC, Teitelbaum EN, Meyerson SL, et al. Reliability, validity, and feasibility of the Zwisch scale for the assessment of intraoperative performance. J Surg Educ. 2014;71(6):e90-e96.
27. Gofton WT, Dudek NL, Wood TJ, Balaa F, Hamstra SJ. The Ottawa Surgical Competency Operating Room Evaluation (O-SCORE): A tool to assess surgical competence. Acad Med. 2012;87(10):1401-1407.
28. Taylor DCM, Hamdy H. Adult learning theories: Implications for learning and teaching in medical education: AMEE Guide No. 83. Med Teach. 2013;35(11):e1561-e1572.
29. Nothnagle M, Anandarajah G, Goldman RE, Reis S. Struggling to be self-directed: Residents’ paradoxical beliefs about learning. Acad Med. 2011;86(12):1539-1544.
30. Lockspeiser TM, Kaul P. Using individualized learning plans to facilitate learner-centered teaching. J Pediatr Adolesc Gynecol. 2016;29(3):214-217.
31. Greenberg JA, Jolles S, Sullivan S, et al. A structured, extended training program to facilitate adoption of new techniques for practicing surgeons. Surg Endosc. 2018;32(1):217-224.
32. Min H, Morales DR, Orgill D, Smink DS, Yule S. Systematic review of coaching to enhance surgeons’ operative performance. Surgery. 2015;158(5):1168-1191.