
First Training Simulation Study to Cross Orthopedic Sub-Specialties—and Institutions

Elizabeth Hofheinz, M.P.H., M.Ed.

The original meaning of “simulate” was “feign, pretend, assume falsely.”1 One assumes, however, that the word “pretend” is not so welcome in an OR.

But indeed, that is where surgical trainees often begin learning certain skills. And if you’re going to pretend, you might as well do it right.

Now, a team of orthopedic researchers from the University of Iowa is spearheading an effort to rectify the disconnect in residency training between simulation, surgical skills training, and operating room performance.

Geb Thomas, Ph.D. and Donald D. Anderson, Ph.D. have embarked on a mission to alter the field of orthopedic surgical training. Their overarching goal? Demonstrate that orthopedic surgical skill competence can be objectively, quantitatively, and reliably measured from behaviors observable in the fluoroscopy and videography already routinely collected in the OR, and then integrate this mode of evaluation into residency training.

Dr. Thomas, a Professor of Industrial and Systems Engineering, told OSN: “The American Board of Orthopaedic Surgery (ABOS) has been hesitant to prescribe simulation training methods for orthopedic residents, in part because some believe there is still insufficient evidence that such training can improve OR performance. In research funded by the ABOS, we and others have been gathering evidence that simulation enhances patient safety…and it is time to move this toward integration into residency training programs.”

Donald D. Anderson, Ph.D. is the Richard and Jan Johnston Chair in Orthopedic Biomechanics and Professor and Vice Chair of Research, Orthopedics and Rehabilitation at The University of Iowa. “When residency programs hear ‘simulation training’ they think, ‘Oh, here’s another unfunded mandate,’” said Dr. Anderson to OSN. “Surgical missteps ultimately cost thousands of dollars and can have serious, often chronic, repercussions for patients. Many would prefer to use very low-tech simulation approaches…involving hardware store purchases and/or fruit and vegetables from the grocery store…when they should focus more on investing in patient safety. Think about it…while proponents of simulation state that ‘a pilot is never permitted to fly a plane without having logged a certain number of hours in a simulator,’ they neglect to acknowledge how expensive those simulators are and how they came to become the standard.”

Are the logs lying?

Performance assessment is an important element of this equation. “Currently, residents demonstrate competency by logging how many cases they have done in different subspecialty areas. Here, they are limited by work hour restrictions and by the fact that the log contains only the cases they happen to come across during residency training. These logs are also notoriously limited in veracity. All they say, basically, is ‘I did something.’ What they don’t tell us is how much help the person required in order to do the task or what their actual involvement in the case was.”

Put the apprenticeship model in a new context?

Dr. Anderson: “At best, our current evaluation methods are disjointed. If someone trains in a skills lab, there is no established competency level that dictates when he or she is ready for the OR. Our work and that of others have demonstrated that simulation is valid; it’s time for this to be more tightly integrated into resident training. Furthermore, quantitative metrics for assessing performance in the OR are becoming available, aided by intra-operative radiography and video that provide a record of performance. But residency isn’t set up for this and in fact is still wedded too exclusively to the apprenticeship model, i.e., ‘Sure, looks good enough…you are ready.’ That’s going to mean, for example, that at times there are more fluoroscopic irradiations during surgery and additional unnecessary (empty) wire tracts left in patients.”

Dr. Thomas: “If each institution, with its 6-8 residents per year, would start pulling in the same direction on this initiative, then we would be able to establish concrete OR performance measures. It begins with the simple task of saving all images and videos acquired during surgery, rather than casually discarding them. Then everyone would begin to see the value of more meaningfully logging performance in the OR.”

“Having a structured, low-risk environment for a first opportunity to practice a skill without the chaos of the OR is invaluable. Then you have genuine teaching moments, such as when a doctor is leaning over a resident who is working with a simulator and says, ‘Oh I like to lift my hand up here because XYZ.’”

Ask the right questions

Drs. Anderson and Thomas are looking into questions such as, “How can we characterize the way a surgeon interacts with the resident who is learning? How can we most efficiently evaluate different teachers and styles so that residents might learn from each person? How can we most effectively share information across institutions?”

“It is important,” says Dr. Anderson, “to work with the radiology department to appropriately store the images. At certain institutions, this may involve tapping into a hospital’s quality control projects as motivation for collecting these useful data.”

In their continued work, say the researchers, they will demonstrate that an integrated, scientifically justified orthopedic simulator training program will lead to increased patient safety.

“As part of this,” says Dr. Thomas, “we will objectively measure differences in resident OR performance using surgical imagery. In addition, we will establish the most effective training approaches for protecting patient safety by measuring the OR performance of residents across institutions.”

“This initiative,” added Dr. Anderson, “is the first to integrate study across specialization areas within orthopedics and to be performed across multiple institutions. We think that this will lead to a wide range of evidence-based information that can improve surgical training and reduce costs.”

Dr. Thomas: “This study is breaking new ground in terms of what to look for, what to measure, and what’s not worth measuring. Answering those questions could profoundly improve training. For example, while using fluoroscopy to place a dynamic hip screw, we can watch the path of the surgical wire as it moves throughout the procedure. We can have a clear idea of where the wire should be headed and can watch as the surgeon trainee overcomes various challenges. The image sequence sometimes reveals that someone turned left when he should have turned right. We can record that as a ‘miss’ or a ‘strike,’ then figure out what causes that type of mistake and how to train it away.”
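To make the idea concrete, here is a minimal sketch (in Python) of how such a wire-path “miss” might be flagged, assuming wire-tip positions have already been extracted from each fluoroscopic frame. The function names, coordinates, and the 15-pixel threshold are illustrative assumptions, not measures specified by the study.

    # Hypothetical sketch: flag a wire-path "miss" from tracked fluoroscopic frames.
    # Assumes the wire-tip position (in image coordinates) has already been
    # extracted for each frame; names and threshold below are illustrative only.

    from dataclasses import dataclass
    from math import hypot


    @dataclass
    class FramePoint:
        frame: int   # fluoroscopic frame index
        x: float     # wire-tip x position (pixels)
        y: float     # wire-tip y position (pixels)


    def point_to_line_distance(px, py, ax, ay, bx, by):
        """Perpendicular distance from point (px, py) to the line through A and B."""
        abx, aby = bx - ax, by - ay
        apx, apy = px - ax, py - ay
        length = hypot(abx, aby)
        if length == 0.0:
            return hypot(apx, apy)
        # The 2-D cross product gives the parallelogram area; dividing by the
        # base length yields the perpendicular distance to the planned line.
        return abs(abx * apy - aby * apx) / length


    def flag_misses(path, planned_start, planned_end, threshold_px=15.0):
        """Return (frame, deviation) pairs where the wire strays beyond threshold_px."""
        misses = []
        for p in path:
            d = point_to_line_distance(p.x, p.y, *planned_start, *planned_end)
            if d > threshold_px:
                misses.append((p.frame, round(d, 1)))
        return misses


    # Example usage with made-up tracking data:
    tracked = [FramePoint(0, 100, 200), FramePoint(1, 130, 212), FramePoint(2, 160, 245)]
    print(flag_misses(tracked, planned_start=(100, 200), planned_end=(300, 260)))
    # -> [(2, 25.9)]  i.e., frame 2 would be recorded as a "miss"

In practice the planned trajectory and the per-frame wire positions would come from the surgical plan and from image analysis of the saved fluoroscopic sequence; the point is simply that the deviation can be computed and logged rather than judged by eye.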

Right, says Dr. Anderson. “If all you can do is say that someone takes longer, then you have no way to understand why. But if you can ‘drill down’ you will see that the person turned left instead of right.”

Asked about the biggest impediment to progress, Dr. Anderson commented, “It is the deeply entrenched training model that relies too exclusively on apprenticeship. It is time to rid our residency training of simple statements/approaches that say, ‘I can tell you when a resident is ready to advance’ and ‘You simply can’t measure some things.’”

Yes, says Dr. Thomas, change ushers in a sense of discomfort.

But really, aren’t we all getting more comfortable with discomfort these days?

References:

  1. https://www.etymonline.com/word/simulate
