
SIMULATION GETS SMARTER

AI opens a new chapter in the Gordon Center’s innovative medical training and education programs

BY CHAD HANSON
ILLUSTRATIONS BY SPOOKY POOKA / PHOTOGRAPHY BY JEFFERY SALTER

Since the launch in 2022 of OpenAI’s ChatGPT and various other tools, the possibilities offered by artificial intelligence have been a source of fascination and speculation in the field of medicine. But underneath lurks a wary question: Will computers replace physicians?

Barry Issenberg, M.D. ’95, director of the Gordon Center for Simulation and Innovation in Medical Education at the Miller School, says that’s not a real threat.

“The discussions are actually about how AI can augment, not replace, the work of a practitioner,” Dr. Issenberg said. “A lot of the applications of AI within simulation training and patient care are about augmenting human performance.”

Dr. Issenberg and colleagues are in the early stages of using AI and other new technologies to enhance the training the Gordon Center provides to more than 20,000 medical students, providers and emergency responders every year. He believes their application to the Gordon Center’s simulation training will evolve quickly, and to the benefit of trainers and trainees alike.

A History of Innovation

The Gordon Center’s innovative use of simulation in medical training and education stretches back nearly 50 years, and its intellectual underpinnings even further. The designated Center of Excellence owes much of its international reputation to Michael S. Gordon, M.D., Ph.D., a practicing cardiologist who, in the late 1960s, recognized the potential of simulation for creating realistic training scenarios for aspiring physicians.

Dr. Gordon was a pioneer in the early integration of technology and medical education, a techie before tech was cool. In 1968, he unveiled Harvey®, a cardiology patient simulator whose direct descendants continue to be a vital training tool for medical education.

Dr. Issenberg and his colleagues are continuing Dr. Gordon’s forward-thinking work. Patient simulators are now produced in different genders, ages and skin colors, with increasingly sophisticated computer-programmed responses that better mimic human medical conditions. In addition, 3D printing is used to create anatomically accurate models for surgical planning and training. It allows for customization of patient-specific anatomy, enhancing the realism and relevance of surgical rehearsals. Bringing health care to the metaverse, wearable devices such as smart glasses or haptic feedback gloves enhance simulation training by providing a more immersive experience and more realistically simulating the audiovisual and tactile feedback of medical procedures.

“Performance in a simulation can predict real-world behaviors,” said Ross Scalese, M.D., the Gordon Center’s director of educational technology development and professor in the Miller School’s Department of Medicine. “The more authentic the simulations are, the more likely the training exercises will translate to better patient care in actual clinical environments.”

A new program takes one group of caregivers a step further: The community paramedicine program teaches paramedics not only how to recognize and evaluate medical problems, but also how to identify patients’ social determinants of health, such as housing or access to medicine, and refer them to services as needed. The goal is to reduce hospital readmissions of people whose living situations keep them coming back because they never really get well.


Faster Training Evaluations

Dr. Issenberg believes that AI will be most useful to simulation training in evaluating performance for procedure-based tasks. Surgery is an obvious match, as it relies on a specific sequence of steps that leads to a successful outcome. The current process often involves faculty reviewing lengthy videos of surgical trainees to determine if those steps were followed correctly. Absent technological intervention, that approach takes a long time and can lead to waning concentration after a few hours. This is where AI — augmented intelligence, in this case, as it’s helping rather than replacing a person — can prove beneficial.

“The procedure’s order and steps are well defined, so we can develop algorithms for an AI application and then feed the application training videos,” Dr. Issenberg said. “The application learns from those videos, and after a relatively short period of time, it’s as accurate as an expert.”

The expertise the AI application develops enables it to pinpoint errors and reduces the time needed for review from hours to minutes. AI also provides the opportunity to assess the effectiveness of the curriculum itself by identifying where trainees struggle or perform below proficiency. That assessment allows faculty instructors to review and revise training materials as soon as an issue surfaces, rather than waiting for the end of the semester.

“AI can be used to identify the components of the case people are having difficulty with,” Dr. Issenberg said. “And we can make tweaks in real time. AI will enable us to have a much more individualized curriculum and adaptive learning.”
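
For readers curious how such a review might work under the hood, here is a minimal, hypothetical sketch: it assumes an upstream video model has already labeled each segment of a recorded attempt with a procedure step, and it simply checks the observed steps against an expected order. The step names and the expected sequence are illustrative placeholders, not the Gordon Center’s actual protocol or software.

```python
# Hypothetical review step: compare a trainee's observed procedure steps
# (assumed to come from an upstream video model) with the expected sequence.
# Step names below are invented for illustration.

EXPECTED_STEPS = [
    "identify_landmarks",
    "sterilize_site",
    "administer_anesthetic",
    "make_incision",
    "insert_instrument",
    "confirm_placement",
]

def review_attempt(observed_steps: list[str]) -> dict:
    """Flag missing and out-of-order steps in a trainee's recorded attempt."""
    missing = [s for s in EXPECTED_STEPS if s not in observed_steps]
    # Keep only recognized steps, in the order they were actually performed.
    performed = [s for s in observed_steps if s in EXPECTED_STEPS]
    expected_order = {step: i for i, step in enumerate(EXPECTED_STEPS)}
    out_of_order = [
        step for prev, step in zip(performed, performed[1:])
        if expected_order[step] < expected_order[prev]
    ]
    return {"missing": missing, "out_of_order": out_of_order}

print(review_attempt([
    "sterilize_site", "identify_landmarks", "administer_anesthetic",
    "make_incision", "insert_instrument",
]))
# -> {'missing': ['confirm_placement'], 'out_of_order': ['identify_landmarks']}
```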

Realistic Curriculum Development

In many ways, the transition from medical student to physician is a movement from theory to practice. Students absorb theories from textbooks to learn how to address medical problems and put that knowledge into practice once “M.D.” is formally appended to their name. Problems arise when patients deviate from the textbook.

“Medical students read about one or two prototypical cases in a textbook,” Dr. Issenberg said, “and then on their first day at UHealth Tower or Jackson Memorial Hospital, they see a patient who doesn’t look anything like what they read about in a textbook or learned during a lecture.”

Simulation training strives to bridge the gap between theory and practice by accounting for deviations from the norm and presenting as varied a curriculum as possible. But time, that relentless obstacle, can make that challenging.

“Having a large number of training cases is important,” Dr. Issenberg said, “but it takes a lot of time for faculty to come up with these cases.”

AI, however, doesn’t need much time to synthesize a large volume of de-identified patient notes and capture all the variables in that data set.

“In a matter of minutes, we can create a large library of patient simulation cases using AI,” Dr. Issenberg said. “Faculty can then go in and correct any mistakes, any biases that may occur.”
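
As a rough illustration of that workflow, the sketch below assumes the OpenAI Python client (openai>=1.0) and a chat model; the model name, prompt wording and note summaries are placeholders rather than the Gordon Center’s actual pipeline, and every draft would still go to faculty for review.

```python
# Minimal sketch: turn de-identified note summaries into draft simulation cases.
# Model name, prompts and summaries are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_simulation_case(note_summary: str) -> str:
    """Produce one draft simulation case for faculty review."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You write draft patient simulation cases for medical educators. "
                    "Include presenting complaint, vital signs, pertinent positives and "
                    "negatives, and learning objectives. Faculty will review every draft."
                ),
            },
            {"role": "user", "content": f"De-identified note summary:\n{note_summary}"},
        ],
    )
    return response.choices[0].message.content

# Draft a small library of cases; faculty then correct errors and biases.
summaries = [
    "68-year-old with acute shortness of breath and a history of heart failure.",
    "24-year-old with right lower quadrant abdominal pain and low-grade fever.",
]
library = [draft_simulation_case(s) for s in summaries]
```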

Better Scripts for Patient Scenarios

Vivian Obeso, M.D. ’00, an associate professor of medical education, is spearheading a project that does just that. Dr. Obeso is in the early experimental phase of using ChatGPT to generate realistic, varied patient scenarios that can be used for clinical training and assessment. Currently, Miller School faculty develop those scenarios, which are translated to conversational scripts that actors use in face-to-face encounters with medical students. It can be arduous work.

“The scripts need to be developed carefully,” Dr. Obeso said. “It usually takes at least a couple of faculty members to write them and get the appropriate feedback. The scripts must be accurate enough to portray a realistic patient scenario. It can take hours, sometimes days.”

Not with ChatGPT and similar generative natural language processing applications. Dr. Obeso is posing and refining prompts that include information about de-identified patient symptoms, health history and physical examination findings, then asking ChatGPT to produce plausible scripts that support a given differential diagnosis. The preliminary results demonstrate that AI can do a pretty decent imitation of a real patient scenario with a carefully developed prompt.

“I started with broad questions but found that ChatGPT does better with more detailed inquiry,” Dr. Obeso said. Given the proper specifics, “it does a good job generating patient cases that include pertinent positive and negative findings, as well as a differential diagnosis,” she added.
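
The example below suggests what such a detailed prompt might look like. The clinical details are invented for illustration, and the structure is one plausible approach rather than the actual prompts used in Dr. Obeso’s project.

```python
# Hypothetical prompt construction for a standardized-patient script.
# All clinical details are invented for illustration.
case_details = {
    "chief_complaint": "intermittent chest tightness for two weeks",
    "history": "55-year-old with hypertension and a 20 pack-year smoking history",
    "exam_findings": "blood pressure 158/94, otherwise unremarkable cardiac exam",
    "target_differential": ["stable angina", "GERD", "musculoskeletal chest pain"],
}

prompt = (
    "Write a first-person conversational script for a standardized patient actor.\n"
    f"Chief complaint: {case_details['chief_complaint']}\n"
    f"History: {case_details['history']}\n"
    f"Physical exam findings to reveal only if asked: {case_details['exam_findings']}\n"
    "Include pertinent positive and negative findings that support this differential "
    f"diagnosis: {', '.join(case_details['target_differential'])}.\n"
    "Use lay language, and specify how the patient answers common interview questions."
)
```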

That’s not good enough to forgo physician oversight, so Dr. Obeso plans to organize a study that will ask faculty physicians to review 100 or more ChatGPT-generated scenarios for accuracy.

“We’ll be able to identify trends and see what ChatGPT is doing really well,” she said, “and address why it’s struggling in certain areas.”

AI then will be able to alleviate a good portion of the labor-intensive scenario-creation burden. The faculty physicians who endorse the training curriculum will focus on fine-tuning the AI output with subtle variations to the questions that ChatGPT “consumes.”

This approach should be especially useful in the Miller School’s monthlong Transition to Residency course — nicknamed “Residency Boot Camp” — that is now required for all fourth-year students.

“We feel this type of simulation training in acute medical emergencies is critical for every graduating student,” said Gauri Agarwal, M.D. ’00, associate dean for curriculum and associate professor of medicine. “Students are trained not only to recall critical pieces of medical knowledge, but also to practice their composure, communication, decision-making and teamwork skills with patients and colleagues.”


Matching Training with Learners

The effectiveness of a training curriculum isn’t only measured by its fidelity to the subject matter. Successful learning also requires a suitable alignment of topic presentation and end-user knowledge. A lesson plan that’s too challenging stymies students. One that’s too easy bores them.

The Gordon Center’s partnership with the American Heart Association uses AI to walk that fine line in its Advanced Stroke Life Support Blended Learning course, which teaches providers how to identify and treat patients having a stroke. Each training session is different, based on the prior knowledge and experience of the student taking the course.

“Education tends to be time-driven, not competency-driven,” said Ivette Motola, M.D., M.P.H., associate professor of surgery and the Gordon Center’s assistant director and director of the prehospital and emergency training division. “But clinicians are incredibly busy. Time is one of our most important resources.”

AI makes sure the Blended Learning course, which is appropriate for a broad range of learners, doesn’t waste their time. At the outset, students identify their level of expertise. Based on that self-assessment, the course either presents graphics-rich interactive content followed by assessment or takes them straight to assessment to demonstrate their expertise in stroke care. With each answer, an AI-driven algorithm evaluates the student’s knowledge and level of confidence and calculates what content to present and what question to ask next.
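
A minimal sketch of that adaptive logic appears below: after each answer, the learner’s correctness and self-rated confidence determine whether the course teaches, asks again, or moves on. The thresholds and the mastery rule are illustrative assumptions, not the actual Advanced Stroke Life Support algorithm.

```python
# Illustrative mastery-learning loop: thresholds and actions are assumptions,
# not the course's real algorithm.
from dataclasses import dataclass

@dataclass
class TopicState:
    correct_streak: int = 0
    mastered: bool = False

MASTERY_STREAK = 2  # consecutive confident, correct answers to mark a topic mastered

def next_action(state: TopicState, answered_correctly: bool, confidence: int) -> str:
    """Decide the course's next move for this topic (confidence on a 1-5 scale)."""
    if answered_correctly and confidence >= 4:
        state.correct_streak += 1
    else:
        state.correct_streak = 0
    if state.correct_streak >= MASTERY_STREAK:
        state.mastered = True
        return "advance_to_next_topic"
    if not answered_correctly or confidence <= 2:
        return "present_interactive_content"   # teach before re-assessing
    return "ask_another_question"              # correct, but mastery not yet shown

state = TopicState()
print(next_action(state, answered_correctly=True, confidence=5))  # ask_another_question
print(next_action(state, answered_correctly=True, confidence=5))  # advance_to_next_topic
```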

“We’ve been talking about this in education for a long time,” Dr. Motola said. “This enables us to provide truly individualized content using a mastery learning model.”

Emergency Identification

In emergency situations, paramedics are ultimately trying to answer a self-evident question: How can I help this person?

A less obvious query — Who is this person? — is crucial to answering that first question. Patient identity carries with it their medical history, including previous diagnoses, hospitals where they’re most often seen, medications taken and known allergies, all of which factor into the paramedic response. People who are unable to tell emergency responders who they are due to cognitive deficit or temporary incapacitation complicate these already high-pressure situations.

Dr. Motola and colleagues are turning to AI-driven facial recognition software to identify patients who, due to unconsciousness or conditions like dementia, are unable to identify themselves. In a study, paramedics were divided into two groups. One group loaded onto their phones a facial recognition program that provided both the patient’s identity and important medical information, and used it in a realistic clinical simulation scenario involving standardized patients (actors) who were unable to identify themselves. The study’s control group didn’t have access to facial recognition software. Both groups were asked to treat the standardized patients as they would in a real emergency.

“We looked at the paramedics’ decision making, in particular if the paramedics took the patients to the hospital where they receive most of their care,” Dr. Motola said. “That’s very important, because having access to prior medical history, especially in complex cases, is critical for the best patient management and outcomes.”

The study group that used facial recognition outperformed the control group by making better initial decisions about treatment and by taking the patients to their preferred medical facilities.

“Having access to the facial recognition software impacted the paramedics’ medical decision making in a positive way,” Dr. Motola said.

Changing Reality

With AI, the future is always now, and today’s revolution is tomorrow’s antiquity. Dr. Issenberg is already anticipating — and relishing — how simulation will change with technology’s relentless shifting and churning.

“With AI, the future will be defined by its integration with other emerging technologies,” Dr. Issenberg said. “How do we incorporate it with virtual reality and augmented reality and mixed reality?”

Extended reality is an umbrella term that encompasses a variety of technologies, including virtual reality (an immersive, completely digital environment), augmented reality (the incorporation of virtual objects into real environments), and mixed reality (the interaction between real and virtual objects).

Extended reality is currently used in immersive video games and the creation of new automobile designs. In health care, some hospitals use virtual reality headsets to ease patient stress prior to surgery and smartphone-accessible educational applications to introduce care plans.

Dr. Issenberg believes extended reality will soon be a central component of enhanced training theaters at the Gordon Center. He uses a particular procedure, chest tube insertion, as an example. Manikins have come a long way since Harvey 1.0, but they don’t offer a complete portrayal of the human chest cavity.

“Using extended reality, we’ll be able to superimpose the anatomy of a patient on the simulator so the trainee can see the underlying structures of the chest,” Dr. Issenberg said.

In fact, the tube a trainee inserts into the chest may be AI-enhanced, which would also allow the application to precisely monitor its movement.

“We’ll be able to see where that tube is going,” Dr. Issenberg said, “and give the learner feedback in real time, with comparisons to their own benchmarks and, most importantly, the gold standard for that training.”


AI Training’s Challenges

While AI’s anticipated impact on medicine is exciting, “overreliance on AI in training simulations could potentially diminish the trainees’ ability to develop critical thinking and decision-making skills,” Dr. Issenberg said. “They might become too dependent on AI recommendations and analysis. As educators, we need to guard against that effect and demonstrate the serious implications of relying too much on this new technology.”

As Dr. Obeso learned in her first forays into ChatGPT, AI is only as good as its source material. Bad data means inadequate or even harmful training.

“AI-driven simulations are heavily reliant on the quality of data they are fed,” Dr. Issenberg said. “If the data are incomplete, outdated or biased, it can lead to inaccurate simulations, which may misguide trainees rather than helping them.”

And as patient data become potentially more accessible, privacy will gain even more importance. Academic medical centers like the Miller School are bound by the strict dictates of the Health Insurance Portability and Accountability Act of 1996, commonly referred to as HIPAA, which protects sensitive patient information. Dr. Issenberg recommends the strongest measures in ensuring confidentiality and hopes patient communications will evolve to include explicit details about the use of patient data for research and education.

Dr. Motola agrees: “Data security is going to be huge, because with facial recognition, for instance, identity means access to an additional level of the patient’s medical information,” she said. “One of the challenges is to ensure it is protected. How individuals consent to have their data used for both training and clinical care purposes will have to be integrated into patient care interactions, and data security will need to be at the highest level and monitored vigilantly.”

UNIVERSITY OF MIAMI MEDICINE
SPRING 2024