Patient Simulators: From CPR Dummies to Mixed Reality High-Fidelity Robots
Today’s dynamic, programmable robotic patient simulators used in healthcare education mimic human anatomy: they speak, cry, move their eyes and change facial expressions while reproducing physiological functions. Patient simulators are found in major teaching hospitals, nursing schools, military medical training programs and paramedic training centers around the world. They range from basic task trainers to full-body high-fidelity simulators capable of reproducing multiple complex medical complications, and they play an important role in preparing the next generation of healthcare practitioners.
Technology Networks spoke to James Archetto, vice president at Gaumard Scientific, to learn more about the history of patient simulators, the most recent advances and what is on the horizon.
Anna MacDonald (AM): Can you tell us about the origin of patient simulators and their evolution?
James Archetto (JA): Early on, it became evident that practice was necessary to improve medical outcomes. Training started with cadavers and transitioned into skeletons. Dr. George Blaine, founder of Gaumard Scientific, was a physician with the British army during World War II. He realized that we didn't always need to rely on natural materials like a cadaver or a skeleton for training, and he invented the world's first synthetic skeleton. It could be produced at a substantially lower cost, without the complications of a biologic-based trainer.
Later, when Dr. Blaine was in North Africa, he observed that the infant mortality rate was extremely high, so in the 1940s he invented the world's first birthing simulator. It was rudimentary: a box with a clear cover and a doll, which would be pushed through the box to simulate how a baby moves through the birth canal. This evolved into CPR dummies, which were a way to practice actions such as rescue breathing and chest compressions. Later, advances in technology arrived and the business of simulation expanded tremendously. Medical simulation and training is now at a point where life-size adult female simulators deliver baby simulators that can move, cry, and present as jaundiced or cyanotic. These simulators talk to and interact with the learners. It is certainly far more advanced than those first iterations in the 1940s.
To create an immersive learning experience, Gaumard is always working to enhance outcomes of students. Every teacher tries to do this, tries to understand what is going to allow students to remember the lesson when they need it the most – in clinical practice. The realism of simulators creates that immersive experience.
AM: How are the simulators developed?
JA: When we develop a medical simulator, we first analyze medical trends. For example, is there a higher incidence of mortality or morbidity for a certain medical condition? We try to figure out how we can develop a product that will enhance learning outcomes and ultimately improve learner competencies.

Second, we look at both high-risk, low-frequency procedures and low-risk, high-frequency procedures. An example of a high-frequency, low-risk procedure is an intramuscular injection. You go to the doctor and need an injection – it's very low-risk and very high-frequency. Every healthcare provider performs these every day and the complication rate is low. Something a little more complex is placing an intravenous cannula: still relatively low-risk, but it requires more skill and more practice. There are also procedures that happen so infrequently that they may never occur during a nurse's clinical rotation or a physician's residency. Simulators can create a scenario that duplicates such a complication, so learners can practice it over and over and build proficiency.

In addition to need, we also assess development from a technology standpoint. What can we build, given current technology, that will satisfy a clinical training need and improve learner competency? This takes years. It's easy to take a mold of a person's arm and embed synthetic veins to practice intravenous needle insertion. A simulator that delivers a baby, or that bleeds in time with every simulated heartbeat – when the simulator doesn't have a heart – while connected to real medical equipment, is far more sophisticated. Simulators are developed through a combination of clinical expertise, educational need and engineering.
AM: What makes these patient simulators so astonishingly realistic?
JA: There are several components. First, we assess how we can immerse learners in the experience, and physical realism is important. But physical realism is just a small part of the immersive experience. A department store mannequin, for example, can be physically very realistic, yet it has no clinical relevance. So how do you create that realism and immerse the learners? We create the physical realism and, crucially, the clinical realism as well.
Emergencies don't always happen in a hospital with a patient on a bed hooked up to an EKG. Frequently they happen on a playground, in a car, on the side of the road or at home. To recreate this clinical realism, simulators can't be tethered to a power cord, since that would derail the immersive clinical experience, so our simulators are battery powered. These emergencies take time, so the battery has to last for the duration of the learning experience. Simulators also have to be controlled without any wires or cords attached, so they are wireless, using either radio frequency or Bluetooth. The result is a freestanding simulator, which ours are, controlled remotely from a tablet. Additionally, to provide a true clinical experience, the simulator must connect to real medical equipment and respond to it. In the old days, with department store mannequins, learners were pretending and had to imagine that the blood pressure had dropped. We don't have to do that anymore. These simulators automatically lower the blood pressure, change the heart rate, change the breathing rate and perform all of the complicated physiological responses associated with a particular clinical condition, and it is all monitored on real medical equipment. All of this adds to the realism and improves learner competency.
One example is a military simulator with an amputation. That simulator has a reservoir of synthetic blood that spurts like a severed artery at the exact rate of the heartbeat shown on the EKG. There is also a sensor built into the simulator so that when a tourniquet is applied at the appropriate pressure, blood flow is reduced. All of this is coordinated and synchronized through this advanced technology, which adds to the astonishing realism of the experience. It's far more than just the physical.
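The feedback loop described above – a bleed synchronized to the displayed heart rate and attenuated by a tourniquet sensor – can be sketched in a few lines. This is a minimal illustrative model, not Gaumard's actual control code; the function name, threshold and per-beat loss values are all hypothetical.

```python
# Hypothetical sketch: arterial bleed synchronized to the simulated heart
# rate, reduced when a tourniquet sensor reads sufficient pressure.
# All names and numbers are illustrative, not clinical values.

TOURNIQUET_THRESHOLD_KPA = 20.0  # assumed pressure that fully occludes flow


def bleed_rate_ml_per_min(heart_rate_bpm: float,
                          spurt_volume_ml: float,
                          tourniquet_pressure_kpa: float) -> float:
    """Simulated blood loss per minute: one spurt per beat, attenuated
    linearly by tourniquet pressure until the artery is occluded."""
    base = heart_rate_bpm * spurt_volume_ml  # one spurt per heartbeat
    if tourniquet_pressure_kpa >= TOURNIQUET_THRESHOLD_KPA:
        return 0.0  # tourniquet applied at appropriate pressure: pump stops
    attenuation = 1.0 - tourniquet_pressure_kpa / TOURNIQUET_THRESHOLD_KPA
    return base * attenuation


# Untreated: 110 bpm with 2 ml per spurt
print(bleed_rate_ml_per_min(110, 2.0, 0.0))   # full flow
print(bleed_rate_ml_per_min(110, 2.0, 25.0))  # occluded: flow stops
```

In the real device the same synchronization would drive a physical pump from the EKG rhythm; the point of the sketch is only that flow is a function of heart rate and sensed tourniquet pressure.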
AM: What are the most challenging aspects of creating a life-like simulator?
JA: There are many components that make developing a life-like simulator challenging. Will this simulator duplicate what a patient experiences in a given scenario? Will it meet the educator's expectations of how a simulator should perform or how a patient would respond? Will it serve to enhance the competency of the learner? In medicine there are many variables, and that is where the challenges of building these simulators come in.
Another challenge is fitting all the technology we've been discussing into a small package. Take one of Gaumard's simulators, Super Tory, as an example. Super Tory is a newborn that weighs 8 pounds and is 21 inches long. How can all of that technology fit into such a small package, so she breathes, works with real medical equipment, cries and moves? It is only through advances that miniaturize components such as pumps, circuit boards, processors, motors and the battery that she can appear so realistic.
Another key component, and key challenge, is reliability. The simulators have to work, and they have to work in the most challenging of clinical situations. We talked about low-frequency, high-risk scenarios; oftentimes those need to be practiced multiple times, so the simulator has to work over and over. Before we ever reach a finished product, all of those motors and pumps have to be tested for reliability.
Super Tory has motors behind her face that make her grimace, squint, open her mouth and cry. Those motors work in unison, reliably, over and over. Tiny motors attached to her silicone skin perform reliably and allow her expressions to be realistic. Those are just a few examples of the challenges faced in developing a simulator.
AM: How has simulation-based learning been applied throughout the COVID-19 pandemic?
JA: The pandemic has challenged a lot of premises about how we educate and how learners gain clinical experience. Previously, learners practiced on each other, and once they became proficient at a given skill they practiced on patients. A few years ago this changed, with the National Council of State Boards of Nursing (NCSBN) allowing simulators to be used for up to 50% of clinical training. There were two reasons for that number. One is that patients have become increasingly reluctant to provide informed consent for training. Second, simply getting time in a clinical setting is becoming more difficult. It was agreed that, if properly designed, a simulation scenario and protocol could be used as an adjunct to clinical training. That is the standard that has been used since 2015.
With the onset of the global pandemic in 2020, students were not deemed critical employees and suddenly had no access to hospitals. In addition, some nursing students were reluctant to incur the potential risk of infection in hospitals. How, then, do learners gain access to patients? Both educators and students used simulation to further clinical training. Then the schools closed down, and learners and educators couldn't even get to campus. With our help and support, simulation centers were able to implement remote learning. Because our simulators are wireless and tetherless, they could be operated remotely. The educator didn't need to be in the building where the learners were and could set up the protocol and the learning scenario remotely. Learners would go into the simulation center individually and practice the clinical scenario.
We found creative ways of providing simulation courses through Zoom or Microsoft Teams teleconferencing. These remote-access technologies allowed learners to gain clinical skills. They could practice a scenario, regardless of how sophisticated it was, without learners or instructors fearing for their health.
Now, a year later, looking back, we realize that the faculty member doesn't have to be in a control room or the next classroom, or have 20 students briefed all at once in one room. We can have 20 people on a Zoom call in gallery view, all watching the scenario that the learners are performing, and then we can all debrief simultaneously. We learned quite a bit; necessity is the mother of invention. This technology has changed the way simulation, and education in general, takes place.
When we look at the value educators provide from a traditional education standpoint, an educator can be anywhere and the learners can be anywhere, because technology allows it. It's the same with simulation: the pandemic really forced us to rethink and come up with very clever solutions.
AM: Can you tell us more about mixed reality simulators and the advantages this approach offers?
JA: There are several different ways that we can modify reality. One is virtual reality, which involves putting on opaque goggles. Essentially, a movie plays inside the goggles. It is very immersive: you feel like you're in the jungle, or underwater. Perhaps you are in a game, and you feel like you're there because you're wearing the goggles. The goggles can be connected to a seat so you feel like you're driving a race car. Of course, there is no interaction with the real world, which is why it is called virtual reality.
The second modality is augmented reality, which overlays a digital representation onto the physical world. This too involves goggles; however, they allow visualization of both the physical and virtual worlds. For example, a cartoon avatar could be made to dance on a desktop. There is no interaction between the physical and the virtual, since the image is merely superimposed on a backdrop.
Mixed reality is quite different because it allows interaction between the physical and digital worlds. We use goggles from Microsoft called the HoloLens 2. Just like normal glasses, the HoloLens 2 allows the learner to engage with the real world; in addition, it allows visualization of, and interaction with, a specific digital clinical scenario. What you're seeing and interacting with is a clinical reality – in our case, what's inside the patient. In medicine, that haptic experience is vital: to touch patients, to feel a pulse, to hear a heartbeat or lung sounds, and even to catch the baby when the baby is born. That is not possible in a virtual world, because you can't see your hands.

With mixed reality, you can see the baby inside the uterus as she moves through the birth canal, and then catch the baby as she is delivered. If there is a complication, as a learner you can observe it and think: what should I do about it? What if I maneuver the patient a certain way and receive feedback that I performed the procedure correctly? Say the baby's shoulder is caught on the mother's pubic bone, a complication called shoulder dystocia. As a learner, I may only see that once in my clinical training, but here I can practice it over and over again, seeing the baby in utero with the shoulder caught. I perform a maneuver, and through the HoloLens an indicator lights green when I have resolved the issue – when I press down with just the right amount of pressure and perform the procedure to safely deliver the baby. Now I know how much pressure I've applied, in real time, without waiting for the debriefing, which may be 20 minutes later. That's how mixed reality works: the digital world and the real world come together to enhance outcomes for our learners.
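The green-indicator feedback described here amounts to checking a sensed force against an acceptable band. A minimal sketch follows; the thresholds are illustrative placeholders, not clinical values, and the function is a hypothetical stand-in, not Gaumard's or Microsoft's API.

```python
# Hypothetical sketch: pressure feedback for a simulated shoulder
# dystocia drill. Force thresholds are made-up illustrative numbers.

MIN_FORCE_N = 30.0  # assumed minimum force needed to free the shoulder
MAX_FORCE_N = 60.0  # assumed maximum safe force for the simulated baby


def indicator_color(applied_force_n: float) -> str:
    """Map a sensed force reading to the feedback color the learner sees."""
    if applied_force_n < MIN_FORCE_N:
        return "amber"  # not enough pressure to resolve the dystocia
    if applied_force_n > MAX_FORCE_N:
        return "red"    # excessive force: risk of injury in a real delivery
    return "green"      # within range: maneuver performed correctly


print(indicator_color(45.0))  # force within the acceptable band
```

In the actual product a force sensor in the simulator would feed this kind of check continuously, with the result rendered as a holographic indicator in the HoloLens view.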
AM: Where do you see the technology headed? What features are on the horizon?
JA: When looking to the future, we work with stakeholders and ask a lot of questions. The first group is healthcare educators. What are the needs of learners? How have those needs changed? What educational changes are taking place?
The second group is healthcare providers. What changes are taking place in clinical practice? What is changing morbidity and mortality rates? How can we develop products to meet these needs?
Likewise, technology is changing at breakneck speed. How can these technological changes be incorporated to improve educational outcomes? Processor speed is improving, batteries are getting more powerful with a longer life, motors and pumps are smaller and more reliable. What impact will AI have in medical simulation?
Simulation, like the technology that drives it and the medical community it supports, is dynamic, and we must be nimble enough to utilize this technology to maximize the benefits for our stakeholders.
James Archetto was speaking to Anna MacDonald, Science Writer for Technology Networks.