A virtual revolution: The changing landscape of medical technologies

Virtual reality (VR) and immersive environments are dramatically reshaping today’s medical training and device testing.

At a given instant, everything the surgeon knows suddenly becomes important to the solution of the problem. You can’t do it an hour later, or tomorrow. Nor can you go to the library and look it up. ~ John W. Kirklin

Kirklin, who was awarded the Lister Medal for his contributions to surgical science and is best known for his refinement of the heart-lung machine, understood the importance of marrying two fundamental ideas: technology and medical training.

In the decade since Kirklin’s death, advances in surgical training, and in the testing of the devices surgeons use, have been rapid. On June 3, 2014, the U.S. Food and Drug Administration (FDA) approved one of the first simulation technologies to use advanced 3D capabilities and augmented reality (AR) for medical training and device testing.

The FDA’s decision reflects a global paradigm shift taking place within medical research and regulatory frameworks, driven by growing demand for solutions to the many issues facing the health care community.

Medical professionals, and surgeons in particular, face a growing scarcity of operating room hours. In addition, the low-fidelity organ and tissue physics of most current simulation systems remains a significant hurdle to using novel technologies in training.

Medical device researchers, meanwhile, face obstacles of their own: an uncertain and adversarial regulatory environment, few best practices for disruptive technologies, and prohibitive development costs.

However, recent collaborative efforts between software designers, hardware manufacturers, and health care professionals mark a turning point in how training and testing are done.
 

Understanding the environment

The concepts of VR and AR aren’t new; the first head-mounted display was built in 1968, placing users inside rudimentary wireframe rooms constructed from reference lines. Because of funding and technology constraints, the bulk of VR/AR development was, for decades, confined to military research, which by its nature was often classified.

With the advent of significantly more powerful microprocessors, motion-tracking, and highly intuitive 3D modeling, the barriers to real-world application of fully immersive virtual environments are disappearing.

The majority of surgical simulators currently used in medical institutions are large pieces of proprietary hardware running 2D or 3D software simulations. Users depend on a combination of joysticks, keypads, and styluses to control measurement and surgical tools.

However, the new generation of VR systems, represented by head-mounted displays such as the Oculus Rift, marks a shift toward taking gaming, research, and testing into fully virtual environments. The collaborative nature of the industry and the democratization of middleware tools have proven instrumental in fostering rapid development and affordable solutions for complex environments.

Software development kits (SDKs) are used to program a line of communication between different pieces of hardware, such as the Oculus Rift (headwear) and SoftKinetic, which uses a 3D time-of-flight sensor to track gestures and generate a virtual version of a user’s hands and limbs. This is arguably one of the biggest steps forward in accurately articulating a user’s fingers within a virtual environment.
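
As a rough illustration of what that line of communication can look like in practice, the sketch below shows, in Python-style code, how head pose and hand-joint data from two separate devices might feed into a single shared scene each frame. The class and method names (VrTrainingSession, get_head_pose, get_hand_joints, and so on) are hypothetical placeholders for illustration only, not calls from the actual Oculus or SoftKinetic SDKs.

    # A minimal, hypothetical sketch of SDK-level glue code. The headset,
    # hand-tracker, and scene interfaces are placeholders, not real API calls.
    class VrTrainingSession:
        def __init__(self, headset, hand_tracker, scene):
            self.headset = headset            # head-mounted display reporting position/orientation
            self.hand_tracker = hand_tracker  # depth sensor reporting per-finger joint positions
            self.scene = scene                # 3D organs, instruments, and room

        def update(self, dt):
            # Head pose drives the stereo camera, so the view follows the user's head.
            pose = self.headset.get_head_pose()
            self.scene.set_camera(pose.position, pose.orientation)

            # Finger-joint positions drive a virtual hand model that can grasp
            # and manipulate the simulated instruments.
            joints = self.hand_tracker.get_hand_joints()
            self.scene.update_virtual_hands(joints)

            # Advance organ and device physics by one time step, then render
            # one image per eye for the headset's stereo display.
            self.scene.step_physics(dt)
            self.scene.render_stereo(self.headset)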

For the first time, users can immerse themselves in detailed and relevant simulated environments populated by highly accurate 3D equipment, devices, and organ models – all of which respond to users’ interactions as they would in a real-world, physical environment.
 

Opportunities for change

The learning and practice of minimally invasive surgery (MIS) makes unique demands on surgical training programs, and surgeons have always faced serious limitations in real-world instruction and operating room hours. Residents require experience in fine motor coordination, device operation, and the randomness inherent in such procedures.

Meanwhile, the demonstration of the safety and efficacy of new medical devices is a long, arduous, and expensive path from concept to clinical practice.

Many of the opportunities offered by this new breed of VR stem from the perspective a user gains inside the environment.

HealthcareFX, a Houston-based company, is pioneering training for lateral-approach interbody fusion in a completely immersive environment. Not only can users be instructed anywhere in the world for a fraction of the normal cost of surgical training, but medical device developers can also display and highlight the functions of specific tools in real time.

The rapid pace of VR and AR research, combined with a growing range of peripheral devices and software, has far-reaching ramifications for the health care industry. As traditional medical education and device research become cost prohibitive, many of the solutions offered by VR/AR become invaluable as efficient means to support training and development.

Human beings have a diagonal field of view (FOV) of roughly 180°. The majority of current medical device simulations, by contrast, offer around 90° of in-game FOV, meaning a user has effectively half of the real-world FOV. The effect is similar to looking through a fixed 50mm portrait camera lens: the user sees the scene through a character’s eyes on a flat monitor. The Oculus Rift, by comparison, offers a 110° diagonal FOV, a major improvement that will likely expand with future iterations.
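
Using only the figures above, the gap is easy to quantify; the short Python calculation below (an illustrative aside, not part of any simulator) shows how much of the natural field of view each option covers.

    # Back-of-the-envelope comparison using the figures quoted above.
    human_fov = 180.0       # approximate human diagonal FOV, in degrees
    typical_sim_fov = 90.0  # typical in-game FOV of current simulators
    rift_fov = 110.0        # Oculus Rift diagonal FOV

    print(typical_sim_fov / human_fov)  # 0.5  -> about half of the natural view
    print(rift_fov / human_fov)         # ~0.61 -> roughly 61 percent of it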

In laparoscopic procedures, such as those requiring hemostasis, VR and advanced 3D modeling can simulate accurate interaction between a high-fidelity 3D device, such as a hemostatic stapler, and 3D organ models. Portraying these elements accurately is crucial to replicating proper tissue compression and hemostasis.
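
As a simplified example of the kind of model behind such behavior, soft tissue is often approximated with a nonlinear, roughly exponential stress-strain response. The Python sketch below, with invented constants, estimates the reaction force a virtual stapler jaw would feel as it compresses a slab of tissue; it illustrates the general technique only and is not the physics engine of any particular simulator.

    # Illustrative nonlinear spring model of tissue compression under a stapler jaw.
    # The constants k and b are invented for this sketch; real simulators fit them
    # to measured tissue data.
    import math

    def tissue_reaction_force(rest_thickness_mm, gap_mm, k=0.4, b=1.6):
        """Approximate reaction force (arbitrary units) for a given jaw gap."""
        compression = max(rest_thickness_mm - gap_mm, 0.0)
        strain = compression / rest_thickness_mm
        return k * (math.exp(b * strain) - 1.0)

    # As the jaw closes from 3 mm to 1 mm, resistance rises nonlinearly,
    # which is the behavior a high-fidelity simulation must reproduce for
    # realistic compression and hemostasis.
    for gap in (3.0, 2.5, 2.0, 1.5, 1.0):
        print(f"gap {gap:.1f} mm -> force {tissue_reaction_force(3.0, gap):.2f}")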

VR devices, coupled with rapid software developments, represent an opportunity for greatly reduced costs; investment would be in the thousands, as opposed to hundreds of thousands, of dollars. Devices such as the Oculus Rift are portable, easily fabricated, and highly flexible in how they can be combined with other peripherals to customize virtual environments.

The cost-benefit advantage also translates into medical device development. As the FDA begins to recognize 3D simulations as a viable platform, accurate environments that incorporate specific medical and surgical devices could improve the way companies demonstrate safety and efficacy. VR and AR systems can reduce the cost and time spent on physical prototyping before animal and human trials.
 

Overcoming challenges

Perhaps the most significant hurdle to highly immersive VR is the lack of comprehensive tactile response, otherwise known as haptic feedback. Without customized VR gloves, it is extremely difficult to emulate physical sensation. Systems such as UltraHaptics in the U.K. have made progress by using focused ultrasound to minutely displace the air around a user’s hand, effectively emulating the sensation of touch.
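
The principle at work is acoustic focusing: if each emitter in an ultrasound array is delayed so that its wavefront arrives at a chosen point at the same instant, pressure concentrates at that point and can be felt on the skin. The Python sketch below shows that delay calculation in its simplest form; it illustrates the principle only and is not UltraHaptics’ implementation.

    # Minimal sketch of phased-array focusing: delay each emitter so all wavefronts
    # arrive at the focal point simultaneously, concentrating acoustic pressure there.
    import math

    SPEED_OF_SOUND = 343.0  # meters per second, in air

    def focusing_delays(emitter_positions, focal_point):
        """Per-emitter delays (seconds) that align arrivals at focal_point."""
        distances = [math.dist(p, focal_point) for p in emitter_positions]
        farthest = max(distances)
        # The farthest emitter fires first (zero delay); nearer emitters wait longer.
        return [(farthest - d) / SPEED_OF_SOUND for d in distances]

    # Example: a small four-element line array focusing 20 cm above its center.
    emitters = [(x * 0.01, 0.0, 0.0) for x in (-1.5, -0.5, 0.5, 1.5)]
    print(focusing_delays(emitters, (0.0, 0.0, 0.20)))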

Three-dimensional modeling concerns, such as high polygon counts and detailed textures, also remain a challenge, requiring very specific sets of multi-modal data to ensure that organs and tissue respond and deform accurately to stimuli within a virtual environment.

When using VR headgear such as the Oculus Rift, a small percentage of users have experienced what is commonly called simulator sickness, caused by a disparity between what the eyes see and what the inner ear senses. Effective mitigations have already been implemented, and new versions of VR headsets are being produced to remedy the problem.

These issues should not prevent medical device researchers and health care companies from investing in these technologies. Many of the challenges are already in the process of being solved. The more complex ones, such as haptic feedback, show promise and could represent viable solutions in the near future.

 

HealthcareFX
www.healthcarefx.com


About the author: Alastair Moore is a project director of software and media for HealthcareFX. He is a specialist in health care communications and disruptive technologies. Moore can be reached at 832.857.7657 or amoore@healthcarefx.com.

November/December 2014