Virtual Reality Anatomy (ViRA)
of the Hand and Forearm
by Andrea S. Kim
How can virtual reality technology be combined with the user’s real-time movements to create an interactive application providing robust and immersive 3D visualizations of the muscular and skeletal structures of the forearm and hand?
The primary objective of this research was to explore the development and design of such an interactive VR application, Virtual Reality Anatomy (ViRA). ViRA tracks the user’s physical hand and forearm in real time and superimposes selected 3D anatomical structures onto the user’s virtual hand and forearm, providing concurrent visual feedback. ViRA was developed in Unity for use with the Leap Motion Controller (LMC) on a desktop with a 3D monitor, on Windows Mixed Reality headsets (HP and Lenovo Explorer), and on the Oculus Rift. Within ViRA’s responsive virtual space, users can toggle the visibility of individual musculoskeletal structures to view them in groups or in isolation, from various angles, through direct manipulation of their own body, as I showcase in this video.
Winner of the 2019 Serious Games and Virtual Environments Best in Show Award, Student Category
2019 SimVentors Showcase, San Antonio, TX
International Meeting on Simulation in Healthcare (IMSH)
presented by the Society for Simulation in Healthcare
My Master’s thesis was selected for presentation at the SimVentors Showcase in January 2019. Unfortunately, I was unable to attend in person: my workplace could not process travel funding during the longest government shutdown in US history.
I was, however, able to video-call into the SimVentors Showcase as my research advisor, Dr. Cristian Luciano, gave the demos on my behalf.
< here I am interacting from my room
a peek into the process:
The virtual scene and interactive spatial user interface were scripted in C# using the Unity game engine for cross-platform development.
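As a rough illustration of the kind of C# scripting involved (a hypothetical sketch, not code from the thesis — the class and field names are mine), toggling the visibility of an anatomical structure in Unity can be done by enabling or disabling its renderers:

```csharp
using UnityEngine;

// Hypothetical sketch: show or hide one anatomical structure
// (e.g. a muscle mesh) in response to a UI toggle.
public class StructureToggle : MonoBehaviour
{
    // Assigned in the Inspector: the GameObject holding the structure's mesh.
    public GameObject structure;

    public void SetVisible(bool visible)
    {
        // Disable the renderers rather than deactivating the GameObject,
        // so any attached components keep running while the mesh is hidden.
        foreach (var r in structure.GetComponentsInChildren<Renderer>())
            r.enabled = visible;
    }
}
```

A method like `SetVisible` can be wired to a spatial UI button or a Leap Motion gesture callback, giving the per-structure toggling described above.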
The 3D models of the hand and forearm were obtained from Kristen Browne at the National Institutes of Health (NIH) and were based on the male model in the NLM’s Visible Human Project.
Autodesk 3ds Max, a 3D modeling and rendering package, was used for morphing, position adjustments, and manual blend-weighting of vertices while rigging the mesh for integration with Leap Motion’s Hands Module 2.0, an automated pipeline for connecting 3D models to Unity under the Leap Motion Orion beta.
Pixologic ZBrush, a digital 3D sculpting and painting tool, was used to paint, sculpt, and reduce polygon counts of the anatomical 3D models for real-time rendering and overall performance optimization of the application.
Prior to export, a UV map was generated to translate the surface of each 3D mesh into 2D image space. Conveniently, the ZBrush plug-in UV Master can automatically generate a UV map from any subdivision level. After a UV map was generated for each anatomical structure at its lowest subdivision, the UVs were re-projected at the highest subdivision to ensure matching maps after baking. Accordingly, the retopologized, low-poly mesh of each individual muscular and skeletal structure was exported from ZBrush as an .OBJ file at its lowest subdivision level, keeping the total polycount of the Unity VR scene manageable.
Then, to recover the desired level of detail, the high-poly mesh of each 3D structure was baked into a diffuse map (upper left image) and a normal map (upper right image), imitating a higher-quality appearance on the low-poly model.
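Once imported into Unity, the baked maps are bound to each structure’s material. A minimal sketch, assuming the built-in Standard shader (the property names `_MainTex` and `_BumpMap` and the class name are the Standard shader’s conventions, not necessarily what ViRA used):

```csharp
using UnityEngine;

// Hypothetical sketch: assign the baked diffuse and normal maps
// to a structure's material at load time (Standard shader assumed).
public class BakedMapBinder : MonoBehaviour
{
    public Texture2D diffuseMap; // baked from the high-poly mesh
    public Texture2D normalMap;  // imported with the "Normal map" texture type

    void Start()
    {
        var mat = GetComponent<Renderer>().material;
        mat.SetTexture("_MainTex", diffuseMap);
        mat.SetTexture("_BumpMap", normalMap);
        mat.EnableKeyword("_NORMALMAP"); // Standard shader needs this keyword
    }
}
```

In practice the maps can also be assigned once in the Inspector; scripting the binding is mainly useful when structures and their textures are loaded or swapped at runtime.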
The process is visualized here for the dorsal interossei muscles and was repeated for every musculoskeletal structure showcased in ViRA.