Visualization of Functional Connectivity Networks Using VR Glasses

Gonzalo Rojas 1, Jorge Fuentes 2, Carlos Montoya 2, María de la Iglesia-Vaya3,4, Marcelo Gálvez 2

1Laboratory of Medical Image Processing, Clinica las Condes, Santiago, Chile, 2 Department of Radiology, Clinica las Condes, Santiago, Chile, 3Centre of Excellence in Biomedical Image (CEIB), Regional Ministry of Health in the Valencia Region (CS), C./Micer Masco nº 31-33, Valencia 46010, Spain, 4Brain Connectivity Lab, Prince Felipe Research Centre (CIPF), C./Eduardo Primo Yúfera (Científic), nº 3, Valencia 46012, Spain

 

 

Introduction

 

The visualization of complex neuroimages, such as tractography, functional connectivity, and functional imaging, is a difficult task: these parametric images must be displayed in a way that conveys enough information to be understood. In previous work we presented several visualization techniques for neuroimages, such as anaglyph, augmented reality, and virtual reality using Cardboard (Rojas et al., 2014; Rojas et al., 2015; Rojas et al., 2016).

 

Virtual reality (VR) is a computer-generated environment of scenes or real-life-like objects that creates the feeling of being immersed in it. The environment is viewed through a specific device, usually known as VR glasses or a VR headset. There are different types of VR devices: VR glasses (such as Oculus Rift, HTC Vive, Sony PlayStation VR, Samsung Gear VR, and Google Daydream View) and cheaper devices such as Google Cardboard (Google Inc.).

 

Here we describe a Windows-based HTC Vive VR system that allows viewing the seven Yeo functional connectivity networks (Yeo et al., 2011), with the EEG 10-20 electrode positions superimposed on those networks.

 

 

Methods

 

Using the MNI152 2 mm standard-space T1-weighted average structural template image, we created a brain mesh with the 3D Slicer 3.6.3 Grayscale Model Maker module (marching cubes; threshold: 5800, smooth: 50, decimate: 0.25). The seven-network Yeo liberal mask (surfer.nmr.mgh.harvard.edu/fswiki/CorticalParcellation_Yeo2011) was used to create the mesh models of the seven standard Yeo functional networks (3D Slicer Model Maker module; smooth: 10, filter type: Sinc, decimate: 0.25, split normals, point normals, pad). The HC Laplacian smoothing algorithm (MeshLab v1.2.2) was used to smooth the brain mesh model and the mesh of each of the seven Yeo networks. The hippocampus and corpus callosum meshes were created with 3D Slicer from the MNI152 2 mm standard-space T1 image segmented with FreeSurfer v5.3.

 

The Windows software was written in the C# language using the following tools: Unity 5 (graphics engine; www.unity3d.com), 3ds Max (meshes and materials; www.autodesk.com), and GIMP (textures; www.gimp.org).
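As an illustration, the sketch below shows how the exported mesh models could be wired together in a Unity 5 scene. This is a minimal, hypothetical example: the component name (BrainSceneSetup), the field names, and the transparency value are our assumptions, not the actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch (not the authors' actual code): wiring the exported mesh models
// into a Unity 5 scene. All names and the alpha value are assumptions.
public class BrainSceneSetup : MonoBehaviour
{
    public GameObject brain;            // semi-transparent cortical surface (MNI152 mesh)
    public GameObject[] yeoNetworks;    // meshes of the seven Yeo (2011) networks
    public GameObject corpusCallosum;   // optional anatomical structure
    public GameObject hippocampus;      // optional anatomical structure

    void Start()
    {
        // Start with only the brain shell visible; networks and structures are toggled later.
        foreach (GameObject net in yeoNetworks)
            net.SetActive(false);
        corpusCallosum.SetActive(false);
        hippocampus.SetActive(false);

        // Make the cortical surface semi-transparent so a selected network remains visible
        // inside it (assumes the brain material uses a shader with an alpha channel).
        Renderer rend = brain.GetComponent<Renderer>();
        Color c = rend.material.color;
        c.a = 0.3f;
        rend.material.color = c;
    }
}
```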

 

 

Results

 

We created a Windows-based VR system that uses HTC Vive glasses.

 

The user views a 3D transparent brain fused with one of the seven standard functional connectivity networks (Yeo et al., 2011), selected by the user. The network is chosen with the computer keyboard, and the model can be zoomed in or out by pressing the left and right arrow keys. The corpus callosum and hippocampus can be added to the visualization by pressing the “c” or “h” keys, respectively.
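A minimal sketch of this keyboard interaction is shown below, reusing the hypothetical BrainSceneSetup component from the previous sketch. The 1-7 key mapping, the zoom step value, and the zoom direction are assumptions; only the arrow keys and the “c”/“h” toggles are stated in the text.

```csharp
using UnityEngine;

// Sketch of the keyboard interaction described above (assumed key mapping and zoom factor).
public class KeyboardController : MonoBehaviour
{
    public BrainSceneSetup scene;   // references to the brain, network, and structure meshes
    public Transform brainRoot;     // parent transform of all meshes
    public float zoomStep = 1.02f;  // per-frame scale factor (assumed value)

    void Update()
    {
        // Number keys 1-7 (assumed) show one of the seven Yeo networks at a time.
        for (int i = 0; i < scene.yeoNetworks.Length; i++)
        {
            if (Input.GetKeyDown(KeyCode.Alpha1 + i))
            {
                for (int j = 0; j < scene.yeoNetworks.Length; j++)
                    scene.yeoNetworks[j].SetActive(j == i);
            }
        }

        // Right/left arrow keys zoom the whole model in/out (direction mapping assumed).
        if (Input.GetKey(KeyCode.RightArrow))
            brainRoot.localScale *= zoomStep;
        if (Input.GetKey(KeyCode.LeftArrow))
            brainRoot.localScale /= zoomStep;

        // "c" and "h" toggle the corpus callosum and hippocampus meshes.
        if (Input.GetKeyDown(KeyCode.C))
            scene.corpusCallosum.SetActive(!scene.corpusCallosum.activeSelf);
        if (Input.GetKeyDown(KeyCode.H))
            scene.hippocampus.SetActive(!scene.hippocampus.activeSelf);
    }
}
```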

 

The application also shows a functional and anatomical description of each functional connectivity network.
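One hypothetical way to attach such descriptions in Unity is sketched below; the UI Text label and the description strings are placeholders, not the text shown by the application.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of displaying a per-network description.
public class NetworkDescription : MonoBehaviour
{
    public Text descriptionLabel;                 // UI label shown next to the brain model
    public string[] descriptions = new string[7]; // one functional/anatomical summary per Yeo network

    // Called when the user selects network 'networkIndex' (0-6).
    public void ShowDescription(int networkIndex)
    {
        if (networkIndex >= 0 && networkIndex < descriptions.Length)
            descriptionLabel.text = descriptions[networkIndex];
    }
}
```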

 

 

Conclusions

 

The system described here is an example of using the latest generation of VR glasses (HTC Vive) to visualize complex functional connectivity networks. It may be possible to further adapt the system to display other neuroimaging techniques, such as tractography.

 

This system has potential as an academic tool, because it shows, in an immersive 3D environment, the cortical position of the different regions of each functional network, conveying the depth and breadth of the brain.

 

If we compare VRiBraiN (a VR Android-based application using Cardboard devices; Rojas et al., 2016) with the HTC Vive-based visualization solution described here:

i) the virtual reality solution using Cardboard (Rojas et al., 2016) is cheaper, and therefore more accessible, than the HTC Vive virtual reality glasses;

ii) the HTC Vive-based system allows more advanced control for rotating or zooming the brain (using a joystick or another peripheral) than the Cardboard-based application VRiBraiN (Rojas et al., 2016).

 

 

References

 

Rojas, G.M., Gálvez, M., Vega Potler, N., et al. (2014). ‘Stereoscopic three-dimensional visualization applied to multimodal brain images: clinical applications and a functional connectivity atlas’, Front Neurosci 8: 328. doi: 10.3389/fnins.2014.00328.

 

Rojas, G.M., Fuentes, J., Gálvez, M., Margulies, D.S. (2015). ‘Augmented Reality rsfc-MRI Visualization Application: ARiBraiN3T, iBraiN, iBraiNEEG: New Versions’, 21st Annual Meeting of the Organization for Human Brain Mapping, Honolulu, Hawaii, USA, June 2015.

 

Rojas, G.M., Fuentes, J., Montoya, C., de la Iglesia-Vayá, M., Gálvez, M. (2016). ‘Virtual Reality Intrinsic Functional Connectivity Visualization Application: VRiBraiN’, 22nd Annual Meeting of the Organization for Human Brain Mapping, Geneva, Switzerland, June 2016.

 

Yeo, B.T., Krienen, F.M., Sepulcre, J., et al. (2011). ‘The organization of the human cerebral cortex estimated by intrinsic functional connectivity’, J Neurophysiol 106(3):1125-65.

 

