The quest to apply VR technology to rehabilitation: tribulations and treasures

Abstract

The papers that follow stem from a symposium presented at the International Society for Posture and Gait Research (ISPGR) meeting in Seville, Spain, in July 2015. Four speakers were charged with presenting their methods of applying virtual reality (VR) technology to obtain meaningful rehabilitation outcomes. The symposium aimed to explore characteristics of VR that modify the mechanisms supporting motor relearning. Common impairments in posture and gait that can be modulated within virtual environments by employing motor learning concepts, including sensory augmentation and repetition, were examined. Critical overviews of VR applications that address different therapeutic objectives for improving posture and gait in individuals with neurological insult or injury were presented. A further goal was to identify approaches and efforts to bridge the gap between knowledge generation in research and knowledge uptake in clinical practice. Specific objectives of this symposium were that participants be able to: 1) identify benefits and limitations of selecting VR as an intervention tool; 2) discuss how VR relates to principles of motor relearning following neurological insult or injury; and 3) identify areas and methods for future translation of VR technology in clinical and home-based settings. Our symposium concluded that the application of VR technology in assessment, treatment, and research has yielded promising results in transferring learned cognitive and motor skills to more natural environments. VR permits the user to interact with a multidimensional and multisensory environment in real time, and offers the opportunity to provide both standardized and individualized interventions while monitoring behavior.

VR technology has rapidly become a popular approach to physical and psychological interventions that require an individual to actively participate in their environment [1, 7, 23]. The most popular form of this technology for intervention focuses on games or environments that require particular movement behaviors or provide positive reinforcement to encourage repetition [21]. Early applications of VR technology in assessment, treatment, and research have yielded promising results for transferring cognitive and motor skills acquired in virtual environments to more realistic and natural environments. VR permits the user to interact with a multidimensional and multisensory environment in real time, and offers the opportunity to provide both standardized and individualized interventions while monitoring the resulting behaviors. However, if we are to influence more automatic behaviors such as locomotion and balance, we need to understand how technology development interfaces with human performance and how therapeutic interventions can be adapted to employ the technology effectively.

Traditional approaches to vestibular diagnosis and intervention have emerged primarily from our understanding of the predominant reflex pathways for eye and head stabilization. A long history of anatomical and physiological investigations has provided us with a reasonable understanding of how labyrinthine signals govern the motions of the eyes with respect to the head, and the head with respect to the body. As far back as 1912, Magnus and de Kleijn described the summative effects of vestibulocollic and cervicocollic reflexes to orient the head in space relative to the position of the body [24]. The progression of this research from decerebrate to intact mammals revealed contributions from other body segments and redundant mechanisms supporting the process of postural stabilization [11]. It has now become apparent that the central processing of postural stabilization and orientation in space depends upon the contributions of multiple sensory pathways as well as the goal and demands of the specific motor task [27].

The premise that control of posture and orientation is the result of a linear summation of two or three inputs (i.e., proprioceptive, vestibular, and, perhaps, visual) is too simplistic [10]. Posture and orientation behaviors exhibit characteristics attributed to the processes of sensorimotor integration. There is evidence that responses to more than one input modality are non-additive [3, 13]. Functional outputs are modified by sensory information from both the body and the environment [5]. Changes in postural orientation can produce a change in perception of the environment [20]. These behaviors are also modulated by both intention-based and stimulus-based actions (feedforward and feedback) [9], indicative of ascending as well as descending control processes. The presence of redundant control mechanisms permits a successful outcome with response variability and task dependence. Acknowledging these redundant control mechanisms for posture and orientation therefore implies that the principles governing sensorimotor relearning can also be applied to interventions and assessments for impairments of gait and balance control.

Locomotion involves the integration of multiple sources of sensory information (e.g., vestibular, proprioceptive, visual), together with the motor command, to generate finely coordinated locomotion adapted to contextual demands. The review paper by Rhea and Kuznetsov highlights the essential role of the vestibular system in providing the sensory information required to stabilize head orientation during walking, ensure gaze stability, and engage lower body musculature to maintain postural stability. Vestibular dysfunction can impair the control of local and global gait variables, resulting in diverse gait alterations such as increased trunk sway and decreased gait speed, respectively. Rhea and Kuznetsov use motor learning principles to create an innovative visual feedback protocol that can be incorporated easily with VR technology to manipulate the fractal variation of stepping. The idea of constructing a visual metronome with avatars to optimize implicit learning and enhance gait control is not only tantalizing but also feasible given recent advances in computer engineering. Exciting evidence suggests that a reorganization of neuromotor behavior had occurred, with the newly emerged fractal pattern retained even after the feedback was removed [22].
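
For readers who wish to prototype such a cue, the sketch below is a minimal illustration of the general idea, not the protocol of Rhea and Kuznetsov: it generates a sequence of inter-step intervals whose fluctuations follow an approximately 1/f ("fractal") structure, which could then drive the timing of a visual metronome or stepping avatar. The mean interval, variability, and NumPy-based spectral-shaping method are all illustrative assumptions.

```python
# Minimal sketch (illustrative only): a 1/f ("fractal") sequence of
# inter-step intervals that could pace a visual stepping cue.
import numpy as np

def fractal_intervals(n_steps=256, mean_interval=1.0, sd=0.05, seed=0):
    """Return n_steps inter-step intervals (s) whose fluctuations follow ~1/f noise."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_steps)        # white Gaussian noise
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n_steps)
    freqs[0] = freqs[1]                         # avoid division by zero at DC
    # Scale amplitudes by 1/sqrt(f) so that power falls off as ~1/f ("pink" noise)
    pink = np.fft.irfft(spectrum / np.sqrt(freqs), n=n_steps)
    pink = (pink - pink.mean()) / pink.std()    # normalize to zero mean, unit variance
    return mean_interval + sd * pink            # intervals fluctuate around the mean

intervals = fractal_intervals()
cue_times = np.cumsum(intervals)                # times at which the visual cue would step
```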

Another creative use of VR technology in gait rehabilitation is the manipulation of steering or heading control. The control of heading direction is a requirement for goal-directed locomotion, that is, moving in the desired direction while avoiding obstacles along the path. Optic flow, together with the perceived goal direction, is one of the visual cues used to control heading direction while walking [25, 29]. When walking straight, optic flow describes a radial pattern of expansion with a focus of expansion located in the direction of heading [29]. When a change of direction is needed, optic flow theory stipulates that the focus of expansion is realigned with the desired heading direction [6]. In support of this theory, it has been shown that offsetting the focus of expansion of optic flow with prisms or through VR causes healthy individuals to veer from their initial trajectory when instructed to walk straight ahead [8, 25]. Lamontagne has created a unique paradigm to simulate different optic flows in a virtual environment [2]. The research paper by Hanna, Fung and Lamontagne provides evidence that non-visual cues, including vestibular and head-neck somatosensory information, are required for proper control of a straight walking trajectory.
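
To make the role of the focus of expansion concrete, the equations below sketch the standard geometry of translational optic flow under a pinhole-projection assumption; the notation is introduced here for illustration and is not drawn from the cited papers.

```latex
% Translational optic flow under a pinhole projection (illustrative notation).
% (x, y): image-plane position of a point at depth Z; f: focal distance;
% (T_x, T_y, T_z): translation velocity of the observer.
\[
\dot{x} = \frac{x\,T_z - f\,T_x}{Z}, \qquad
\dot{y} = \frac{y\,T_z - f\,T_y}{Z}
\]
% The flow vanishes at the focus of expansion, which specifies the heading:
\[
\bigl(x_{\mathrm{FOE}},\, y_{\mathrm{FOE}}\bigr)
  = \left(\frac{f\,T_x}{T_z},\; \frac{f\,T_y}{T_z}\right)
\]
```

In this geometry, shifting the focus of expansion by a given visual angle, whether with prisms or in VR, shifts the visually specified heading by that same angle, which is consistent with the veering behavior reported in [8, 25].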

Early studies identifying the mechanisms generating postural reactions revealed that the presence or absence of vision had little effect on the organization of the postural response. Dynamic visual feedback, however, modified postural behaviors, which became highly correlated with the frequency and amplitude of the visual scene motion [4, 15]. Thus, in natural environments, visual signals greatly impact the postural orientation of an individual. Keshner and colleagues [12] applied these findings in the development of their virtual environment to demonstrate that the dynamics of the visual field differentially influenced postural behaviors. They observed that the visual environment exerted much greater power when combined with support surface disturbances, and that the support surface disturbance exerted greater power over head and trunk sway when combined with visual flow [13].

These results have significant implications for the measurement of postural activity and for the important role of the virtual environment in research and rehabilitation. The adaptive nature of the human nervous system makes it imperative that we test and train individuals in conditions as close as possible to those commonly encountered during daily activities. This concept is supported in the paper by Wright, Tierney and McDevitt, who present strong evidence that currently accepted clinical tools used to assess damage to the vestibular system are inadequate for identifying the subtle signs that indicate long-term sequelae of a mild TBI or concussion. This group compares the results of several popular and conventional assessments of postural control with results from their newly developed assessment tool that explores the response to combined postural and visual demands using a VR system. Their results are indicative of impaired processing of combined visual-vestibular demands following mild TBI and advocate for diagnostic tools that place more complex processing demands on the CNS than the established tests of automatic and reflex behaviors.

Models of motor learning and relearning have identified several factors as essential to the production of effective movement under changing environmental circumstances. Sensory feedback from the world is noisy, and the relationship between a motor command and the movement it produces is variable and dependent on the momentary state of the body and the environment [5]. VR technology allows for continuous, dynamic delivery of sensory feedback during the course of a movement that can be graded to match or augment what would occur in the natural environment. In addition, our ability to predict the effectiveness of a response depends on the continual calibration of internal models in the CNS that predict sensory consequences and adjust our responses to sensory error through motor adaptation [26]. An intrinsic sensory disparity exists between our egocentric feedback and the visual and haptic feedback provided by the virtual environment. Unless augmented with haptic gloves and tactile sensors, objects in the virtual environment provide no force feedback. When manipulating objects in the environment, the reference frame of the object (e.g., the computer monitor) is often not matched to the spatial coordinates of the upper limb. This spatial mismatch must be computed and compensated for in order to produce a successful movement.
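
As a hypothetical illustration of that adjustment (the function name, rotation, and offset values below are assumptions made for this sketch, not a published method), a rigid-body transform can express a target rendered in monitor coordinates within a limb-centered frame:

```python
# Hypothetical sketch: expressing a target shown on a monitor in a
# limb-centered (e.g., shoulder-centered) coordinate frame.
import numpy as np

def monitor_to_limb(point_on_screen_m, R_screen_to_limb, screen_origin_in_limb_m):
    """Map a 3D point given in screen coordinates into limb-centered coordinates."""
    return R_screen_to_limb @ point_on_screen_m + screen_origin_in_limb_m

# Example values (illustrative only): screen tilted 20 deg about its x-axis,
# with its origin 0.5 m in front of the shoulder.
theta = np.deg2rad(20)
R = np.array([[1, 0, 0],
              [0, np.cos(theta), -np.sin(theta)],
              [0, np.sin(theta),  np.cos(theta)]])
target_screen = np.array([0.10, 0.05, 0.0])   # target location on the screen (m)
offset = np.array([0.0, 0.0, 0.5])            # screen origin relative to the shoulder (m)
target_limb = monitor_to_limb(target_screen, R, offset)
```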

Sensory mismatch presented in the virtual world may be manipulated to support and shape motor adaptation. For example, it has been shown that during quiet standing, optic flow of the virtual world will modify postural orientation and sway in direct relation to the direction and velocity of the visual motion [12, 28]. As shown in the model (Fig. 1) proposed by Miall and Wolpert [18], even engaging in a game in the virtual world can alter motor control processes if the actual sensory feedback from the performer's actions does not match the predicted sensory feedback. Such a mismatch produces a discrepancy that transforms the error between the desired and actual sensory outcome of a movement into a corresponding error in the motor command, thereby providing the appropriate signal for motor learning.

Fig. 1. Model of forward and inverse learning in physical and virtual environments, adapted from Miall and Wolpert [18].
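
To make the error-driven loop in Fig. 1 concrete, here is a deliberately simplified sketch (an assumed toy model, not Miall and Wolpert's formulation) in which an internal estimate of a visual gain is updated from the sensory prediction error that arises when the virtual environment rescales the visual consequences of a movement:

```python
# Toy illustration (assumed simplification): error-driven adaptation of an
# internal estimate of a visual gain imposed by a virtual environment.
def adapt_to_visual_gain(true_gain=1.3, target=10.0, learning_rate=0.2, n_trials=30):
    est_gain = 1.0                                   # internal model: assumes veridical feedback
    for trial in range(n_trials):
        command = target / est_gain                  # inverse model: command chosen to reach the target
        predicted = est_gain * command               # forward model: predicted sensory outcome
        observed = true_gain * command               # actual feedback from the virtual environment
        error = observed - predicted                 # sensory prediction error
        est_gain += learning_rate * error / command  # update the internal model from the error
        print(f"trial {trial:2d}: prediction error = {error:+.3f}")
    return est_gain

adapt_to_visual_gain()
```

In this toy loop the prediction error shrinks across trials as the internal estimate converges on the imposed gain, mirroring the adaptation process described above.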

There is ample evidence that training in the VR environment supports and facilitates sensorimotor relearning for posture and balance [1, 14, 16, 19, 30]. One of the most frequently cited properties of VR for rehabilitation success is the motivation it provides the performer to engage in repetition and learning. Incorporating the cognitive and perceptual functions of motor performance with the performance of the actual motor skill will enhance motor learning and task transfer [19]. The paper by Kizony, Zeilig, Krasovsky, Bondi, Weiss, Kodesh and Kafri demonstrates how VR can be used to augment motor performance in an ecologically valid environment to provide meaningful practice and repetition that will transfer into the physical world. By simulating a shopping mall and asking both young and older adults to navigate through the mall while on a treadmill to complete specific shopping tasks, these investigators were able to incorporate training for the real-world demands of this task, which include executive decision-making and planning functions.

The emergence of VR as a rehabilitation technology requires that we determine how it can best be used to support and modify human behavior. A significant strength of VR is that we can incorporate many of the motor learning principles required for successful rehabilitation. There is great potential for VR technology to provide opportunities to target motor learning principles based on variables of practice, augmented feedback, motivation, and observational learning. VR has also been shown to be particularly motivating for consistent repetition. A weakness in the use of this easily acquired technology is that many clinicians use it to motivate clients to exercise without identifying or controlling for the actual cognitive and neuromuscular parameters that could be modified by these activities [31]. The papers that follow in this special issue discuss how impairment and improvement in motor behavior can be measured in a VR environment, the strengths and weaknesses of VR systems available for use in clinical settings, and the decisions involved in developing a VR system targeted for clinical intervention.

Acknowledgments

We would like to thank Chris Rhea, Anouk Lamontagne, Geoffrey Wright and Rachel Kizony for their participation at the ISPGR symposium in 2015 (Seville, Spain) and for their contribution to the review and research papers that follow.

References

[1] Adamovich S.V., et al., Sensorimotor training in virtual reality: A review, NeuroRehabilitation 25 (2009), 29–44.

[2] Berard J.R., Fung J. and Lamontagne A., Evidence for the use of rotational optic flow cues for locomotor steering in healthy older adults, J Neurophysiol 106 (2011), 1089–1096.

[3] Bugnariu N. and Fung J., Aging and selective sensorimotor strategies in the regulation of upright balance, J Neuroeng Rehabil 4 (2007), 19.

[4] Dichgans J. and Brandt T., Visual-vestibular interaction: Effects on self-motion perception and postural control, in: Perception, Held R., et al., eds, Springer, Berlin, 1978, pp. 755–804.

[5] Dokka K., Kenyon R.V. and Keshner E.A., Influence of visual motion and support surface cues on segmental orientation, Gait Posture 30 (2009), 211–216.

[6] Gibson J.J., The visual perception of objective motion and subjective movement, Psychol Rev 101 (1994), 318–323.

[7] Grynszpan O., et al., Innovative technology-based interventions for autism spectrum disorders: A meta-analysis, Autism 18(4) (2014), 346–361.

[8] Harris J.M. and Bonas W., Optic flow and scene structure do not always contribute to the control of human walking, Vision Res 42 (2002), 1619–1626.

[9] Herwig A., Prinz W. and Waszak F., Two modes of sensorimotor integration in intention-based and stimulus-based actions, Q J Exp Psychol (Hove) 60(11) (2007), 1540–1554.

[10] Johansson R., et al., Multi-stimulus multi-response posturography, Math Biosci 174 (2001), 41–59.

[11] Keshner E.A., et al., Patterns of neck muscle activation in cats during reflex and voluntary head movements, Exp Brain Res 88 (1992), 361–374.

[12] Keshner E.A. and Kenyon R.V., The influence of an immersive virtual environment on the segmental organization of postural stabilizing responses, J Vestib Res 10 (2000), 201–219.

[13] Keshner E.A., Kenyon R.V. and Langston J., Postural responses exhibit intra-modal dependencies with discordant visual and support surface motion, J Vestib Res 14 (2004), 307–319.

[14] Keshner E.A. and Weiss P.L., Accessing new technologies for teaching and intervention (Editors, Special Issue), Journal of Physical Therapy Education 25 (2011), 3–4.

[15] Lestienne F., Soechting J. and Berthoz A., Postural readjustments induced by linear motion of visual scenes, Exp Brain Res 28 (1977), 363–384.

[16] Levin M.F., Weiss P.L. and Keshner E.A., Emergence of VR as a tool for upper limb rehabilitation, Phys Ther 95 (2015), 415–425.

[17] Meldrum D., et al., Effectiveness of conventional versus virtual reality-based balance exercises in vestibular rehabilitation for unilateral peripheral vestibular loss: Results of a randomised controlled trial, Arch Phys Med Rehabil 96(7) (2015), 1319–1328.

[18] Miall R.C. and Wolpert D.M., Forward models for physiological motor control, Neural Networks 9 (1996), 1265–1279.

[19] Molina K.I., et al., Virtual reality using games for improving physical functioning in older adults: A systematic review, J Neuroeng Rehabil 11 (2014), 156.

[20] Previc F.H., The effects of dynamic visual stimulation on perception and motor control, J Vestib Res 2 (1992), 285–295.

[21] Proffitt R. and Lange B., Considerations in the efficacy and effectiveness of virtual reality interventions for stroke rehabilitation: Moving the field forward, Phys Ther 95 (2015), 441–448.

[22] Rhea C.K., et al., Fractal gait patterns are retained after entrainment to a fractal stimulus, PLoS One 9 (2014), e106755.

[23] Rizzo A., et al., Virtual reality applications for addressing the needs of those aging with disability, Stud Health Technol Inform 163 (2011), 510–516.

[24] Russell B.W., Case of hemiplegia with epileptiform convulsions; exhibiting tonic innervation and tonic neck reflex of Magnus and de Kleijn, Proc R Soc Med 19 (1926), 10.

[25] Sarre G., et al., Steering behaviour can be modulated by different optic flows during walking, Neurosci Lett 436 (2008), 96–101.

[26] Shadmehr R., Smith M.A. and Krakauer J.W., Error correction, sensory prediction, and adaptation in motor control, Annu Rev Neurosci 33 (2010), 89–108.

[27] Shumway-Cook A. and Woollacott M., Attentional demands and postural control: The effect of sensory context, J Gerontol A Biol Sci Med Sci 55 (2000), M10–M16.

[28] Wang Y., Kenyon R.V. and Keshner E.A., Identifying the control of physically and perceptually evoked sway responses with coincident visual scene velocities and tilt of the base of support, Exp Brain Res 201 (2010), 663–672.

[29] Warren W.H. Jr., et al., Optic flow is used to control human walking, Nat Neurosci 4 (2001), 213–216.

[30] Wright W.G., et al., Sensorimotor recalibration in virtual environments, in: Virtual Reality Technologies for Health and Clinical Applications (Series editor: Sharkey P.), Vol. 1: Applying Virtual Reality Technologies to Motor Rehabilitation, Weiss P.L., Keshner E.A. and Levin M.F., eds, Springer, NY, 2014, pp. 71–94.

[31] Weiss P.L., Keshner E.A. and Levin M.F., Current and future trends for VR and motor rehabilitation, in: Virtual Reality Technologies for Health and Clinical Applications (Series editor: Sharkey P.), Vol. 1: Applying Virtual Reality Technologies to Motor Rehabilitation, Weiss P.L., Keshner E.A. and Levin M.F., eds, Springer, NY, pp. 1–4.