Modulation of visually evoked postural responses by contextual visual, haptic and auditory information: a ‘virtual reality check’

Georg F. Meyer, Fei Shao, Mark D. White, Carl Hopkins, Antony J. Robotham

Research output: Contribution to journal › Article › peer-review

Abstract

Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPRs). These VEPRs are not simple responses to optokinetic stimulation, but are modulated by the configuration of the environment. The aim of this paper is to explore which factors modulate VEPRs in a high-quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: 1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points, which can be haptic, visual or auditory reference signals; 2) real objects and their matching virtual reality representations have different effects on postural sway when used as visual anchors; 3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or by the reality of objects that can serve as visual anchors in the scene. We conclude that automatic postural responses to laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high-fidelity virtual environments should mimic those seen in real situations, we propose to use the observed effect as a robust objective test for presence and fidelity in VR.
Original language: English
Pages (from-to): 1-15
Number of pages: 15
Journal: PLoS ONE
Volume: 8
Issue number: 6
Publication status: Published - 28 Jun 2013
