Multisensory integration with a head-mounted display: Background visual motion and sound motion

Harrison, William J., Thompson, Matthew B. and Sanderson, Penelope M. (2010) Multisensory integration with a head-mounted display: Background visual motion and sound motion. Human Factors, 52(1): 78-91. doi:10.1177/0018720810367790


Author Harrison, William J.
Thompson, Matthew B.
Sanderson, Penelope M.
Title Multisensory integration with a head-mounted display: Background visual motion and sound motion
Journal name Human Factors
ISSN 0018-7208
1547-8181
Publication date 2010-02
Sub-type Article (original research)
DOI 10.1177/0018720810367790
Volume 52
Issue 1
Start page 78
End page 91
Total pages 14
Place of publication Santa Monica, CA, United States
Publisher Sage
Collection year 2011
Language eng
Formatted abstract
Objective:
The aim of this study was to assess how background visual motion and the relative movement of sound affect a head-mounted display (HMD) wearer’s performance at a task requiring integration of auditory and visual information.

Background:
HMD users are often mobile. A commercially available speaker in a fixed location delivers auditory information affordably to the HMD user. However, previous research has shown that mobile HMD users perform poorly at tasks that require integration of visual and auditory information when sound comes from a free-field speaker. The specific cause of the poor task performance is unknown.

Method:
Participants counted audiovisual events that required integration of sounds delivered via a free-field speaker and vision on an HMD. Participants completed the task while walking around a room, sitting in the room, or sitting inside a mobile room that allowed separate manipulation of background visual motion and speaker motion.

Results:
Participants’ accuracy at counting target audiovisual events was worse when participants were walking than when sitting at a desk, p = .032. Compared with when they were sitting at a desk, participants’ accuracy at counting target audiovisual events tended to be worse when they experienced a combination of background visual motion and the relative movement of sound, p = .058.

Conclusion:
Multisensory integration performance is least effective when HMD users experience a combination of background visual motion and relative movement of sound. Eye reflexes may play an important role.

Application: 
Results apply to situations in which HMD wearers are mobile when receiving multimodal information, as in health care and military contexts.
Copyright © 2010, Human Factors and Ergonomics Society.

Keyword Self-motion
Temporal-order
Attention
Performance
Judgment
Demands
Pursuit
Q-Index Code C1
Q-Index Status Confirmed Code
Institutional Status UQ

Document type: Journal Article
Sub-type: Article (original research)
Collections: Official 2011 Collection
School of Information Technology and Electrical Engineering Publications
School of Psychology Publications
 