Synchronized Audio-Visual Transients Drive Efficient
Visual Search for Motion-in-Depth
Marina Zannoli 1,2 *, John Cass 3, Pascal Mamassian 1,2, David Alais 4

1 Université Paris Descartes, Sorbonne Paris Cité, Paris, France, 2 Laboratoire Psychologie de la Perception, CNRS UMR 8158, Paris, France, 3 School of Psychology, University of Western Sydney, Sydney, New South Wales, Australia, 4 School of Psychology, University of Sydney, Sydney, New South Wales, Australia
Abstract
In natural audio-visual environments, a change in depth is usually correlated with a change in loudness. In the present
study, we investigated whether correlating changes in disparity and loudness would provide a functional advantage in
binding disparity and sound amplitude in a visual search paradigm. To test this hypothesis, we used a method similar to
that used by van der Burg et al. to show that non-spatial transient (square-wave) modulations of loudness can drastically
improve spatial visual search for a correlated luminance modulation. We used dynamic random-dot stereogram displays to
produce pure disparity modulations. Target and distractors were small disparity-defined squares (either 6 or 10 in total).
Each square moved back and forth in depth in front of the background plane at different phases. The target’s depth
modulation was synchronized with an amplitude-modulated auditory tone. Visual and auditory modulations were always
congruent (both sine-wave or both square-wave). In a speeded search task, five observers were asked to identify the target as
quickly as possible. Results show a significant improvement in visual search times in the square-wave condition.
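The stimulus design described above can be sketched in a few lines: each square's depth is modulated by a sine or square wave, and only the target's modulation is in phase with the amplitude envelope of the tone. This is a minimal illustration, not the authors' code; the modulation rate, duration, sample rate, and phase-offset scheme are hypothetical placeholders.

```python
import numpy as np

def modulation(t, freq_hz, shape="square"):
    """Unit-amplitude modulation waveform (sine or square) over time t."""
    phase = 2 * np.pi * freq_hz * t
    return np.sign(np.sin(phase)) if shape == "square" else np.sin(phase)

# Hypothetical parameters: 1 Hz modulation sampled at 60 Hz for 2 s.
t = np.arange(0, 2, 1 / 60)
target_disparity = modulation(t, 1.0, "square")   # target depth profile
tone_amplitude = modulation(t, 1.0, "square")     # congruent AM tone envelope

# Distractors modulate at the same rate but with random phase offsets,
# so only the target is synchronized with the sound.
rng = np.random.default_rng(0)
n_distractors = 5                                 # 6 squares total incl. target
offsets = rng.uniform(0, 1, n_distractors)        # phase offsets, in cycles
distractors = [modulation(t + off, 1.0, "square") for off in offsets]
```

Swapping `"square"` for `"sine"` in both the target and tone calls gives the congruent sine-wave condition, in which the abstract reports no comparable search benefit.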