|Event title:||CVS Research Talk: Ross Maddox|
Better listening through uninformative visual stimuli
|Affiliation:||U. Rochester, BME|
|Location:||269 Meliora Hall, Kresge Rm|
|Abstract:||Listening in the real world is a multisensory endeavor. When visual and auditory information are both available, they are integrated to achieve the most accurate perception of complicated sensory scenes. A salient example of this integration is the ventriloquism effect, in which a visual stimulus "captures" the location of an auditory stimulus in a way that is well described by optimal integration of location information. But when a visual stimulus offers no task-relevant information to integrate, can it still affect auditory perception? In this talk we will present two examples from our research in which task-uninformative visual stimuli improved auditory spatial discrimination, and discuss possible underlying mechanisms.|
In the first experiment, we used a visual primer stimulus to direct listeners' eye gaze while keeping their heads fixed. We found that auditory spatial discrimination improved when gaze was directed toward the auditory stimulus compared with when it was not. There was no improvement when the spatial primer stimulus was auditory rather than visual, indicating that attention alone does not explain the results and that eye gaze is an essential factor.
In the second experiment, we presented listeners with two symmetrically lateralized auditory stimuli, a noise and a harmonic tone, and asked them to report which side the tone was on. Concurrent with the auditory stimuli, we presented two small visual stimuli. In one condition, the azimuths of the two visual stimuli matched those of the two auditory stimuli; in the control condition, both visual stimuli were presented centrally. We saw an improvement in auditory discrimination in the matched-location condition, even though the visual stimuli provided no information about the auditory task.