Perceptual lateralization - audiovisual stimuli

Nigel Holt (Psychology, Bath Spa University)

Published in psycho.philica.com

Variance in spatial lateralization judgements of audiovisual stimuli with spatio-temporally corresponding components was less than for auditory or visual stimuli alone, suggesting that judgements were not based solely on the stimulus properties of one modal component. Lateralization of audiovisual stimuli with spatially non-corresponding components showed a visual bias (cf. Radeau and Bertelson, 1977), but standard deviations increased as a function of audiovisual spatial and temporal mismatch. The position of the auditory component was also shown to influence lateralization judgements of spatio-temporally non-corresponding audiovisual stimuli. The results are discussed in terms of factors influencing the perception of these simple stimulus ensembles as a single audiovisual event — the unity assumption — rather than as separate auditory and visual events.
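The central finding — that bimodal judgements vary less than unimodal ones — is what standard maximum-likelihood cue combination predicts: if each modality's estimate is weighted by its reliability (inverse variance), the combined estimate has variance σ²AV = σ²A·σ²V / (σ²A + σ²V), which is smaller than either unimodal variance, and the higher weight on the more reliable (typically visual) cue also produces a visual bias. The sketch below illustrates this model with simulated data; the noise values (σA = 8, σV = 3) are illustrative assumptions, not figures from this study.

```python
import random
import statistics

def simulate_cue_combination(sigma_a=8.0, sigma_v=3.0, true_pos=0.0,
                             n=20000, seed=1):
    """Simulate unimodal and reliability-weighted bimodal position estimates."""
    rng = random.Random(seed)
    # Reliability (inverse-variance) weights for the combined estimate.
    wa = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
    wv = 1 - wa
    # Noisy unimodal estimates of the same event position.
    auditory = [rng.gauss(true_pos, sigma_a) for _ in range(n)]
    visual = [rng.gauss(true_pos, sigma_v) for _ in range(n)]
    # Optimal bimodal estimate: weighted average of the two cues.
    audiovisual = [wa * a + wv * v for a, v in zip(auditory, visual)]
    return (statistics.stdev(auditory),
            statistics.stdev(visual),
            statistics.stdev(audiovisual))

sd_a, sd_v, sd_av = simulate_cue_combination()
# The bimodal SD falls below both unimodal SDs; with sigma_a=8, sigma_v=3
# the model predicts sqrt(64*9/73) ~= 2.81.
print(sd_a, sd_v, sd_av)
```

Because the visual cue is the more reliable one here, its weight dominates the combined estimate, which is the same mechanism usually invoked for the visual bias (ventriloquism) reported above.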

Observation circumstances
This observation reports a pilot study conducted in preparation for a series of experiments; the full publication is being prepared for Philica.

Information about this Observation
Peer-review ratings as of 19:41:17 on 18th Jan 2018 (from 1 review, where a score of 100 is average):
Originality = 156.25, importance = 156.25, overall quality = 156.25

Published on Sunday 14th May, 2006 at 07:18:27.

Creative Commons License
This work is licensed under a Creative Commons Attribution 2.5 License.
The full citation for this Observation is:
Holt, N. (2006). Perceptual lateralization - audiovisual stimuli. PHILICA.COM Observation number 9.

Peer review added 2nd November, 2006 at 20:08:21

It is important to specify the kinds of information available that permit the holistic and unified perception of an audiovisual event. For example, a hammer banging, a ball bouncing, a person walking, or hands clapping each offers several types of information (tempo, rhythm, synchrony) that allow perceivers to detect a unified auditory and visual event. The degree of spatial mismatch and the amount of temporal mismatch (for non-corresponding events), as well as temporal invariants such as tempo, rhythm, synchrony, and intensity, must all be carefully operationalized and manipulated to see how they influence visual or auditory bias in spatial lateralization. It would be of interest to see the effects of spatial mismatch on perception of face-voice coherence, since temporal mismatch alone is so salient there and social stimuli may be processed differently from non-social objects and events. David J. Lewkowicz has published work on infants' perception of audiovisual compounds and the early emergence of multisensory perception and auditory-visual matching in the absence of spatial cues.

Website copyright © 2006-07 Philica; authors retain the rights to their work under this Creative Commons License and reviews are copyleft under the GNU free documentation license.