Show simple item record

dc.contributor.advisorWeisenberger, Janet M.
dc.creatorHuffman, Crystal
dc.description.abstractCommunication between two people involves collecting and integrating information from different senses. An example in speech perception is when a listener relies on auditory input to hear spoken words and on visual input to read lips, making it easier to communicate in a noisy environment. Listeners are able to use visual cues to fill in missing auditory information when the auditory signal has been compromised in some way (e.g., by hearing loss or a noisy environment). Interestingly, listeners integrate auditory and visual information during the perception of speech even when one of those senses alone proves to be more than sufficient. Grant and Seitz (1998) found a great deal of variability in listeners' performance on auditory-visual speech perception tasks. These findings have posed a number of questions about why and how multi-sensory integration occurs. Research in "optimal integration" suggests that listener, talker, or acoustic characteristics may influence auditory-visual integration. The present study focused on characteristics of the auditory signal that might promote auditory-visual integration, specifically examining whether removal of information from the signal would produce greater use of the visual input and thus greater integration. CVC syllables from 5 talkers were degraded by selectively removing spectral fine structure while maintaining the temporal envelope characteristics of the waveform. The resulting stimuli were output through 2-, 4-, 6-, and 8-channel bandpass filters. Results for 10 normal-hearing listeners showed auditory-visual integration for all conditions, but the amount of integration did not vary across the different auditory signal manipulations. In addition, substantial across-talker differences were observed in auditory intelligibility in the 2-channel condition. Interestingly, the degree of audiovisual integration produced by different talkers was unrelated to auditory intelligibility. 
Implications of these results for our understanding of the processes underlying auditory-visual integration are discussed. Advisor: Janet M. Weisenbergeren
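The signal manipulation described in the abstract, removing spectral fine structure while preserving per-band temporal envelopes, is commonly implemented as a channel (noise) vocoder. The sketch below illustrates that general technique under stated assumptions; it is not the thesis's actual stimulus-generation code, and the filter order, band edges, and noise carrier are illustrative choices.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def vocode(signal, fs, n_channels, f_lo=100.0, f_hi=4000.0, seed=0):
    """Noise-vocode `signal`: keep each band's temporal envelope but
    replace its spectral fine structure with band-limited noise.

    Illustrative sketch only; band edges and filter order are assumptions,
    not parameters taken from the study.
    """
    rng = np.random.default_rng(seed)
    # Log-spaced band edges spanning the analysis range.
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)
    out = np.zeros(len(signal), dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)
        envelope = np.abs(hilbert(band))        # temporal envelope of this band
        # Band-limited noise carrier replaces the original fine structure.
        carrier = sosfiltfilt(sos, rng.standard_normal(len(signal)))
        out += envelope * carrier
    return out
```

With `n_channels` set to 2, 4, 6, or 8, this yields progressively finer spectral resolution while the within-band fine structure remains noise, mirroring the kind of degradation continuum the study describes.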
dc.description.sponsorshipArts and Sciences Collegiate Undergraduate Scholarshipen
dc.description.sponsorshipSocial and Behavioral Sciences Undergraduate Research Scholarshipen
dc.format.extent187580 bytes
dc.publisherThe Ohio State Universityen
dc.relation.ispartofseriesThe Ohio State University. Department of Speech and Hearing Science Honors Theses; 2007en
dc.subjectAudiovisual Integrationen
dc.subjectBandpass Filterbandsen
dc.subjectOptimal Integrationen
dc.titleThe role of auditory information in audiovisual speech integrationen

Items in Knowledge Bank are protected by copyright, with all rights reserved, unless otherwise indicated.
