Audiovisual speech perception

https://www.readbyqxmd.com/read/30541680/audiovisual-speech-perception-and-language-acquisition-in-preterm-infants-a-longitudinal-study
#1
Masahiro Imafuku, Masahiko Kawai, Fusako Niwa, Yuta Shinya, Masako Myowa
BACKGROUND: Preterm infants have a higher risk of language delay throughout childhood. The ability to integrate audiovisual speech information is associated with language acquisition in term infants; however, the relation is still unclear in preterm infants. AIM AND METHODS: This study longitudinally investigated visual preference for audiovisual congruent and incongruent speech during a preferential looking task using eye-tracking in preterm and term infants at 6, 12, and 18 months of corrected age...
December 8, 2018: Early Human Development
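The preference measure described in #1 reduces to a looking-time proportion. Below is a minimal Python sketch of that computation on made-up per-trial looking times; the function name and data are illustrative, not from the study.

```python
# Hypothetical sketch (not the authors' code): a visual preference score
# from preferential-looking data, assuming per-trial looking times (in
# seconds) to congruent vs. incongruent audiovisual speech displays.
import numpy as np

def preference_score(congruent_s, incongruent_s):
    """Proportion of total looking time spent on the congruent display.

    0.5 = no preference; > 0.5 = preference for congruent AV speech.
    """
    congruent_s = np.asarray(congruent_s, dtype=float)
    incongruent_s = np.asarray(incongruent_s, dtype=float)
    return float(congruent_s.sum() / (congruent_s.sum() + incongruent_s.sum()))

# Example: one infant's looking times across five trials.
print(preference_score([3.2, 4.1, 2.8, 5.0, 3.6],
                       [2.9, 3.0, 3.1, 2.2, 2.4]))  # ~0.58
```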
https://www.readbyqxmd.com/read/30458524/spontaneous-otoacoustic-emissions-reveal-an-efficient-auditory-efferent-network
#2
Viorica Marian, Tuan Q Lam, Sayuri Hayakawa, Sumitrajit Dhar
Purpose: Understanding speech often involves processing input from multiple modalities. The availability of visual information may make auditory input less critical for comprehension. This study examines whether the auditory system is sensitive to the presence of complementary sources of input when exerting top-down control over the amplification of speech stimuli. Method: Auditory gain in the cochlea was assessed by monitoring spontaneous otoacoustic emissions (SOAEs), which are by-products of the amplification process...
November 8, 2018: Journal of Speech, Language, and Hearing Research: JSLHR
https://www.readbyqxmd.com/read/30418995/what-accounts-for-individual-differences-in-susceptibility-to-the-mcgurk-effect
#3
Violet A Brown, Maryam Hedayati, Annie Zanger, Sasha Mayn, Lucia Ray, Naseem Dillman-Hasso, Julia F Strand
The McGurk effect is a classic audiovisual speech illusion in which discrepant auditory and visual syllables can lead to a fused percept (e.g., an auditory /bɑ/ paired with a visual /gɑ/ often leads to the perception of /dɑ/). The McGurk effect is robust and easily replicated in pooled group data, but there is tremendous variability in the extent to which individual participants are susceptible to it. In some studies, the rate at which individuals report fusion responses ranges from 0% to 100%. Despite its widespread use in the audiovisual speech perception literature, the roots of the wide variability in McGurk susceptibility are largely unknown...
2018: PloS One
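Susceptibility in studies like #3 is typically quantified as the percentage of incongruent trials on which a participant reports the fused percept, which is how individual rates can span 0% to 100%. A minimal sketch of that tally, with invented trial data (not the paper's):

```python
# Hypothetical sketch: per-participant McGurk susceptibility as the
# percentage of incongruent trials (auditory /ba/ + visual /ga/) on which
# the fused percept /da/ is reported. Trial data below are made up.
from collections import defaultdict

trials = [  # (participant_id, reported_percept)
    ("p1", "da"), ("p1", "da"), ("p1", "ba"), ("p1", "da"),
    ("p2", "ba"), ("p2", "ba"), ("p2", "ba"), ("p2", "da"),
]

counts = defaultdict(lambda: [0, 0])  # participant -> [fusions, trials]
for pid, percept in trials:
    counts[pid][1] += 1
    if percept == "da":  # fused response
        counts[pid][0] += 1

for pid, (fused, n) in sorted(counts.items()):
    print(f"{pid}: {100.0 * fused / n:.0f}% fusion")  # p1: 75%, p2: 25%
```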
https://www.readbyqxmd.com/read/30404498/leveraging-audiovisual-speech-perception-to-measure-anticipatory-coarticulation
#4
Melissa A Redford, Jeffrey E Kallay, Sergei V Bogdanov, Eric Vatikiotis-Bateson
A noninvasive method for accurately measuring anticipatory coarticulation at experimentally defined temporal locations is introduced. The method leverages work in audiovisual (AV) speech perception to provide a synthetic and robust measure that can be used to inform psycholinguistic theory. In this validation study, speakers were audio-video recorded while producing simple subject-verb-object sentences with contrasting object noun rhymes. Coarticulatory resistance of target noun onsets was manipulated as was metrical context for the determiner that modified the noun...
October 2018: Journal of the Acoustical Society of America
https://www.readbyqxmd.com/read/30391756/modality-independent-recruitment-of-inferior-frontal-cortex-during-speech-processing-in-human-infants
#5
Nicole Altvater-Mackensen, Tobias Grossmann
Despite increasing interest in the development of audiovisual speech perception in infancy, the underlying mechanisms and neural processes are still only poorly understood. In addition to regions in temporal cortex associated with speech processing and multimodal integration, such as superior temporal sulcus, left inferior frontal cortex (IFC) has been suggested to be critically involved in mapping information from different modalities during speech perception. To further illuminate the role of IFC during infant language learning and speech perception, the current study examined the processing of auditory, visual and audiovisual speech in 6-month-old infants using functional near-infrared spectroscopy (fNIRS)...
November 2018: Developmental Cognitive Neuroscience
https://www.readbyqxmd.com/read/30303762/cross-modal-phonetic-encoding-facilitates-the-mcgurk-illusion-and-phonemic-restoration
#6
Noelle M Abbott, Antoine J Shahin
In spoken language, audiovisual (AV) perception occurs when the visual modality influences encoding of acoustic features (e.g., phonetic representations) at the auditory cortex. We examined how visual speech (lip movements) transforms phonetic representations, indexed by changes to the N1 auditory evoked potential (AEP). EEG was acquired while human subjects watched and listened to videos of a speaker uttering consonant-vowel (CV) syllables, /ba/ and /wa/, presented in auditory-only, AV congruent or incongruent contexts, or in a context in which the consonants were replaced by white noise (noise-replaced)...
October 10, 2018: Journal of Neurophysiology
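The N1 measure in #6 comes from averaging stimulus-locked EEG epochs. A toy Python sketch of that averaging step on synthetic single-trial data (the sampling rate, latency, and amplitudes are assumptions for illustration, not the paper's recording parameters):

```python
# Hypothetical sketch (not the paper's pipeline): estimating an auditory
# evoked potential such as the N1 by averaging stimulus-locked EEG epochs.
import numpy as np

rng = np.random.default_rng(2)
fs = 500                                  # assumed sampling rate, Hz
t = np.arange(-0.1, 0.4, 1 / fs)          # epoch from -100 to 400 ms

# Simulate 100 trials: a negative deflection peaking ~100 ms plus noise.
n1 = -2.0 * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))   # microvolts
trials = n1 + 3.0 * rng.standard_normal((100, t.size))

erp = trials.mean(axis=0)                  # averaging suppresses noise
peak = t[np.argmin(erp)]
print(f"N1 peak latency ~{1000 * peak:.0f} ms, amplitude {erp.min():.1f} uV")
```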
https://www.readbyqxmd.com/read/30262826/multisensory-perception-reflects-individual-differences-in-processing-temporal-correlations
#7
Aaron R Nidiffer, Adele Diederich, Ramnarayan Ramachandran, Mark T Wallace
Sensory signals originating from a single event, such as audiovisual speech, are temporally correlated. Correlated signals are known to facilitate multisensory integration and binding. We sought to further elucidate the nature of this relationship, hypothesizing that multisensory perception will vary with the strength of audiovisual correlation. Human participants detected near-threshold amplitude modulations in auditory and/or visual stimuli. During audiovisual trials, the frequency and phase of auditory modulations were varied, producing signals with a range of correlations...
September 27, 2018: Scientific Reports
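In #7, the audiovisual correlation is controlled by the frequency and phase of sinusoidal amplitude modulations. A minimal sketch of how such modulation envelopes could be generated and their Pearson correlation computed (all parameters are illustrative, not the paper's):

```python
# Hypothetical sketch of the stimulus logic: two amplitude-modulation
# envelopes whose correlation depends on the frequency and phase offset
# of the auditory modulation relative to the visual one.
import numpy as np

fs = 1000                      # samples per second (assumed)
t = np.arange(0, 2.0, 1 / fs)  # 2-s trial

def am_envelope(f_mod, phase, depth=0.1):
    """Sinusoidal amplitude-modulation envelope (near-threshold depth)."""
    return 1.0 + depth * np.sin(2 * np.pi * f_mod * t + phase)

visual = am_envelope(f_mod=2.0, phase=0.0)
for f, ph in [(2.0, 0.0), (2.0, np.pi / 2), (2.0, np.pi), (4.0, 0.0)]:
    auditory = am_envelope(f_mod=f, phase=ph)
    r = np.corrcoef(auditory, visual)[0, 1]
    print(f"f={f} Hz, phase={ph:.2f} rad -> r={r:+.2f}")
# Same-frequency, in-phase modulations give r = +1; antiphase gives r = -1;
# a quarter-cycle offset or a different frequency gives r near 0.
```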
https://www.readbyqxmd.com/read/30249803/the-motor-network-reduces-multisensory-illusory-perception
#8
Takenobu Murakami, Mitsunari Abe, Winnugroho Wiratman, Juri Fujiwara, Masahiro Okamoto, Tomomi Mizuochi-Endo, Toshiki Iwabuchi, Michiru Makuuchi, Akira Yamashita, Amanda Tiksnadi, Fang-Yu Chang, Hitoshi Kubo, Nozomu Matsuda, Shunsuke Kobayashi, Satoshi Eifuku, Yoshikazu Ugawa
Observing mouth movements has striking effects on the perception of speech. Any mismatch between sound and mouth movements will result in listeners perceiving illusory consonants (McGurk effect), whereas matching mouth movements assist with the correct recognition of speech sounds. Recent neuroimaging studies have yielded evidence that the motor areas are involved in speech processing, yet their contributions to multisensory illusion remain unclear. Using functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS) in an event-related design, we aimed to identify the functional roles of the motor network in the occurrence of multisensory illusion in female and male brains...
September 24, 2018: Journal of Neuroscience: the Official Journal of the Society for Neuroscience
https://www.readbyqxmd.com/read/30131578/single-trial-plasticity-in-evidence-accumulation-underlies-rapid-recalibration-to-asynchronous-audiovisual-speech
#9
David M Simon, Aaron R Nidiffer, Mark T Wallace
Asynchronous arrival of audiovisual information at the peripheral sensory organs is a ubiquitous property of signals in the natural environment due to differences in the propagation time of light and sound. As these cues are constantly changing their distance from the observer, rapid adaptation to asynchronies is crucial for their appropriate integration. We investigated the neural basis of rapid recalibration to asynchronous audiovisual speech in humans using a combination of psychophysics, drift diffusion modeling, and electroencephalography (EEG)...
August 21, 2018: Scientific Reports
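Entry #9 fits behavior with a drift diffusion model, in which noisy evidence accumulates toward one of two decision bounds; trial-by-trial changes to the model's parameters can mimic rapid recalibration. A toy simulation of one such trial (the parameter values and the mapping of bounds to "synchronous"/"asynchronous" responses are assumptions for illustration, not the paper's fitted model):

```python
# Minimal drift-diffusion sketch: evidence accumulates until it crosses
# an upper or lower bound; drift rate and starting point are the knobs
# that single-trial plasticity could adjust.
import numpy as np

rng = np.random.default_rng(0)

def ddm_trial(drift, bound=1.0, start=0.0, noise=1.0, dt=0.001):
    """Return (choice, reaction time in s) for one diffusion trial."""
    x, t = start, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("synchronous" if x > 0 else "asynchronous", t)

# Example: a positive drift biases choices toward "synchronous".
choices = [ddm_trial(drift=0.8)[0] for _ in range(200)]
print(choices.count("synchronous") / len(choices))  # roughly 0.8
```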
https://www.readbyqxmd.com/read/30043353/brief-report-differences-in-multisensory-integration-covary-with-sensory-responsiveness-in-children-with-and-without-autism-spectrum-disorder
#10
Jacob I Feldman, Wayne Kuang, Julie G Conrad, Alexander Tu, Pooja Santapuram, David M Simon, Jennifer H Foss-Feig, Leslie D Kwakye, Ryan A Stevenson, Mark T Wallace, Tiffany G Woynaroski
Research shows that children with autism spectrum disorder (ASD) differ in their behavioral patterns of responding to sensory stimuli (i.e., sensory responsiveness) and in various other aspects of sensory functioning relative to typical peers. This study explored relations between measures of sensory responsiveness and multisensory speech perception and integration in children with and without ASD. Participants were 8- to 17-year-old children, 18 with ASD and 18 matched typically developing controls. Participants completed a psychophysical speech perception task, and parents reported on children's sensory responsiveness...
July 24, 2018: Journal of Autism and Developmental Disorders
https://www.readbyqxmd.com/read/29990508/adult-dyslexic-readers-benefit-less-from-visual-input-during-audiovisual-speech-processing-fmri-evidence
#11
Ana A Francisco, Atsuko Takashima, James M McQueen, Mark van den Bunt, Alexandra Jesse, Margriet A Groen
The aim of the present fMRI study was to investigate whether typical and dyslexic adult readers differed in the neural correlates of audiovisual speech processing. We tested for Blood Oxygen-Level Dependent (BOLD) activity differences between these two groups in a 1-back task, as they processed written (word, illegal consonant strings) and spoken (auditory, visual and audiovisual) stimuli. When processing written stimuli, dyslexic readers showed reduced activity in the supramarginal gyrus, a region suggested to play an important role in phonological processing, but only when they processed strings of consonants, not when they read words...
August 2018: Neuropsychologia
https://www.readbyqxmd.com/read/29888819/electrocorticography-reveals-continuous-auditory-and-visual-speech-tracking-in-temporal-and-occipital-cortex
#12
Cristiano Micheli, Inga M Schepers, Müge Ozker, Daniel Yoshor, Michael S Beauchamp, Jochem W Rieger
During natural speech perception, humans must parse temporally continuous auditory and visual speech signals into sequences of words. However, most studies of speech perception present only single words or syllables. We used electrocorticography (subdural electrodes implanted on the brains of epileptic patients) to investigate the neural mechanisms for processing continuous audiovisual speech signals consisting of individual sentences. Using partial correlation analysis, we found that posterior superior temporal gyrus (pSTG) and medial occipital cortex tracked both the auditory and the visual speech envelopes...
June 11, 2018: European Journal of Neuroscience
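The partial correlation analysis in #12 asks how well a neural signal tracks the auditory speech envelope once the (correlated) visual envelope is accounted for, and vice versa. A self-contained sketch of that logic on synthetic stand-in signals (the linear regressions and data below are illustrative, not the paper's ECoG analysis):

```python
# Hypothetical sketch: correlate a neural response with the auditory
# envelope after regressing out the visual envelope, and vice versa.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
visual = rng.standard_normal(n)                    # stand-in visual envelope
auditory = 0.6 * visual + rng.standard_normal(n)   # correlated with visual
neural = 0.8 * auditory + 0.5 * rng.standard_normal(n)

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

print(partial_corr(neural, auditory, visual))  # strong: auditory tracking
print(partial_corr(neural, visual, auditory))  # near zero in this toy case
```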
https://www.readbyqxmd.com/read/29867686/visual-speech-perception-cues-constrain-patterns-of-articulatory-variation-and-sound-change
#13
Jonathan Havenhill, Youngah Do
What are the factors that contribute to (or inhibit) diachronic sound change? While acoustically motivated sound changes are well-documented, research on the articulatory and audiovisual-perceptual aspects of sound change is limited. This paper investigates the interaction of articulatory variation and audiovisual speech perception in the Northern Cities Vowel Shift (NCVS), a pattern of sound change observed in the Great Lakes region of the United States. We focus specifically on the maintenance of the contrast between the vowels /ɑ/ and /ɔ/, both of which are fronted as a result of the NCVS...
2018: Frontiers in Psychology
https://www.readbyqxmd.com/read/29867415/audiovisual-temporal-perception-in-aging-the-role-of-multisensory-integration-and-age-related-sensory-loss
#14
REVIEW
Cassandra J Brooks, Yu Man Chan, Andrew J Anderson, Allison M McKendrick
Within each sensory modality, age-related deficits in temporal perception contribute to the difficulties older adults experience when performing everyday tasks. Since perceptual experience is inherently multisensory, older adults also face the added challenge of appropriately integrating or segregating the auditory and visual cues present in our dynamic environment into coherent representations of distinct objects. As such, many studies have investigated how older adults perform when integrating temporal information across audition and vision...
2018: Frontiers in Human Neuroscience
https://www.readbyqxmd.com/read/29862161/shifts-in-audiovisual-processing-in-healthy-aging
#15
Sarah H Baum, Ryan Stevenson
Purpose of Review: The integration of information across sensory modalities into unified percepts is a fundamental sensory process upon which a multitude of cognitive processes are based. We review the body of literature exploring aging-related changes in audiovisual integration published over the last five years. Specifically, we review the impact of changes in temporal processing, the influence of the effectiveness of sensory inputs, the role of working memory, and the newer studies of intra-individual variability during these processes...
September 2017: Current Behavioral Neuroscience Reports
https://www.readbyqxmd.com/read/29780312/combining-behavioral-and-erp-methodologies-to-investigate-the-differences-between-mcgurk-effects-demonstrated-by-cantonese-and-mandarin-speakers
#16
Juan Zhang, Yaxuan Meng, Catherine McBride, Xitao Fan, Zhen Yuan
The present study investigated the impact of Chinese dialects on the McGurk effect using behavioral and event-related potential (ERP) methodologies. Specifically, an intra-language comparison of the McGurk effect was conducted between Mandarin and Cantonese speakers. The behavioral results showed that Cantonese speakers exhibited a stronger McGurk effect in audiovisual speech perception compared to Mandarin speakers, although both groups performed equally in the auditory and visual conditions. ERP results revealed that Cantonese speakers were more sensitive to visual cues than Mandarin speakers, though this was not the case for the auditory cues...
2018: Frontiers in Human Neuroscience
https://www.readbyqxmd.com/read/29758189/audiovisual-perception-in-amblyopia-a-review-and-synthesis
#17
REVIEW
Michael D Richards, Herbert C Goltz, Agnes M F Wong
Amblyopia is a common developmental sensory disorder that has been extensively and systematically investigated as a unisensory visual impairment. However, its effects are increasingly recognized to extend beyond vision to the multisensory domain. Indeed, amblyopia is associated with altered cross-modal interactions in audiovisual temporal perception, audiovisual spatial perception, and audiovisual speech perception. Furthermore, although the visual impairment in amblyopia is typically unilateral, the multisensory abnormalities tend to persist even when viewing with both eyes...
May 17, 2018: Experimental Eye Research
https://www.readbyqxmd.com/read/29751619/language-experience-changes-audiovisual-perception
#18
Viorica Marian, Sayuri Hayakawa, Tuan Q Lam, Scott R Schroeder
Can experience change perception? Here, we examine whether language experience shapes the way individuals process auditory and visual information. We used the McGurk effect: the discovery that when people hear a speech sound (e.g., "ba") and see a conflicting lip movement (e.g., "ga"), they recognize it as a completely new sound (e.g., "da"). This finding suggests that the brain fuses input across auditory and visual modalities, and demonstrates that what we hear is profoundly influenced by what we see...
May 11, 2018: Brain Sciences
https://www.readbyqxmd.com/read/29740294/converging-evidence-from-electrocorticography-and-bold-fmri-for-a-sharp-functional-boundary-in-superior-temporal-gyrus-related-to-multisensory-speech-processing
#19
Muge Ozker, Daniel Yoshor, Michael S Beauchamp
Although humans can understand speech using the auditory modality alone, in noisy environments visual speech information from the talker's mouth can rescue otherwise unintelligible auditory speech. To investigate the neural substrates of multisensory speech perception, we compared neural activity from the human superior temporal gyrus (STG) in two datasets. One dataset consisted of direct neural recordings (electrocorticography, ECoG) from surface electrodes implanted in epilepsy patients (this dataset has been previously published)...
2018: Frontiers in Human Neuroscience
https://www.readbyqxmd.com/read/29705718/neural-networks-supporting-audiovisual-integration-for-speech-a-large-scale-lesion-study
#20
Gregory Hickok, Corianne Rogalsky, William Matchin, Alexandra Basilakos, Julia Cai, Sara Pillay, Michelle Ferrill, Soren Mickelsen, Steven W Anderson, Tracy Love, Jeffrey Binder, Julius Fridriksson
Auditory and visual speech information are often strongly integrated, resulting in perceptual enhancements for audiovisual (AV) speech over audio alone and sometimes yielding compelling illusory fusion percepts when AV cues are mismatched (the McGurk-MacDonald effect). Previous research has identified three candidate regions thought to be critical for AV speech integration: the posterior superior temporal sulcus (STS), early auditory cortex, and the posterior inferior frontal gyrus. We assess the causal involvement of these regions (and others) in the first large-scale (N = 100) lesion-based study of AV speech integration...
June 2018: Cortex; a Journal Devoted to the Study of the Nervous System and Behavior

Search Tips

Use Boolean operators: AND/OR

diabetic AND foot
diabetes OR diabetic

Exclude a word using the 'minus' sign

Virchow -triad

Use parentheses

water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems

Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase

"primary prevention of cancer"
Combine operators

(heart OR cardiac OR cardio*) AND arrest -"American Heart Association"