Read by QxMD

Brain and Language

Kali Woodruff Carr, Ahren B Fitzroy, Adam Tierney, Travis White-Schwoch, Nina Kraus
Speech communication involves integration and coordination of sensory perception and motor production, requiring precise temporal coupling. Beat synchronization, the coordination of movement with a pacing sound, can be used as an index of this sensorimotor timing. We assessed adolescents' synchronization and capacity to correct asynchronies when given online visual feedback. Variability of synchronization while receiving feedback predicted phonological memory and reading sub-skills, as well as maturation of cortical auditory processing; less variable synchronization during the presence of feedback tracked with maturation of cortical processing of sound onsets and resting gamma activity...
October 1, 2016: Brain and Language
Mackenzie E Fama, William Hayward, Sarah F Snider, Rhonda B Friedman, Peter E Turkeltaub
Many individuals with aphasia describe anomia with comments like "I know it but I can't say it." The exact meaning of such phrases is unclear. We hypothesize that at least two discrete experiences exist: the sense of (1) knowing a concept, but failing to find the right word, and (2) saying the correct word internally but not aloud (successful inner speech, sIS). We propose that sIS reflects successful lexical access; subsequent overt anomia indicates post-lexical output deficits. In this pilot study, we probed the subjective experience of anomia in 37 persons with aphasia...
September 29, 2016: Brain and Language
Erika Skoe, Lisa Brody, Rachel M Theodore
Research with developmental populations suggests that the maturational state of auditory brainstem encoding is linked to reading ability. Specifically, children with poor reading skills resemble biologically younger children with respect to their auditory brainstem responses (ABRs) to speech stimulation. Because ABR development continues into adolescence, it is possible that the link between ABRs and reading ability changes or resolves as the brainstem matures. To examine these possibilities, ABRs were recorded at varying presentation rates in adults with diverse, yet unimpaired reading levels...
September 29, 2016: Brain and Language
Paul F Sowman, Margaret Ryan, Blake W Johnson, Greg Savage, Stephen Crain, Elisabeth Harrison, Erin Martin, Hana Burianová
The cause of stuttering has many theoretical explanations. A number of research groups have suggested changes in the volume and/or function of the striatum as a causal agent. Two recent studies in children and one in adults who stutter (AWS) report differences in striatal volume compared with that seen in controls; however, the laterality and nature of this anatomical volume difference are not consistent across studies. The current study investigated whether a reduction in striatal grey matter volume, comparable to that seen in children who stutter (CWS), would be found in AWS...
September 28, 2016: Brain and Language
Julie Conder, Julius Fridriksson, Gordon C Baylis, Cameron M Smith, Timothy W Boiteau, Amit Almor
It is commonly held that language is largely lateralized to the left hemisphere in most individuals, whereas spatial processing is associated with right hemisphere regions. In recent years, a number of neuroimaging studies have yielded conflicting results regarding the role of language and spatial processing areas in processing language about space (e.g., Carpenter, Just, Keller, Eddy, & Thulborn, 1999; Damasio et al., 2001). In the present study, we used sparse scanning event-related functional magnetic resonance imaging (fMRI) to investigate the neural correlates of spatial language, that is, language used to communicate the spatial relationship of one object to another...
September 27, 2016: Brain and Language
Kurt Steinmetzger, Stuart Rosen
Magneto- and electroencephalographic (M/EEG) signals in response to acoustically degraded speech have been examined by several recent studies. Unambiguously interpreting the results is complicated by the fact that speech signal manipulations affect acoustics and intelligibility alike. In the current EEG study, the acoustic properties of the stimuli were altered and the trials were sorted according to the correctness of the listeners' spoken responses to separate out these two factors. Firstly, more periodicity (i...
September 27, 2016: Brain and Language
Chiara Pastori, Stefano Francione, Federica Pelle, Marco de Curtis, Vadym Gnatkovsky
A quantitative method was developed to map cortical areas responsive to cognitive tasks during intracerebral stereo-EEG recording sessions in drug-resistant patients who are candidates for epilepsy surgery. Frequency power changes were evaluated with a computer-assisted analysis in 7 patients during phonemic fluency tasks. All patients were right-handed and were explored with depth electrodes in the dominant frontal lobe. We demonstrate that fluency tasks enhance beta-gamma frequencies and reduce background activities in language network regions of the dominant hemisphere...
September 26, 2016: Brain and Language
Mathias Scharinger, Ulrike Domahs, Elise Klein, Frank Domahs
Research in auditory neuroscience illustrated the importance of superior temporal sulcus (STS) for speech sound processing. However, evidence for abstract processing beyond the level of phonetics in STS has remained elusive. In this study, we follow an underspecification approach according to which the phonological representation of vowels is based on the presence vs. absence of abstract features. We hypothesized that phonological mismatch in a same/different task is governed by underspecification: A less specified vowel in second position of same/different minimal pairs (e...
September 23, 2016: Brain and Language
Alice Foucart, Carlos Romero-Rivas, Bernharda Lottie Gort, Albert Costa
Using ERPs, we tested whether L2 speakers can integrate multiple sources of information (e.g., semantic, pragmatic information) during discourse comprehension. We presented native speakers and L2 speakers with three-sentence scenarios in which the final sentence was highly causally related, intermediately related, or causally unrelated to its context; its interpretation therefore required simple or complex inferences. Native speakers revealed a gradual N400-like effect, larger in the causally unrelated condition than in the highly related condition, and falling in-between in the intermediately related condition, replicating previous results...
September 21, 2016: Brain and Language
Ferdy Hubers, Tineke M Snijders, Helen de Hoop
Native speakers of Dutch do not always adhere to prescriptive grammar rules in their daily speech. These grammatical norm violations can elicit emotional reactions in language purists, mostly highly educated people, who claim that for them these constructions are truly ungrammatical. However, linguists generally assume that grammatical norm violations are in fact truly grammatical, especially when they occur frequently in a language. In an fMRI study we investigated the processing of grammatical norm violations in the brains of language purists, and compared them with truly grammatical and truly ungrammatical sentences...
September 14, 2016: Brain and Language
Veronika Rutar Gorišek, Vlasta Zupanc Isoski, Aleš Belič, Christina Manouilidou, Blaž Koritnik, Jure Bon, Nuška Pečarič Meglič, Matej Vrabec, Janez Žibert, Grega Repovš, Janez Zidar
Broca's region and adjacent cortex presumably take part in working memory (WM) processes. Electrophysiologically, these processes are reflected in synchronized oscillations. We present the first study exploring the effects of a stroke causing Broca's aphasia on these processes and specifically on synchronized functional WM networks. We used high-density EEG and coherence analysis to map WM networks in ten Broca's patients and ten healthy controls during a verbal WM task. Our results demonstrate that a stroke resulting in Broca's aphasia also alters two distinct WM networks...
September 12, 2016: Brain and Language
Robert Harris, Klaus L Leenders, Bauke M de Jong
Parkinson's disease is characterized not only by bradykinesia, rigidity, and tremor, but also by impairments of expressive and receptive linguistic prosody. The facilitating effect of music with a salient beat on patients' gait suggests that it might have a similar effect on vocal behavior; however, it is currently unknown whether singing is affected by the disease. In the present study, fifteen Parkinson patients were compared with fifteen healthy controls during the singing of familiar melodies and improvised melodic continuations...
September 9, 2016: Brain and Language
Sari Ylinen, Milla Huuskonen, Katri Mikkola, Emma Saure, Tara Sinkkonen, Petri Paavilainen
The brain is constantly generating predictions of future sensory input to enable efficient adaptation. In the auditory domain, this applies also to the processing of speech. Here we aimed to determine whether the brain predicts the following segments of speech input on the basis of language-specific phonological rules that concern non-adjacent phonemes. Auditory event-related potentials (ERPs) were recorded in a mismatch negativity (MMN) paradigm, where the Finnish vowel harmony, determined by the first syllables of pseudowords, either constrained or did not constrain the phonological composition of pseudoword endings...
August 30, 2016: Brain and Language
Pascale Tremblay, Anthony Steven Dick
With the advancement of cognitive neuroscience and neuropsychological research, the field of language neurobiology is at a crossroads with respect to its framing theories. The central thesis of this article is that the major historical framing model, the Classic "Wernicke-Lichtheim-Geschwind" model, and associated terminology, is no longer adequate for contemporary investigations into the neurobiology of language. We argue that the Classic model (1) is based on an outdated brain anatomy; (2) does not adequately represent the distributed connectivity relevant for language; (3) offers a modular and "language centric" perspective; and (4) focuses on cortical structures, for the most part leaving out subcortical regions and relevant connections...
August 29, 2016: Brain and Language
Nawal Abboub, Thierry Nazzi, Judit Gervain
Experience with spoken language starts prenatally, as hearing becomes operational during the second half of gestation. While maternal tissues filter out many aspects of speech, they readily transmit speech prosody and rhythm. These properties of the speech signal then play a central role in early language acquisition. In this study, we ask how the newborn brain uses variation in duration, pitch and intensity (the three acoustic cues that carry prosodic information in speech) to group sounds. In four near-infrared spectroscopy (NIRS) studies, we demonstrate that perceptual biases governing how sound sequences are perceived and organized are present in newborns from monolingual and bilingual language backgrounds...
August 24, 2016: Brain and Language
A R Weighall, L M Henderson, D J Barr, S A Cairney, M G Gaskell
Lexical competition is a hallmark of proficient, automatic word recognition. Previous research suggests that there is a delay before a new spoken word becomes engaged in this process, with sleep playing an important role. However, data from one method - the visual world paradigm - consistently show competition without a delay. We trained 42 adults and 40 children (aged 7-8) on novel word-object pairings, and employed this paradigm to measure the time-course of lexical competition. Fixations to novel objects upon hearing existing words (e...
August 22, 2016: Brain and Language
Jona Sassenhagen, Phillip M Alday
Experimental research on behavior and cognition frequently rests on stimulus or subject selection where not all characteristics can be fully controlled, even when attempting strict matching. For example, when contrasting patients to controls, variables such as intelligence or socioeconomic status are often correlated with patient status. Similarly, when presenting word stimuli, variables such as word frequency are often correlated with primary variables of interest. One procedure very commonly employed to control for such nuisance effects is conducting inferential tests on confounding stimulus or subject characteristics...
August 17, 2016: Brain and Language
Felix Gervits, Sharon Ash, H Branch Coslett, Katya Rascovsky, Murray Grossman, Roy Hamilton
Primary progressive aphasia (PPA) is a neurodegenerative condition characterized by gradual deterioration of language function. We investigated whether two weeks of daily transcranial direct current stimulation (tDCS) treatment would improve language abilities in six people with a non-fluent form of PPA. tDCS was applied in an unblinded trial at an intensity of 1.5 mA for 20 min/day over 10 days. At the time of stimulation, patients were engaged in narrating one of several children's wordless picture stories. A battery of neuropsychological assessments was administered four times: at baseline, immediately following the 2-week stimulation period, and then 6 weeks and 12 weeks following the end of stimulation...
August 11, 2016: Brain and Language
P J López-Peréz, J Dampuré, J A Hernández-Cabrera, H A Barber
During reading, parafoveal information can affect the processing of the word currently fixated (parafovea-on-fovea effect), and words perceived parafoveally can facilitate their subsequent processing when they are fixated (preview effect). We investigated parafoveal processing by simultaneously recording eye movements and EEG measures. Participants read word pairs that could be semantically associated or not. Additionally, the boundary paradigm allowed us to carry out the same manipulation on parafoveal previews that were displayed until the reader's gaze moved to the target words...
August 8, 2016: Brain and Language
Adolfo M García, Facundo Carrillo, Juan Rafael Orozco-Arroyave, Natalia Trujillo, Jesús F Vargas Bonilla, Sol Fittipaldi, Federico Adolfi, Elmar Nöth, Mariano Sigman, Diego Fernández Slezak, Agustín Ibáñez, Guillermo A Cecchi
To assess the impact of Parkinson's disease (PD) on spontaneous discourse, we conducted computerized analyses of brief monologues produced by 51 patients and 50 controls. We explored differences in semantic fields (via latent semantic analysis), grammatical choices (using part-of-speech tagging), and word-level repetitions (with graph embedding tools). Although overall output was quantitatively similar between groups, patients relied less heavily on action-related concepts and used more subordinate structures...
August 5, 2016: Brain and Language

Search Tips

Use Boolean operators: AND/OR

diabetic AND foot
diabetes OR diabetic

Exclude a word using the 'minus' sign

Virchow -triad

Use Parentheses

water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems

Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase

"primary prevention of cancer"
(heart or cardiac or cardio*) AND arrest -"American Heart Association"
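
The rules above compose mechanically, so queries like the last example can be assembled programmatically. Below is a minimal Python sketch of that composition; the helper names (`group`, `exclude`, `phrase`, `stem`) are illustrative only and are not part of any QxMD API:

```python
def group(*terms, op="OR"):
    """Join terms with a Boolean operator and wrap them in parentheses."""
    return "(" + f" {op} ".join(terms) + ")"

def exclude(term):
    """Prefix a term with the minus sign to exclude it from results."""
    return "-" + term

def phrase(words):
    """Quote a multi-word phrase for exact matching."""
    return f'"{words}"'

def stem(prefix):
    """Append an asterisk so the search matches word stems."""
    return prefix + "*"

# Rebuild the compound example from the tips above:
query = " ".join([
    group("heart", "cardiac", stem("cardio")),
    "AND",
    "arrest",
    exclude(phrase("American Heart Association")),
])
print(query)  # (heart OR cardiac OR cardio*) AND arrest -"American Heart Association"
```

Uppercase operators are used here for consistency with the AND/OR tip; whether the search engine also accepts lowercase operators is not specified above.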