Read by QxMD
Search: perceptual categorization

https://www.readbyqxmd.com/read/29764263/categorical-perception-of-lexical-tones-in-native-mandarin-speaking-listeners-with-sensorineural-hearing-loss
#1. Categorical perception of lexical tones in native Mandarin-speaking listeners with sensorineural hearing loss
Beier Qi, Peng Liu, Xin Gu, Ruijuan Dong, Bo Liu
BACKGROUND: Categorical perception (CP) of lexical tones has been examined in normal-hearing (NH) listeners, but it was unclear whether lexical tones can be perceived categorically by listeners with sensorineural hearing loss (SNHL). OBJECTIVES: To explore the characteristics of lexical tone perception in native Mandarin speakers with SNHL. MATERIALS AND METHODS: Three continua (Tone1/Tone2, Tone1/Tone4 and Tone2/Tone3) were constructed; each included 15 stimuli resynthesized by applying the pitch-synchronous overlap-and-add (PSOLA) method, implemented in Praat, to the same Mandarin syllable, /a/, with a high-level tone produced by a female speaker...
May 15, 2018: Acta Oto-laryngologica
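The continuum construction described in this abstract boils down to interpolating the F0 (pitch) contour between two tone endpoints in 15 equal steps, with the actual resynthesis done via PSOLA in Praat. A minimal sketch of the interpolation step, using made-up illustrative contour values rather than the study's actual stimuli:

```python
# Sketch: linear interpolation between two F0 contours to form a
# 15-step tonal continuum. The endpoint contours below are illustrative
# values, not the study's stimuli; actual resynthesis of audio would be
# done with PSOLA in Praat.

def make_continuum(f0_start, f0_end, n_steps=15):
    """Return n_steps F0 contours morphing f0_start into f0_end."""
    assert len(f0_start) == len(f0_end)
    continuum = []
    for i in range(n_steps):
        w = i / (n_steps - 1)  # interpolation weight: 0.0 at step 1, 1.0 at step 15
        contour = [(1 - w) * a + w * b for a, b in zip(f0_start, f0_end)]
        continuum.append(contour)
    return continuum

# Illustrative endpoints: Mandarin Tone 1 (high level) vs. Tone 2 (rising),
# each sampled at five time points (Hz).
tone1 = [220.0, 220.0, 220.0, 220.0, 220.0]
tone2 = [170.0, 165.0, 175.0, 195.0, 220.0]
steps = make_continuum(tone1, tone2)
```

The first and last steps reproduce the endpoint tones exactly, with the 13 intermediate steps spaced evenly between them.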
https://www.readbyqxmd.com/read/29752660/classification-errors-and-response-times-over-multiple-distributed-sessions-as-a-function-of-category-structure
#2. Classification errors and response times over multiple distributed sessions as a function of category structure
Derek E Zeigler, Ronaldo Vigo
Learning difficulty orderings for categorical stimuli have long provided an empirical foundation for concept learning and categorization research. The conventional approach seeks to determine learning difficulty orderings in terms of mean classification accuracy. However, it is relatively rare that the stability of such orderings is tested over a period of extended learning. Further, research rarely explores dependent variables beyond classification accuracy that may also indicate relative learning difficulty, such as classification response times (RTs)...
May 11, 2018: Memory & Cognition
https://www.readbyqxmd.com/read/29745711/rapid-visual-perception-of-interracial-crowds-racial-category-learning-from-emotional-segregation
#3. Rapid visual perception of interracial crowds: Racial category learning from emotional segregation
Sarah Ariel Lamer, Timothy D Sweeny, Michael Louis Dyer, Max Weisbuch
Drawing from research on social identity and ensemble coding, we theorize that crowd perception provides a powerful mechanism for social category learning. Crowds include allegiances that may be distinguished by visual cues to shared behavior and mental states, providing perceivers with direct information about social groups and thus a basis for learning social categories. Here, emotion expressions signaled group membership: to the extent that a crowd exhibited emotional segregation (i.e., was segregated into emotional subgroups), a visible characteristic (race) that incidentally distinguished emotional subgroups was expected to support categorical distinctions...
May 2018: Journal of Experimental Psychology. General
https://www.readbyqxmd.com/read/29726058/visual-search-and-autism-symptoms-what-young-children-search-for-and-co-occurring-adhd-matter
#4. Visual search and autism symptoms: What young children search for and co-occurring ADHD matter
Brianna R Doherty, Tony Charman, Mark H Johnson, Gaia Scerif, Teodora Gliga
Superior visual search is one of the most common findings in the autism spectrum disorder (ASD) literature. Here, we ascertain how generalizable these findings are across task and participant characteristics, in light of recent replication failures. We tested 106 3-year-old children at familial risk for ASD, a sample with high rates of ASD and ADHD symptoms, and 25 control participants, in three multi-target search conditions: easy exemplar search (look for cats amongst artefacts), difficult exemplar search (look for dogs amongst chairs/tables perceptually similar to dogs), and categorical search (look for animals amongst artefacts)...
May 3, 2018: Developmental Science
https://www.readbyqxmd.com/read/29702161/racial-bias-in-empathy-do-we-process-dark-and-fair-colored-hands-in-pain-differently-an-eeg-study
#5. Racial bias in empathy: Do we process dark- and fair-colored hands in pain differently? An EEG study
Sarah Fabi, Hartmut Leuthold
The aim of this study was to identify racial bias influences on empathic processing from early stimulus encoding, over categorization until late motor processing stages by comparing brain responses (electroencephalogram) to pictures of fair- and dark-colored hands in painful or neutral daily-life situations. Participants performed a pain judgment task and a skin color judgment task. Event-related brain potentials (ERPs) substantiated former findings of automatic empathic influences on stimulus encoding, reflected by the early posterior negativity (EPN), and late controlled influences on the stimulus categorization, as reflected by the late posterior positivity (P3b)...
April 24, 2018: Neuropsychologia
https://www.readbyqxmd.com/read/29681473/tracing-the-trajectory-of-sensory-plasticity-across-different-stages-of-speech-learning-in-adulthood
#6. Tracing the trajectory of sensory plasticity across different stages of speech learning in adulthood
Rachel Reetzke, Zilong Xie, Fernando Llanos, Bharath Chandrasekaran
Although challenging, adults can learn non-native phonetic contrasts with extensive training [1, 2], indicative of perceptual learning beyond an early sensitivity period [3, 4]. Training can alter low-level sensory encoding of newly acquired speech sound patterns [5]; however, the time-course, behavioral relevance, and long-term retention of such sensory plasticity is unclear. Some theories argue that sensory plasticity underlying signal enhancement is immediate and critical to perceptual learning [6, 7]. Others, like the reverse hierarchy theory (RHT), posit a slower time-course for sensory plasticity [8]...
April 18, 2018: Current Biology: CB
https://www.readbyqxmd.com/read/29673606/aging-barriers-influencing-mobile-health-usability-for-older-adults-a-literature-based-framework-mold-us
#7. Aging barriers influencing mobile health usability for older adults: A literature-based framework (MOLD-US)
REVIEW
G A Wildenbos, Linda Peute, Monique Jaspers
BACKGROUND: With the growing population of older adults as a potential user group of mHealth, the need increases for mHealth interventions to address specific aging characteristics of older adults. The existence of aging barriers to computer use is widely acknowledged. Yet, usability studies show that mHealth still fails to be appropriately designed for older adults and their expectations. To enhance designs of mHealth aimed at older adult populations, it is essential to gain insight into aging barriers that impact the usability of mHealth as experienced by these adults...
June 2018: International Journal of Medical Informatics
https://www.readbyqxmd.com/read/29673483/training-humans-to-categorize-monkey-calls-auditory-feature-and-category-selective-neural-tuning-changes
#8. Training humans to categorize monkey calls: Auditory feature- and category-selective neural tuning changes
Xiong Jiang, Mark A Chevillet, Josef P Rauschecker, Maximilian Riesenhuber
Grouping auditory stimuli into common categories is essential for a variety of auditory tasks, including speech recognition. We trained human participants to categorize auditory stimuli from a large novel set of morphed monkey vocalizations. Using fMRI-rapid adaptation (fMRI-RA) and multi-voxel pattern analysis (MVPA) techniques, we gained evidence that categorization training results in two distinct sets of changes: sharpened tuning to monkey call features (without explicit category representation) in left auditory cortex and category selectivity for different types of calls in lateral prefrontal cortex...
April 18, 2018: Neuron
https://www.readbyqxmd.com/read/29670545/effect-of-the-nu-age-diet-on-cognitive-functioning-in-older-adults-a-randomized-controlled-trial
#9. Effect of the NU-AGE diet on cognitive functioning in older adults: A randomized controlled trial
Anna Marseglia, Weili Xu, Laura Fratiglioni, Cristina Fabbri, Agnes A M Berendsen, Agata Bialecka-Debek, Amy Jennings, Rachel Gillings, Nathalie Meunier, Elodie Caumon, Susan Fairweather-Tait, Barbara Pietruszka, Lisette C P G M De Groot, Aurelia Santoro, Claudio Franceschi
Background: Findings from animal and epidemiological research support the potential neuroprotective benefits from healthy diets. However, to establish diet-neuroprotective causal relations, evidence from dietary intervention studies is needed. NU-AGE is the first multicenter intervention assessing whether a diet targeting health in aging can counteract the age-related physiological changes in different organs, including the brain. In this study, we specifically investigated the effects of NU-AGE's dietary intervention on age-related cognitive decline...
2018: Frontiers in Physiology
https://www.readbyqxmd.com/read/29657743/rapid-recalibration-of-speech-perception-after-experiencing-the-mcgurk-illusion
#10. Rapid recalibration of speech perception after experiencing the McGurk illusion
Claudia S Lüttke, Alexis Pérez-Bellido, Floris P de Lange
The human brain can quickly adapt to changes in the environment. One example is phonetic recalibration: a speech sound is interpreted differently depending on the visual speech and this interpretation persists in the absence of visual information. Here, we examined the mechanisms of phonetic recalibration. Participants categorized the auditory syllables /aba/ and /ada/, which were sometimes preceded by the so-called McGurk stimuli (in which an /aba/ sound, due to visual /aga/ input, is often perceived as 'ada')...
March 2018: Royal Society Open Science
https://www.readbyqxmd.com/read/29653135/are-red-yellow-green-and-blue-perceptual-categories
#11. Are red, yellow, green, and blue perceptual categories?
Christoph Witzel, Karl R Gegenfurtner
This study investigated categorical perception for unique hues in order to establish a relationship between color appearance, color discrimination, and low-level (second-stage) mechanisms. We tested whether pure red, yellow, green, and blue (unique hues) coincide with troughs, and their transitions (binary hues) with peaks of sensitivity in DKL-space. Results partially confirmed this idea: JNDs demarcated perceptual categories at the binary hues around green, blue and less clearly around yellow, when colors were isoluminant with the background and when accounting for the overall variation of sensitivity by fitting an ellipse...
April 10, 2018: Vision Research
https://www.readbyqxmd.com/read/29604674/the-effect-of-sentential-context-on-phonetic-categorization-is-modulated-by-talker-accent-and-exposure
#12. The effect of sentential context on phonetic categorization is modulated by talker accent and exposure
Jessamyn Schertz, Kara Hawthorne
Higher-level factors, including the contextual plausibility of competing word candidates, interact with lower-level phonetic cues to influence how listeners interpret the speech signal. This work shows that listeners' phonetic categorization (e.g., coat versus goat) is more heavily influenced by sentential context when listening to a non-native versus native talker. Further, the effect of context on phonetic categorization decreases as the listener becomes familiar with the talker's phonetic characteristics, for both native and non-native talkers...
March 2018: Journal of the Acoustical Society of America
https://www.readbyqxmd.com/read/29593602/bouba-kiki-in-touch-associations-between-tactile-perceptual-qualities-and-japanese-phonemes
#13. Bouba/kiki in touch: Associations between tactile perceptual qualities and Japanese phonemes
Maki Sakamoto, Junji Watanabe
Several studies have shown cross-modal associations between sounds and vision or gustation by asking participants to match pre-defined sound-symbolic words (SSWs), such as "bouba" or "kiki," with visual or gustatory materials. Here, we conducted an explorative study on cross-modal associations of tactile sensations using spontaneous production of Japanese SSWs and semantic ratings. The Japanese language was selected, because it has a large number of SSWs that can represent a wide range of tactile perceptual spaces with fine resolution, and it shows strong associations between sound and touch...
2018: Frontiers in Psychology
https://www.readbyqxmd.com/read/29581379/visual-mismatch-and-predictive-coding-a-computational-single-trial-erp-study
#14. Visual mismatch and predictive coding: A computational single-trial ERP study
Gabor Stefanics, Jakob Heinzle, András Attila Horváth, Klaas Enno Stephan
Predictive coding (PC) posits that the brain employs a generative model to infer the environmental causes of its sensory data and uses precision-weighted prediction errors (pwPE) to continuously update this model. While supported by much circumstantial evidence, experimental tests grounded in formal trial-by-trial predictions are rare. One partial exception are event-related potential (ERP) studies of the auditory mismatch negativity (MMN), where computational models have found signatures of pwPEs and related model-updating processes...
March 26, 2018: Journal of Neuroscience: the Official Journal of the Society for Neuroscience
https://www.readbyqxmd.com/read/29562838/different-aspects-of-facial-affect-recognition-impairment-following-traumatic-brain-injury-the-role-of-perceptual-and-interpretative-abilities
#15. Different aspects of facial affect recognition impairment following traumatic brain injury: The role of perceptual and interpretative abilities
Arianna Rigon, Michelle W Voss, Lyn S Turkstra, Bilge Mutlu, Melissa C Duff
It is well established that many individuals with traumatic brain injury (TBI) are impaired at facial affect recognition, yet little is known about the mechanisms underlying such deficits. In particular, little work has examined whether the breakdown of facial affect recognition abilities occurs at the perceptual level (e.g., recognizing a smile) or at the verbal categorization stage (e.g., assigning the label "happy" to a smiling face). The aim of the current study was to investigate the integrity of these two distinct facial affect recognition subskills in a sample of 38 individuals with moderate-to-severe TBI and 24 demographically matched healthy individuals...
March 22, 2018: Journal of Clinical and Experimental Neuropsychology
https://www.readbyqxmd.com/read/29545587/neural-basis-for-categorical-boundaries-in-the-primate-pre-sma-during-relative-categorization-of-time-intervals
#16. Neural basis for categorical boundaries in the primate pre-SMA during relative categorization of time intervals
Germán Mendoza, Juan Carlos Méndez, Oswaldo Pérez, Luis Prado, Hugo Merchant
Perceptual categorization depends on the assignment of different stimuli to specific groups based, in principle, on the notion of flexible categorical boundaries. To determine the neural basis of categorical boundaries, we record the activity of pre-SMA neurons of monkeys executing an interval categorization task in which the limit between short and long categories changes between blocks of trials within a session. A large population of cells encodes this boundary by reaching a constant peak of activity close to the corresponding subjective limit...
March 15, 2018: Nature Communications
https://www.readbyqxmd.com/read/29532328/the-neuroscience-of-perceptual-categorization-in-pigeons-a-mechanistic-hypothesis
#17. The neuroscience of perceptual categorization in pigeons: A mechanistic hypothesis
REVIEW
Onur Güntürkün, Charlotte Koenen, Fabrizio Iovine, Alexis Garland, Roland Pusch
We are surrounded by an endless variation of objects. The ability to categorize these objects represents a core cognitive competence of humans and possibly all vertebrates. Research on category learning in nonhuman animals started with the seminal studies of Richard Herrnstein on the category "human" in pigeons. Since then, we have learned that pigeons are able to categorize a large number of stimulus sets, ranging from Cubist paintings to English orthography. Strangely, this prolific field has largely neglected to also study the avian neurobiology of categorization...
March 12, 2018: Learning & Behavior
https://www.readbyqxmd.com/read/29504800/auditory-affective-processing-requires-awareness
#18. Auditory affective processing requires awareness
Mikko Lähteenmäki, Jaakko Kauramäki, Disa A Sauter, Lauri Nummenmaa
Recent work has challenged the previously widely accepted belief that affective processing does not require awareness and can be carried out with more limited resources than semantic processing. This debate has focused exclusively on visual perception, even though evidence from both human and animal studies suggests that nonconscious affective processing would be physiologically more feasible in the auditory system. Here we contrast affective and semantic processing of nonverbal emotional vocalizations under different levels of awareness in three experiments, using explicit (two-alternative forced choice masked affective and semantic categorization tasks, Experiments 1 and 2) and implicit (masked affective and semantic priming, Experiment 3) measures...
March 5, 2018: Emotion
https://www.readbyqxmd.com/read/29502144/fast-periodic-stimulation-fps-a-highly-effective-approach-in-fmri-brain-mapping
#19. Fast periodic stimulation (FPS): A highly effective approach in fMRI brain mapping
Xiaoqing Gao, Francesco Gentile, Bruno Rossion
Defining the neural basis of perceptual categorization in a rapidly changing natural environment with low-temporal resolution methods such as functional magnetic resonance imaging (fMRI) is challenging. Here, we present a novel fast periodic stimulation (FPS)-fMRI approach to define face-selective brain regions with natural images. Human observers are presented with a dynamic stream of widely variable natural object images alternating at a fast rate (6 images/s). Every 9 s, a short burst of variable face images contrasting with object images in pairs induces an objective face-selective neural response at 0...
March 3, 2018: Brain Structure & Function
https://www.readbyqxmd.com/read/29497221/human-non-linguistic-vocal-repertoire-call-types-and-their-meaning
#20. Human non-linguistic vocal repertoire: Call types and their meaning
Andrey Anikin, Rasmus Bååth, Tomas Persson
Recent research on human nonverbal vocalizations has led to considerable progress in our understanding of vocal communication of emotion. However, in contrast to studies of animal vocalizations, this research has focused mainly on the emotional interpretation of such signals. The repertoire of human nonverbal vocalizations as acoustic types, and the mapping between acoustic and emotional categories, thus remain underexplored. In a cross-linguistic naming task (Experiment 1), verbal categorization of 132 authentic (non-acted) human vocalizations by English-, Swedish- and Russian-speaking participants revealed the same major acoustic types: laugh, cry, scream, moan, and possibly roar and sigh...
2018: Journal of Nonverbal Behavior
Search Tips

Use Boolean operators: AND/OR

diabetic AND foot
diabetes OR diabetic

Exclude a word using the 'minus' sign

Virchow -triad

Use Parentheses

water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems

Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase

"primary prevention of cancer"
(heart OR cardiac OR cardio*) AND arrest -"American Heart Association"
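The operators above compose in the usual boolean way. As a rough illustration of those semantics only (not QxMD's actual query engine), AND, OR, and the minus sign can be modeled as membership tests over a document's words:

```python
# Sketch of how AND / OR / minus operators combine when matching a
# document. Illustrative only; QxMD's real search also handles phrase
# quotes, word stems (*), and parenthesized nesting.

def matches(doc, all_of=(), any_of=(), none_of=()):
    """True if doc contains every word in all_of, at least one word in
    any_of (when any_of is given), and no word in none_of."""
    words = set(doc.lower().split())
    return (all(w in words for w in all_of)            # AND terms
            and (not any_of or any(w in words for w in any_of))  # OR group
            and not any(w in words for w in none_of))  # minus terms

# water AND (cup OR glass)
assert matches("a glass of water", all_of=["water"], any_of=["cup", "glass"])
# Virchow -triad
assert not matches("virchow triad", all_of=["virchow"], none_of=["triad"])
```

Nested groups like the combined example above would be handled by applying the same three checks recursively to each parenthesized subexpression.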