Search keyword: auditory scene
https://www.readbyqxmd.com/read/28217264/eight-essential-foods-in-iranian-traditional-medicine-and-their-role-in-health-promotion-and-well-being
#1
REVIEW
Mehrdad Zeinalian, Mehdi Eshaghi, Mahdi Hadian, Homayoun Naji, Sayed Mohammad Masoud Marandi, Sedigheh Asgary
Eight essential foods (EEF) described in Iranian traditional medicine (ITM) play a determining role in balancing the human temperament, ensuring health and well-being. The EEF comprise oral, imaginary, auditory, visual, olfactory, touch, sexual, and familiarity food. Oral foods should be halal, compatible with the individual's temperament, consumed up to twice a day, and suited to different seasons and geographic conditions. Imaginary food consists of the individual's thought content, which is directly related to mental and physical fitness...
2017: International Journal of Preventive Medicine
https://www.readbyqxmd.com/read/28199022/recent-advances-in-exploring-the-neural-underpinnings-of-auditory-scene-perception
#2
Joel S Snyder, Mounya Elhilali
Studies of auditory scene analysis have traditionally relied on paradigms using artificial sounds, and on conventional behavioral techniques, to elucidate how we perceptually segregate auditory objects or streams from each other. In the past few decades, however, there has been growing interest in uncovering the neural underpinnings of auditory segregation using human and animal neuroscience techniques, as well as computational modeling. This largely reflects the growth in the fields of cognitive neuroscience and computational neuroscience and has led to new theories of how the auditory system segregates sounds in complex arrays...
February 15, 2017: Annals of the New York Academy of Sciences
https://www.readbyqxmd.com/read/28185087/attention-to-body-parts-varies-with-visual-preference-and-verb-effector-associations
#3
Ty W Boyer, Josita Maouene, Nitya Sethuraman
Theories of embodied conceptual meaning suggest fundamental relations between others' actions, language, and our own actions and visual attention processes. Prior studies have found that when people view an image of a neutral body in a scene they first look toward, in order, the head, torso, hands, and legs. Other studies show associations between action verbs and the body-effectors used in performing the action (e.g., "jump" with feet/legs; "talk" with face/head). In the present experiment, the visual attention of participants was recorded with a remote eye-tracking system while they viewed an image of an actor pantomiming an action and heard a concrete action verb...
February 9, 2017: Cognitive Processing
https://www.readbyqxmd.com/read/28147594/subjective-perceptual-organization-of-a-complex-auditory-scene
#4
Sabine Thomassen, Alexandra Bendixen
Empirical research on the sequential decomposition of an auditory scene primarily relies on interleaved sound mixtures of only two tone sequences (e.g., ABAB…). This oversimplifies the sound decomposition problem by limiting the number of putative perceptual organizations. The current study used a sound mixture composed of three different tones (ABCABC…) that could be perceptually organized in many different ways. Participants listened to these sequences and reported their subjective perception by continuously choosing one out of 12 visually presented perceptual organization alternatives...
January 2017: Journal of the Acoustical Society of America
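The ABCABC… arrangement described in #4 can be pictured as three interleaved pure-tone streams. A minimal NumPy sketch of such a triplet sequence (the frequencies and tone duration here are illustrative assumptions, not the study's values):

```python
import numpy as np

def tone(freq_hz, dur_s, fs=44100, ramp_s=0.01):
    """Pure tone with raised-cosine onset/offset ramps to avoid clicks."""
    t = np.arange(int(dur_s * fs)) / fs
    x = np.sin(2 * np.pi * freq_hz * t)
    n_ramp = int(ramp_s * fs)
    ramp = 0.5 * (1 - np.cos(np.pi * np.arange(n_ramp) / n_ramp))
    env = np.ones_like(x)
    env[:n_ramp] = ramp
    env[-n_ramp:] = ramp[::-1]
    return x * env

# Hypothetical frequencies for tones A, B and C (the paper's exact values may differ).
freqs = {"A": 400.0, "B": 600.0, "C": 900.0}
fs = 44100
cycle = np.concatenate([tone(freqs[k], 0.1, fs) for k in "ABC"])
sequence = np.tile(cycle, 20)   # 20 repetitions of the ABC triplet
```

Depending on how the listener groups the three tones, the same waveform can be heard, for example, as one integrated ABC melody, as an A stream against a BC stream, or as three separate streams.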
https://www.readbyqxmd.com/read/28141523/the-plausibility-of-a-string-quartet-performance-in-virtual-reality
#5
Ilias Bergstrom, Sergio Azevedo, Panos Papiotis, Nuno Saldanha, Mel Slater
We describe an experiment that explores the contribution of auditory and other features to the illusion of plausibility in a virtual environment that depicts the performance of a string quartet. 'Plausibility' refers to the component of presence that is the illusion that the perceived events in the virtual environment are really happening. The features studied were: Gaze (the musicians ignored the participant, the musicians sometimes looked towards and followed the participant's movements), Sound Spatialization (Mono, Stereo, Spatial), Auralization (no sound reflections, reflections corresponding to a room larger than the one perceived, reflections that exactly matched the virtual room), and Environment (no sound from outside of the room, birdsong and wind corresponding to the outside scene)...
January 27, 2017: IEEE Transactions on Visualization and Computer Graphics
https://www.readbyqxmd.com/read/28102912/different-spatio-temporal-eeg-features-drive-the-successful-decoding-of-binaural-and-monaural-cues-for-sound-localization
#6
Adam Bednar, Francis M Boland, Edmund C Lalor
The human ability to localize sound is essential for monitoring the environment and helps us to analyze complex auditory scenes. Although the acoustic cues mediating sound localization have been established, it remains unknown how these cues are represented in human cortex. In particular, it is still a point of contention whether binaural and monaural cues are processed by the same or distinct cortical networks. In this study, participants listened to a sequence of auditory stimuli from different spatial locations while we recorded their neural activity using electroencephalography (EEG)...
January 19, 2017: European Journal of Neuroscience
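The established binaural cues referred to in #6 are conventionally the interaural time and level differences (ITD and ILD). A minimal sketch of how they can be estimated from a stereo snippet; the toy signal, delay and attenuation below are assumptions for illustration only:

```python
import numpy as np

def interaural_cues(left, right, fs):
    """Estimate the two classic binaural cues from a stereo snippet:
    interaural level difference (ILD, dB) and interaural time
    difference (ITD, s) from the cross-correlation peak lag."""
    # ILD: ratio of RMS energies in dB (positive = left channel louder).
    ild_db = 20 * np.log10(np.sqrt(np.mean(left ** 2)) / np.sqrt(np.mean(right ** 2)))

    # ITD: lag of the cross-correlation peak (positive = left ear leads).
    xcorr = np.correlate(right, left, mode="full")
    lag = np.argmax(xcorr) - (len(left) - 1)
    return ild_db, lag / fs

# Toy example: the same tone, delayed and attenuated in the right ear.
fs = 44100
t = np.arange(int(0.05 * fs)) / fs
left = np.sin(2 * np.pi * 500 * t)
right = 0.7 * np.roll(left, int(0.0005 * fs))   # ~0.5 ms interaural delay
print(interaural_cues(left, right, fs))          # ~ (+3.1 dB, +0.0005 s)
```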
https://www.readbyqxmd.com/read/28097504/estimating-the-relative-weights-of-visual-and-auditory-tau-versus-heuristic-based-cues-for-time-to-contact-judgments-in-realistic-familiar-scenes-by-older-and-younger-adults
#7
Behrang Keshavarz, Jennifer L Campos, Patricia R DeLucia, Daniel Oberfeld
Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background)...
January 17, 2017: Attention, Perception & Psychophysics
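The τ variable in #7 is the ratio of an object's instantaneous optical size (or sound intensity) to its instantaneous rate of change; for a constant-velocity approach it approximates the remaining time to contact. A minimal sketch with hypothetical numbers, not the study's stimuli:

```python
import numpy as np

def tau(theta, dtheta_dt):
    """First-order time-to-contact estimate: instantaneous optical size
    (visual angle) divided by its instantaneous rate of expansion."""
    return theta / dtheta_dt

# Toy geometry: an object 0.5 m wide approaching at 10 m/s from 20 m away (contact at t = 2 s).
width, speed, distance = 0.5, 10.0, 20.0
t = np.arange(0.0, 1.51, 0.01)                  # sample times (s)
d = distance - speed * t                        # remaining distance (m)
theta = 2 * np.arctan(width / (2 * d))          # optical size, visual angle (rad)
dtheta = np.gradient(theta, t)                  # rate of optical expansion (rad/s)
print(tau(theta, dtheta)[::50])                 # ~ 2.0, 1.5, 1.0, 0.5 s as contact nears
```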
https://www.readbyqxmd.com/read/28054545/temporal-coherence-structure-rapidly-shapes-neuronal-interactions
#8
Kai Lu, Yanbo Xu, Pingbo Yin, Andrew J Oxenham, Jonathan B Fritz, Shihab A Shamma
Perception of segregated sources is essential in navigating cluttered acoustic environments. A basic mechanism to implement this process is the temporal coherence principle. It postulates that a signal is perceived as emitted from a single source only when all of its features are temporally modulated coherently, causing them to bind perceptually. Here we report on neural correlates of this process as rapidly reshaped interactions in primary auditory cortex, measured in three different ways: as changes in response rates, as adaptations of spectrotemporal receptive fields following stimulation by temporally coherent and incoherent tone sequences, and as changes in spiking correlations during the tone sequences...
January 5, 2017: Nature Communications
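A minimal illustration of the temporal coherence principle cited in #8, using made-up amplitude envelopes rather than the study's tone sequences: features whose envelopes are modulated coherently correlate strongly and are predicted to bind into a single stream.

```python
import numpy as np

fs = 1000                                                    # envelope sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
env_a = 0.5 * (1 + np.sin(2 * np.pi * 4 * t))                # 4 Hz amplitude modulation
env_b_coherent = 0.7 * env_a                                 # same modulation, different level
env_b_incoherent = 0.5 * (1 + np.sin(2 * np.pi * 4 * t + np.pi))   # anti-phase modulation

def coherence(a, b):
    """Pearson correlation of two feature envelopes; by the temporal
    coherence principle, high values predict that the features bind."""
    return np.corrcoef(a, b)[0, 1]

print(coherence(env_a, env_b_coherent))    # ~ +1: features should fuse into one source
print(coherence(env_a, env_b_incoherent))  # ~ -1: features should segregate
```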
https://www.readbyqxmd.com/read/28044025/the-singular-nature-of-auditory-and-visual-scene-analysis-in-autism
#9
REVIEW
I-Fan Lin, Aya Shirama, Nobumasa Kato, Makio Kashino
Individuals with autism spectrum disorder often have difficulty acquiring relevant auditory and visual information in daily environments, despite not being diagnosed as hearing impaired or having low vision. Recent psychophysical and neurophysiological studies have shown that autistic individuals have highly specific individual differences at various levels of information processing, including feature extraction, automatic grouping and top-down modulation in auditory and visual scene analysis. Comparison of the characteristics of scene analysis between auditory and visual modalities reveals some essential commonalities, which could provide clues about the underlying neural mechanisms...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044024/an-auditory-illusion-reveals-the-role-of-streaming-in-the-temporal-misallocation-of-perceptual-objects
#10
Anahita H Mehta, Nori Jacoby, Ifat Yasin, Andrew J Oxenham, Shihab A Shamma
This study investigates the neural correlates and processes underlying the ambiguous percept produced by a stimulus similar to Deutsch's 'octave illusion', in which each ear is presented with a sequence of alternating pure tones of low and high frequencies. The same sequence is presented to each ear, but in opposite phase, such that the left and right ears receive a high-low-high … and a low-high-low … pattern, respectively. Listeners generally report hearing the illusion of an alternating pattern of low and high tones, with all the low tones lateralized to one side and all the high tones lateralized to the other side...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
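A minimal sketch of a Deutsch-style dichotic sequence of the kind used in #10; 400 and 800 Hz are the classic octave-illusion frequencies, while the tone duration and number of repetitions here are assumptions:

```python
import numpy as np

fs = 44100
dur = 0.25                       # per-tone duration in seconds (assumed)
low, high = 400.0, 800.0         # an octave apart, as in Deutsch's illusion

def tone(freq, dur, fs):
    t = np.arange(int(dur * fs)) / fs
    return np.sin(2 * np.pi * freq * t)

left, right = [], []
for i in range(10):
    # Opposite phase across ears: left hears high-low-high..., right hears low-high-low...
    left.append(tone(high if i % 2 == 0 else low, dur, fs))
    right.append(tone(low if i % 2 == 0 else high, dur, fs))

stereo = np.stack([np.concatenate(left), np.concatenate(right)], axis=1)
# Written to a WAV file and played over headphones, most listeners report a single
# alternating high/low stream, with highs lateralized to one ear and lows to the other.
```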
https://www.readbyqxmd.com/read/28044023/how-is-visual-salience-computed-in-the-brain-insights-from-behaviour-neurobiology-and-modelling
#11
REVIEW
Richard Veale, Ziad M Hafed, Masatoshi Yoshida
Inherent in visual scene analysis is a bottleneck associated with the need to sequentially sample locations with foveating eye movements. The concept of a 'saliency map' topographically encoding stimulus conspicuity over the visual scene has proven to be an efficient predictor of eye movements. Our work reviews insights into the neurobiological implementation of visual salience computation. We start by summarizing the role that different visual brain areas play in salience computation, whether at the level of feature analysis for bottom-up salience or at the level of goal-directed priority maps for output behaviour...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
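As a toy illustration of the saliency-map idea reviewed in #11 (not the models discussed in the paper), bottom-up conspicuity can be approximated by a single center-surround contrast over an intensity image:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def center_surround_salience(intensity, center_sigma=2, surround_sigma=10):
    """Toy bottom-up salience: absolute difference between a fine ('center')
    and a coarse ('surround') Gaussian blur of the intensity image, so that
    regions differing from their surroundings receive high values."""
    center = gaussian_filter(intensity, center_sigma)
    surround = gaussian_filter(intensity, surround_sigma)
    return np.abs(center - surround)

# Toy scene: a flat background with one bright patch.
scene = np.zeros((100, 100))
scene[40:50, 60:70] = 1.0
salience = center_surround_salience(scene)
print(np.unravel_index(salience.argmax(), salience.shape))  # peak inside the bright patch
```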
https://www.readbyqxmd.com/read/28044022/animal-models-for-auditory-streaming
#12
REVIEW
Naoya Itatani, Georg M Klump
Sounds in the natural environment need to be assigned to acoustic sources to evaluate complex auditory scenes. Separating sources will affect the analysis of auditory features of sounds. As the benefits of assigning sounds to specific sources accrue to all species communicating acoustically, the ability for auditory scene analysis is widespread among different animals. Animal studies allow for a deeper insight into the neuronal mechanisms underlying auditory scene analysis. Here, we will review the paradigms applied in the study of auditory scene analysis and streaming of sequential sounds in animal models...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044021/individual-differences-in-visual-motion-perception-and-neurotransmitter-concentrations-in-the-human-brain
#13
Tatsuto Takeuchi, Sanae Yoshimoto, Yasuhiro Shimada, Takanori Kochiyama, Hirohito M Kondo
Recent studies have shown that interindividual variability can be a rich source of information regarding the mechanism of human visual perception. In this study, we examined the mechanisms underlying interindividual variability in the perception of visual motion, one of the fundamental components of visual scene analysis, by measuring neurotransmitter concentrations using magnetic resonance spectroscopy. First, by psychophysically examining two types of motion phenomena, motion assimilation and motion contrast, we found that, following the presentation of the same stimulus, some participants perceived motion assimilation, while others perceived motion contrast...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044020/auditory-multistability-and-neurotransmitter-concentrations-in-the-human-brain
#14
Hirohito M Kondo, Dávid Farkas, Susan L Denham, Tomohisa Asai, István Winkler
Multistability in perception is a powerful tool for investigating sensory-perceptual transformations, because it produces dissociations between sensory inputs and subjective experience. Spontaneous switching between different perceptual objects occurs during prolonged listening to a sound sequence of tone triplets or repeated words (termed auditory streaming and verbal transformations, respectively). We used these examples of auditory multistability to examine to what extent neurochemical and cognitive factors influence the observed idiosyncratic patterns of switching between perceptual objects...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044019/resolving-the-neural-dynamics-of-visual-and-auditory-scene-processing-in-the-human-brain-a-methodological-approach
#15
REVIEW
Radoslaw Martin Cichy, Santani Teng
In natural environments, visual and auditory stimulation elicit responses across a large set of brain regions in a fraction of a second, yielding representations of the multimodal scene and its properties. The rapid and complex neural dynamics underlying visual and auditory information processing pose major challenges to human cognitive neuroscience. Brain signals measured non-invasively are inherently noisy, the format of neural representations is unknown, and transformations between representations are complex and often nonlinear...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044018/interindividual-variability-in-auditory-scene-analysis-revealed-by-confidence-judgements
#16
C Pelofi, V de Gardelle, P Egré, D Pressnitzer
Because musicians are trained to discern sounds within complex acoustic scenes, such as an orchestra playing, it has been hypothesized that musicianship improves general auditory scene analysis abilities. Here, we compared musicians and non-musicians in a behavioural paradigm using ambiguous stimuli, combining performance, reaction times and confidence measures. We used 'Shepard tones', for which listeners may report either an upward or a downward pitch shift for the same ambiguous tone pair. Musicians and non-musicians performed similarly on the pitch-shift direction task...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
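A Shepard tone, as used in #16, is a complex of octave-spaced partials under a fixed spectral envelope, which makes pitch class well defined but pitch height ambiguous. A minimal sketch with illustrative envelope parameters (not those of the study):

```python
import numpy as np

def shepard_tone(base_freq, dur=0.5, fs=44100, n_octaves=10, center=960.0, sigma_oct=1.0):
    """One Shepard tone: octave-spaced partials weighted by a fixed Gaussian
    envelope over log-frequency. Envelope parameters here are illustrative."""
    t = np.arange(int(dur * fs)) / fs
    x = np.zeros_like(t)
    for k in range(n_octaves):
        f = base_freq * 2 ** k
        if f >= fs / 2:
            break
        w = np.exp(-0.5 * (np.log2(f / center) / sigma_oct) ** 2)   # spectral weight
        x += w * np.sin(2 * np.pi * f * t)
    return x / np.max(np.abs(x))

# An ambiguous pair: two Shepard tones half an octave (a tritone) apart can be
# heard as either an upward or a downward pitch shift.
pair = np.concatenate([shepard_tone(55.0), shepard_tone(55.0 * 2 ** 0.5)])
```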
https://www.readbyqxmd.com/read/28044017/cat-and-mouse-search-the-influence-of-scene-and-object-analysis-on-eye-movements-when-targets-change-locations-during-search
#17
Anne P Hillstrom, Joice D Segabinazi, Hayward J Godwin, Simon P Liversedge, Valerie Benson
We explored the influence of early scene analysis and visible object characteristics on eye movements when searching for objects in photographs of scenes. On each trial, participants were shown, in sequence, either a scene preview or a uniform grey screen (250 ms), a visual mask, the name of the target, and then the scene, now including the target at a likely location. During the participant's first saccade of the search, the target location was changed to: (i) a different likely location, (ii) an unlikely but possible location or (iii) a very implausible location...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044016/is-predictability-salient-a-study-of-attentional-capture-by-auditory-patterns
#18
Rosy Southwell, Anna Baumann, Cécile Gal, Nicolas Barascud, Karl Friston, Maria Chait
In this series of behavioural and electroencephalography (EEG) experiments, we investigate the extent to which repeating patterns of sounds capture attention. Work in the visual domain has revealed attentional capture by statistically predictable stimuli, consistent with predictive coding accounts which suggest that attention is drawn to sensory regularities. Here, stimuli comprised rapid sequences of tone pips, arranged in regular (REG) or random (RAND) patterns. EEG data demonstrate that the brain rapidly recognizes predictable patterns manifested as a rapid increase in responses to REG relative to RAND sequences...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
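A minimal sketch of regular (REG) versus random (RAND) tone-pip sequences in the spirit of #18; the pip duration, frequency pool and cycle length are assumptions, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 44100
pip_dur = 0.05                          # 50 ms tone pips (duration assumed)
pool = 220 * 2 ** (np.arange(20) / 5)   # pool of candidate pip frequencies over 4 octaves

def pip(freq):
    t = np.arange(int(pip_dur * fs)) / fs
    return np.sin(2 * np.pi * freq * t) * np.hanning(len(t))

def sequence(regular, cycle_len=5, n_pips=60):
    """REG: a fixed cycle of frequencies repeats; RAND: each pip's frequency
    is drawn independently from the same pool."""
    if regular:
        cycle = rng.choice(pool, size=cycle_len, replace=False)
        freqs = np.tile(cycle, n_pips // cycle_len)
    else:
        freqs = rng.choice(pool, size=n_pips, replace=True)
    return np.concatenate([pip(f) for f in freqs])

reg_stimulus = sequence(regular=True)
rand_stimulus = sequence(regular=False)
```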
https://www.readbyqxmd.com/read/28044015/contextual-modulation-of-primary-visual-cortex-by-auditory-signals
#19
REVIEW
L S Petro, A T Paton, L Muckli
Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195-201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256-1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044014/a-roadmap-for-the-study-of-conscious-audition-and-its-neural-basis
#20
REVIEW
Andrew R Dykstra, Peter A Cariani, Alexander Gutschalk
How and which aspects of neural activity give rise to subjective perceptual experience, i.e. conscious perception, is a fundamental question of neuroscience. To date, the vast majority of work concerning this question has come from vision, raising the issue of generalizability of prominent resulting theories. However, recent work has begun to shed light on the neural processes subserving conscious perception in other modalities, particularly audition. Here, we outline a roadmap for the future study of conscious auditory perception and its neural basis, paying particular attention to how conscious perception emerges (and of which elements or groups of elements) in complex auditory scenes...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences