Read by QxMD

auditory scene

https://www.readbyqxmd.com/read/28102912/different-spatio-temporal-eeg-features-drive-the-successful-decoding-of-binaural-and-monaural-cues-for-sound-localization
#1
Adam Bednar, Francis M Boland, Edmund C Lalor
The human ability to localize sound is essential for monitoring the environment and helps us to analyze complex auditory scenes. Although the acoustic cues mediating sound localization have been established, it remains unknown how these cues are represented in human cortex. In particular, it is still a point of contention whether binaural and monaural cues are processed by the same or distinct cortical networks. In this study, participants listened to a sequence of auditory stimuli from different spatial locations while we recorded their neural activity using electroencephalography (EEG)...
January 19, 2017: European Journal of Neuroscience
https://www.readbyqxmd.com/read/28097504/estimating-the-relative-weights-of-visual-and-auditory-tau-versus-heuristic-based-cues-for-time-to-contact-judgments-in-realistic-familiar-scenes-by-older-and-younger-adults
#2
Behrang Keshavarz, Jennifer L Campos, Patricia R DeLucia, Daniel Oberfeld
Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background)...
January 17, 2017: Attention, Perception & Psychophysics
https://www.readbyqxmd.com/read/28054545/temporal-coherence-structure-rapidly-shapes-neuronal-interactions
#3
Kai Lu, Yanbo Xu, Pingbo Yin, Andrew J Oxenham, Jonathan B Fritz, Shihab A Shamma
Perception of segregated sources is essential in navigating cluttered acoustic environments. A basic mechanism to implement this process is the temporal coherence principle. It postulates that a signal is perceived as emitted from a single source only when all of its features are temporally modulated coherently, causing them to bind perceptually. Here we report on neural correlates of this process as rapidly reshaped interactions in primary auditory cortex, measured in three different ways: as changes in response rates, as adaptations of spectrotemporal receptive fields following stimulation by temporally coherent and incoherent tone sequences, and as changes in spiking correlations during the tone sequences...
January 5, 2017: Nature Communications
https://www.readbyqxmd.com/read/28044025/the-singular-nature-of-auditory-and-visual-scene-analysis-in-autism
#4
REVIEW
I-Fan Lin, Aya Shirama, Nobumasa Kato, Makio Kashino
Individuals with autism spectrum disorder often have difficulty acquiring relevant auditory and visual information in daily environments, despite not being diagnosed as hearing impaired or having low vision. Recent psychophysical and neurophysiological studies have shown that autistic individuals have highly specific individual differences at various levels of information processing, including feature extraction, automatic grouping and top-down modulation in auditory and visual scene analysis. Comparison of the characteristics of scene analysis between auditory and visual modalities reveals some essential commonalities, which could provide clues about the underlying neural mechanisms...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044024/an-auditory-illusion-reveals-the-role-of-streaming-in-the-temporal-misallocation-of-perceptual-objects
#5
Anahita H Mehta, Nori Jacoby, Ifat Yasin, Andrew J Oxenham, Shihab A Shamma
This study investigates the neural correlates and processes underlying the ambiguous percept produced by a stimulus similar to Deutsch's 'octave illusion', in which each ear is presented with a sequence of alternating pure tones of low and high frequencies. The same sequence is presented to each ear, but in opposite phase, such that the left and right ears receive a high-low-high … and a low-high-low … pattern, respectively. Listeners generally report hearing the illusion of an alternating pattern of low and high tones, with all the low tones lateralized to one side and all the high tones lateralized to the other side...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044023/how-is-visual-salience-computed-in-the-brain-insights-from-behaviour-neurobiology-and-modelling
#6
REVIEW
Richard Veale, Ziad M Hafed, Masatoshi Yoshida
Inherent in visual scene analysis is a bottleneck associated with the need to sequentially sample locations with foveating eye movements. The concept of a 'saliency map' topographically encoding stimulus conspicuity over the visual scene has proven to be an efficient predictor of eye movements. Our work reviews insights into the neurobiological implementation of visual salience computation. We start by summarizing the role that different visual brain areas play in salience computation, whether at the level of feature analysis for bottom-up salience or at the level of goal-directed priority maps for output behaviour...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044022/animal-models-for-auditory-streaming
#7
REVIEW
Naoya Itatani, Georg M Klump
Sounds in the natural environment need to be assigned to acoustic sources to evaluate complex auditory scenes. Separating sources will affect the analysis of auditory features of sounds. As the benefits of assigning sounds to specific sources accrue to all species communicating acoustically, the ability for auditory scene analysis is widespread among different animals. Animal studies allow for a deeper insight into the neuronal mechanisms underlying auditory scene analysis. Here, we will review the paradigms applied in the study of auditory scene analysis and streaming of sequential sounds in animal models...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044021/individual-differences-in-visual-motion-perception-and-neurotransmitter-concentrations-in-the-human-brain
#8
Tatsuto Takeuchi, Sanae Yoshimoto, Yasuhiro Shimada, Takanori Kochiyama, Hirohito M Kondo
Recent studies have shown that interindividual variability can be a rich source of information regarding the mechanism of human visual perception. In this study, we examined the mechanisms underlying interindividual variability in the perception of visual motion, one of the fundamental components of visual scene analysis, by measuring neurotransmitter concentrations using magnetic resonance spectroscopy. First, by psychophysically examining two types of motion phenomena (motion assimilation and motion contrast), we found that, following the presentation of the same stimulus, some participants perceived motion assimilation, while others perceived motion contrast...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044020/auditory-multistability-and-neurotransmitter-concentrations-in-the-human-brain
#9
Hirohito M Kondo, Dávid Farkas, Susan L Denham, Tomohisa Asai, István Winkler
Multistability in perception is a powerful tool for investigating sensory-perceptual transformations, because it produces dissociations between sensory inputs and subjective experience. Spontaneous switching between different perceptual objects occurs during prolonged listening to a sound sequence of tone triplets or repeated words (termed auditory streaming and verbal transformations, respectively). We used these examples of auditory multistability to examine to what extent neurochemical and cognitive factors influence the observed idiosyncratic patterns of switching between perceptual objects...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044019/resolving-the-neural-dynamics-of-visual-and-auditory-scene-processing-in-the-human-brain-a-methodological-approach
#10
REVIEW
Radoslaw Martin Cichy, Santani Teng
In natural environments, visual and auditory stimulation elicit responses across a large set of brain regions in a fraction of a second, yielding representations of the multimodal scene and its properties. The rapid and complex neural dynamics underlying visual and auditory information processing pose major challenges to human cognitive neuroscience. Brain signals measured non-invasively are inherently noisy, the format of neural representations is unknown, and transformations between representations are complex and often nonlinear...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044018/interindividual-variability-in-auditory-scene-analysis-revealed-by-confidence-judgements
#11
C Pelofi, V de Gardelle, P Egré, D Pressnitzer
Because musicians are trained to discern sounds within complex acoustic scenes, such as an orchestra playing, it has been hypothesized that musicianship improves general auditory scene analysis abilities. Here, we compared musicians and non-musicians in a behavioural paradigm using ambiguous stimuli, combining performance, reaction times and confidence measures. We used 'Shepard tones', for which listeners may report either an upward or a downward pitch shift for the same ambiguous tone pair. Musicians and non-musicians performed similarly on the pitch-shift direction task...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044017/cat-and-mouse-search-the-influence-of-scene-and-object-analysis-on-eye-movements-when-targets-change-locations-during-search
#12
Anne P Hillstrom, Joice D Segabinazi, Hayward J Godwin, Simon P Liversedge, Valerie Benson
We explored the influence of early scene analysis and visible object characteristics on eye movements when searching for objects in photographs of scenes. On each trial, participants were shown sequentially either a scene preview or a uniform grey screen (250 ms), a visual mask, the name of the target and the scene, now including the target at a likely location. During the participant's first saccade of the search, the target location was changed to: (i) a different likely location, (ii) an unlikely but possible location or (iii) a very implausible location...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044016/is-predictability-salient-a-study-of-attentional-capture-by-auditory-patterns
#13
Rosy Southwell, Anna Baumann, Cécile Gal, Nicolas Barascud, Karl Friston, Maria Chait
In this series of behavioural and electroencephalography (EEG) experiments, we investigate the extent to which repeating patterns of sounds capture attention. Work in the visual domain has revealed attentional capture by statistically predictable stimuli, consistent with predictive coding accounts which suggest that attention is drawn to sensory regularities. Here, stimuli comprised rapid sequences of tone pips, arranged in regular (REG) or random (RAND) patterns. EEG data demonstrate that the brain rapidly recognizes predictable patterns, manifested as a rapid increase in responses to REG relative to RAND sequences...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044015/contextual-modulation-of-primary-visual-cortex-by-auditory-signals
#14
REVIEW
L S Petro, A T Paton, L Muckli
Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195-201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256-1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044014/a-roadmap-for-the-study-of-conscious-audition-and-its-neural-basis
#15
REVIEW
Andrew R Dykstra, Peter A Cariani, Alexander Gutschalk
How and which aspects of neural activity give rise to subjective perceptual experience (i.e. conscious perception) is a fundamental question of neuroscience. To date, the vast majority of work concerning this question has come from vision, raising the issue of generalizability of prominent resulting theories. However, recent work has begun to shed light on the neural processes subserving conscious perception in other modalities, particularly audition. Here, we outline a roadmap for the future study of conscious auditory perception and its neural basis, paying particular attention to how conscious perception emerges (and of which elements or groups of elements) in complex auditory scenes...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044013/contributions-of-low-and-high-level-properties-to-neural-processing-of-visual-scenes-in-the-human-brain
#16
REVIEW
Iris I A Groen, Edward H Silson, Chris I Baker
Visual scene analysis in humans has been characterized by the presence of regions in extrastriate cortex that are selectively responsive to scenes compared with objects or faces. While these regions have often been interpreted as representing high-level properties of scenes (e.g. category), they also exhibit substantial sensitivity to low-level (e.g. spatial frequency) and mid-level (e.g. spatial layout) properties, and it is unclear how these disparate findings can be united in a single framework. In this opinion piece, we suggest that this problem can be resolved by questioning the utility of the classical low- to high-level framework of visual perception for scene processing, and discuss why low- and mid-level properties may be particularly diagnostic for the behavioural goals specific to scene perception as compared to object recognition...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044012/modelling-auditory-attention
#17
REVIEW
Emine Merve Kaya, Mounya Elhilali
Sounds in everyday life seldom appear in isolation. Both humans and machines are constantly flooded with a cacophony of sounds that need to be sorted through and scoured for relevant information, a phenomenon referred to as the 'cocktail party problem'. A key component in parsing acoustic scenes is the role of attention, which mediates perception and behaviour by focusing both sensory and cognitive resources on pertinent information in the stimulus space. The current article provides a review of modelling studies of auditory attention...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28044011/auditory-and-visual-scene-analysis-an-overview
#18
Hirohito M Kondo, Anouk M van Loon, Jun-Ichiro Kawahara, Brian C J Moore
We perceive the world as stable and composed of discrete objects even though auditory and visual inputs are often ambiguous owing to spatial and temporal occluders and changes in the conditions of observation. This raises important questions regarding where and how 'scene analysis' is performed in the brain. Recent advances from both auditory and visual research suggest that the brain does not simply process the incoming scene properties. Rather, top-down processes such as attention, expectations and prior knowledge facilitate scene perception...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
https://www.readbyqxmd.com/read/28040021/spatially-separating-language-masker-from-target-results-in-spatial-and-linguistic-masking-release
#19
Navin Viswanathan, Kostas Kokkinakis, Brittany T Williams
Several studies demonstrate that in complex auditory scenes, speech recognition is improved when the competing background and target speech differ linguistically. However, such studies typically utilize spatially co-located speech sources which may not fully capture typical listening conditions. Furthermore, co-located presentation may overestimate the observed benefit of linguistic dissimilarity. The current study examines the effect of spatial separation on linguistic release from masking. Results demonstrate that linguistic release from masking does extend to spatially separated sources...
December 2016: Journal of the Acoustical Society of America
https://www.readbyqxmd.com/read/27992390/the-effect-of-cochlear-damage-on-the-sensitivity-to-harmonicity
#20
Damien Bonnard, René Dauman, Catherine Semal, Laurent Demany
OBJECTIVES: A sum of simultaneous pure tones with harmonic relationships (i.e., simple frequency ratios) is normally heard as a single sound, with a single pitch, even when its components are fully resolved in the auditory periphery. This perceptual phenomenon called "harmonic fusion" is thought to play an important role in auditory scene analysis as listeners often have to segregate simultaneous harmonic sounds with different fundamental frequencies. The present study explored the consequences of mild or moderate cochlear hearing loss for the sensitivity to harmonicity and the detection of inharmonicity...
January 2017: Ear and Hearing