Read by QxMD: search results for the keyword "auditory scene"

https://www.readbyqxmd.com/read/29877161/adapting-hearing-devices-to-the-individual-ear-acoustics-database-and-target-response-correction-functions-for-various-device-styles
#1
Florian Denk, Stephan M A Ernst, Stephan D Ewert, Birger Kollmeier
To achieve a natural sound quality when listening through hearing devices, the sound pressure at the eardrum should replicate that of the open ear, modified only by an insertion gain if desired. A target approximating this reference condition can be computed by applying an appropriate correction function to the pressure observed at the device microphone. Such Target Response Correction Functions (TRCF) can be defined based on the directionally dependent relative transfer function between the location of the hearing device microphone and the eardrum of the open ear...
January 2018: Trends in Hearing
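As a rough illustration of the correction described in #1, here is a minimal frequency-domain sketch, assuming complex transfer functions sampled on a common frequency grid; the function and variable names are hypothetical and not taken from the paper's database or code.

```python
import numpy as np

def apply_trcf(p_mic, h_open_ear, h_device_mic):
    """Minimal sketch (not the authors' implementation): derive a target
    eardrum spectrum by applying a Target Response Correction Function
    (TRCF) to the spectrum observed at the hearing-device microphone.

    h_open_ear   : transfer function, source -> eardrum of the open ear
    h_device_mic : transfer function, source -> device microphone
    p_mic        : pressure spectrum observed at the device microphone
    """
    eps = 1e-12                               # guard against division by zero
    trcf = h_open_ear / (h_device_mic + eps)  # relative transfer function
    return trcf * p_mic                       # open-ear-like target at eardrum

# Toy usage with random 257-bin spectra (placeholder values only).
rng = np.random.default_rng(0)
rand_spec = lambda: rng.standard_normal(257) + 1j * rng.standard_normal(257)
p_target = apply_trcf(rand_spec(), rand_spec(), rand_spec())
```

Per the abstract, an insertion gain, if desired, would then be applied on top of this open-ear-like target.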
https://www.readbyqxmd.com/read/29863467/a-review-of-auditory-prediction-and-its-potential-role-in-tinnitus-perception
#2
Mithila Durai, Mary G O'Keeffe, Grant D Searchfield
BACKGROUND: The precise mechanisms underlying tinnitus perception and distress are still not fully understood. A recent proposition is that auditory prediction errors and related memory representations may play a role in driving tinnitus perception. It is of interest to further explore this. PURPOSE: To obtain a comprehensive narrative synthesis of current research in relation to auditory prediction and its potential role in tinnitus perception and severity. RESEARCH DESIGN: A narrative review methodological framework was followed...
June 2018: Journal of the American Academy of Audiology
https://www.readbyqxmd.com/read/29858483/modality-independent-coding-of-scene-categories-in-prefrontal-cortex
#3
Yaelan Jung, Bart Larsen, Dirk B Walther
Natural environments convey information through multiple sensory modalities, all of which contribute to people's percepts. Although it has been shown that visual or auditory content of scene categories can be decoded from brain activity, it remains unclear how humans represent scene information beyond a specific sensory modality domain. To address this question, we investigated how categories of scene images and sounds are represented in several brain regions. A group of healthy human subjects (both sexes) participated in the present study, where their brain activity was measured with fMRI while viewing images or listening to sounds of different real-world environments...
June 1, 2018: Journal of Neuroscience: the Official Journal of the Society for Neuroscience
https://www.readbyqxmd.com/read/29787726/hearing-representing-the-aural-wallpaper
#4
David McAlpine
Human listeners appear to represent the textures of sounds through a process of automatic time averaging that exists beyond volition. This process distils likely background sounds into their summary statistics, a computationally efficient way of dealing with complex auditory scenes.
May 21, 2018: Current Biology: CB
https://www.readbyqxmd.com/read/29742332/clinical-features-of-auditory-hallucinations-in-patients-with-dementia-with-lewy-bodies-a-soundtrack-of-visual-hallucinations
#5
Naoko Tsunoda, Mamoru Hashimoto, Tomohisa Ishikawa, Ryuji Fukuhara, Seiji Yuki, Hibiki Tanaka, Yutaka Hatada, Yusuke Miyagawa, Manabu Ikeda
OBJECTIVE: Auditory hallucinations are an important symptom for diagnosing dementia with Lewy bodies (DLB), yet they have received less attention than visual hallucinations. We investigated the clinical features of auditory hallucinations and the possible mechanisms by which they arise in patients with DLB. METHODS: We recruited 124 consecutive patients with probable DLB (diagnosis based on the DLB International Workshop 2005 criteria; study period: June 2007-January 2015) from the dementia referral center of Kumamoto University Hospital...
May 8, 2018: Journal of Clinical Psychiatry
https://www.readbyqxmd.com/read/29712782/activity-in-human-auditory-cortex-represents-spatial-separation-between-concurrent-sounds
#6
Martha M Shiell, Lars Hausfeld, Elia Formisano
Primary and posterior auditory cortex (AC) are known for sensitivity to spatial information, but how this information is processed is not yet understood. AC that is sensitive to spatial manipulations is also modulated by the number of auditory streams present in a scene (Smith et al. 2009), suggesting that spatial and non-spatial cues are integrated for stream segregation. We reasoned that if this is the case, then it is the distance between sounds rather than their absolute positions that is essential. To test this hypothesis, we measured human brain activity in response to spatially-separated concurrent sounds with functional magnetic resonance imaging at 7 Tesla in five men and five women...
April 30, 2018: Journal of Neuroscience: the Official Journal of the Society for Neuroscience
https://www.readbyqxmd.com/read/29681472/adaptive-and-selective-time-averaging-of-auditory-scenes
#7
Richard McWalter, Josh H McDermott
To overcome variability, estimate scene characteristics, and compress sensory input, perceptual systems pool data into statistical summaries. Despite growing evidence for statistical representations in perception, the underlying mechanisms remain poorly understood. One example of such representations occurs in auditory scenes, where background texture appears to be represented with time-averaged sound statistics. We probed the averaging mechanism using "texture steps": textures containing subtle shifts in stimulus statistics...
May 7, 2018: Current Biology: CB
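A minimal sketch of the kind of time-averaged statistic this abstract refers to: an exponentially weighted running mean over a statistic trajectory. The window constant and the "texture step" values below are invented for illustration and are not the authors' model or stimuli.

```python
import numpy as np

def running_average(stat, tau_samples):
    """Exponentially weighted running mean of a sound-statistic trajectory
    (e.g., per-band envelope power over time).  Sketch only: the effective
    averaging window is exactly what the study probes."""
    alpha = 1.0 / tau_samples            # larger tau -> longer memory
    out, acc = np.empty(len(stat)), stat[0]
    for i, v in enumerate(stat):
        acc = (1.0 - alpha) * acc + alpha * v
        out[i] = acc
    return out

# A "texture step": the underlying statistic shifts subtly partway through.
rng = np.random.default_rng(1)
stat = np.concatenate([np.full(5000, 1.0), np.full(5000, 1.2)])
stat += 0.3 * rng.standard_normal(stat.size)
smoothed = running_average(stat, tau_samples=2000)  # lags behind the step
```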
https://www.readbyqxmd.com/read/29578754/auditory-task-irrelevance-a-basis-for-inattentional-deafness
#8
Menja Scheer, Heinrich H Bülthoff, Lewis L Chuang
Objective: This study investigates the neural basis of inattentional deafness, which could result from task irrelevance in the auditory modality. Background: Humans can fail to respond to auditory alarms under high workload situations. This failure, termed inattentional deafness, is often attributed to high workload in the visual modality, which reduces one's capacity for information processing. Besides this, our capacity for processing auditory information could also be selectively diminished if there is no obvious task relevance in the auditory channel...
May 2018: Human Factors
https://www.readbyqxmd.com/read/29563861/assessing-top-down-and-bottom-up-contributions-to-auditory-stream-segregation-and-integration-with-polyphonic-music
#9
Niels R Disbergen, Giancarlo Valente, Elia Formisano, Robert J Zatorre
Polyphonic music listening well exemplifies the processes typically involved in daily auditory scene analysis, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes; however, real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments...
2018: Frontiers in Neuroscience
https://www.readbyqxmd.com/read/29543178/diffraction-kernels-for-interactive-sound-propagation-in-dynamic-environments
#10
Atul Rungta, Carl Schissler, Nicholas Rewkowski, Ravish Mehra, Dinesh Manocha
We present a novel method to generate plausible diffraction effects for interactive sound propagation in dynamic scenes. Our approach precomputes a diffraction kernel for each dynamic object in the scene and combines them with interactive ray tracing algorithms at runtime. A diffraction kernel encapsulates the sound interaction behavior of individual objects in the free field and we present a new source placement algorithm to significantly accelerate the precomputation. Our overall propagation algorithm can handle highly-tessellated or smooth objects undergoing rigid motion...
April 2018: IEEE Transactions on Visualization and Computer Graphics
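A toy structural sketch of the pipeline this abstract outlines, i.e. a precomputed per-object kernel combined with ray-traced path responses at runtime. The object names, kernel values, and convolution-based combination below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Hypothetical precomputed table: one diffraction kernel (here a short
# impulse response) per dynamic object in the scene.
precomputed_kernels = {
    "pillar": np.array([1.0, 0.5, 0.25, 0.1]),
    "crate":  np.array([1.0, 0.3, 0.05]),
}

def diffracted_path_response(path_ir, occluding_object):
    """At runtime, combine a ray-traced path impulse response with the
    precomputed kernel of the object the path diffracts around."""
    return np.convolve(path_ir, precomputed_kernels[occluding_object])

ir = diffracted_path_response(np.array([0.0, 0.0, 1.0]), "pillar")
```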
https://www.readbyqxmd.com/read/29531082/psychophysical-evidence-for-auditory-motion-parallax
#11
Daria Genzel, Michael Schutte, W Owen Brimijoin, Paul R MacNeilage, Lutz Wiegrebe
Distance is important: From an ecological perspective, knowledge about the distance to either prey or predator is vital. However, the distance of an unknown sound source is particularly difficult to assess, especially in anechoic environments. In vision, changes in perspective resulting from observer motion produce a reliable, consistent, and unambiguous impression of depth known as motion parallax. Here we demonstrate with formal psychophysics that humans can exploit auditory motion parallax, i.e., the change in the dynamic binaural cues elicited by self-motion, to assess the relative depths of two sound sources...
April 17, 2018: Proceedings of the National Academy of Sciences of the United States of America
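A worked toy example of the geometry behind auditory motion parallax (the head shift and source distances are made-up numbers, not the study's stimuli): for a given lateral self-motion, a nearer source changes azimuth, and hence its binaural cues, more than a farther one.

```python
import numpy as np

def azimuth_change_deg(head_shift_m, source_distance_m):
    """Azimuth shift of a source that starts straight ahead, after the
    listener translates sideways by head_shift_m."""
    return np.degrees(np.arctan2(head_shift_m, source_distance_m))

for r in (0.5, 1.0, 2.0):                       # source distances in metres
    print(f"{r} m -> {azimuth_change_deg(0.2, r):.1f} deg shift")
# 0.5 m -> 21.8 deg, 1.0 m -> 11.3 deg, 2.0 m -> 5.7 deg
```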
https://www.readbyqxmd.com/read/29530816/structural-and-functional-brain-network-of-human-retrosplenial-cortex
#12
Panlong Li, Han Shan, Shengxiang Liang, Binbin Nie, Shaofeng Duan, Qi Huang, Tianhao Zhang, Xi Sun, Ting Feng, Lin Ma, Baoci Shan, Demin Li, Hua Liu
Retrosplenial cortex (RSC) plays a key role in various cognitive functions. The fiber connectivity of RSC has been reported in rodent and primate studies using tracer injection methods. To explore the structural and functional connectivity of two sub-regions of RSC, Brodmann area (BA) 29 and BA 30, we constructed fiber connectivity networks of the two sub-regions by diffusion tensor imaging (DTI) tractography based on diffusion magnetic resonance imaging (MRI), and functional connectivity networks by resting-state functional MRI...
May 1, 2018: Neuroscience Letters
https://www.readbyqxmd.com/read/29526594/memory-consolidation-is-linked-to-spindle-mediated-information-processing-during-sleep
#13
Scott A Cairney, Anna Á Váli Guttesen, Nicole El Marj, Bernhard P Staresina
How are brief encounters transformed into lasting memories? Previous research has established the role of non-rapid eye movement (NREM) sleep, along with its electrophysiological signatures of slow oscillations (SOs) and spindles, for memory consolidation [1-4]. In related work, experimental manipulations have demonstrated that NREM sleep provides a window of opportunity to selectively strengthen particular memory traces via the delivery of auditory cues [5-10], a procedure known as targeted memory reactivation (TMR)...
March 19, 2018: Current Biology: CB
https://www.readbyqxmd.com/read/29518569/acoustic-and-higher-level-representations-of-naturalistic-auditory-scenes-in-human-auditory-and-frontal-cortex
#14
Lars Hausfeld, Lars Riecke, Elia Formisano
Often, in everyday life, we encounter auditory scenes comprising multiple simultaneous sounds and succeed in selectively attending to only one sound, typically the one most relevant for ongoing behavior. Studies using basic sounds and two-talker stimuli have shown that auditory selective attention aids this by enhancing the neural representations of the attended sound in auditory cortex. It remains unknown, however, whether and how this selective attention mechanism operates on representations of auditory scenes containing natural sounds of different categories...
June 2018: NeuroImage
https://www.readbyqxmd.com/read/29508954/perception-of-scenes-in-different-sensory-modalities-a-result-of-modal-completion
#15
Ronald R Gruber, Richard A Block
Dynamic perception includes amodal and modal completion, along with apparent movement; it fills temporal gaps for single objects. In 2 experiments, using 6 stimulus presentation conditions involving 3 sensory modalities, participants experienced 8-10 sequential stimuli (200 ms each) with interstimulus intervals (ISIs) of 0.25-7.0 s. Experiments focused on spatiotemporal completion (walking), featural completion (object changing), auditory completion (falling bomb), and haptic changes (insect crawling). After each trial, participants judged whether they experienced the process of "happening" or whether they simply knew that the process must have occurred...
April 2017: American Journal of Psychology
https://www.readbyqxmd.com/read/29467495/representations-of-naturalistic-stimulus-complexity-in-early-and-associative-visual-and-auditory-cortices
#16
Yağmur Güçlütürk, Umut Güçlü, Marcel van Gerven, Rob van Lier
The complexity of sensory stimuli has an important role in perception and cognition. However, its neural representation is not well understood. Here, we characterize the representations of naturalistic visual and auditory stimulus complexity in early and associative visual and auditory cortices. This is realized by means of encoding and decoding analyses of two fMRI datasets in the visual and auditory modalities. Our results implicate most early and some associative sensory areas in representing the complexity of naturalistic sensory stimuli...
February 21, 2018: Scientific Reports
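As a generic illustration of what an fMRI "encoding analysis" involves, here is a ridge-regression sketch on random placeholder data; it is not the authors' pipeline, features, or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((120, 3))      # stimuli x features (e.g., complexity)
W_true = rng.standard_normal((3, 500))
Y = X @ W_true + rng.standard_normal((120, 500))   # stimuli x voxels

# Fit a ridge-regularized linear encoding model on a training split, then
# score prediction accuracy per voxel on held-out stimuli.
X_tr, X_te, Y_tr, Y_te = X[:90], X[90:], Y[:90], Y[90:]
lam = 1.0
W = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(X.shape[1]), X_tr.T @ Y_tr)
pred = X_te @ W
acc = np.array([np.corrcoef(pred[:, v], Y_te[:, v])[0, 1]
                for v in range(Y.shape[1])])       # correlation per voxel
```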
https://www.readbyqxmd.com/read/29398142/low-and-high-frequency-cortical-brain-oscillations-reflect-dissociable-mechanisms-of-concurrent-speech-segregation-in-noise
#17
Anusha Yellamsetty, Gavin M Bidelman
Parsing simultaneous speech requires that listeners use pitch-guided segregation, which can be affected by the signal-to-noise ratio (SNR) in the auditory scene. The interaction of these two cues may occur at multiple levels within the cortex. The aims of the current study were to assess the correspondence between oscillatory brain rhythms and determine how listeners exploit pitch and SNR cues to successfully segregate concurrent speech. We recorded electrical brain activity while participants heard double-vowel stimuli whose fundamental frequencies (F0s) differed by zero or four semitones (STs) presented in either clean or noise-degraded (+5 dB SNR) conditions...
April 2018: Hearing Research
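Two small helper computations for the stimulus parameters named in the abstract, a 4-semitone F0 separation and mixing at a +5 dB SNR. This is a generic sketch with invented signals; the actual double-vowel stimuli are the authors'.

```python
import numpy as np

def semitones_above(f0_hz, st):
    """F0 shifted upward by `st` equal-tempered semitones."""
    return f0_hz * 2 ** (st / 12)

def mix_at_snr(speech, noise, snr_db):
    """Scale the noise so that the speech-to-noise power ratio is snr_db."""
    gain = np.sqrt(np.mean(speech ** 2) /
                   (np.mean(noise ** 2) * 10 ** (snr_db / 10)))
    return speech + gain * noise

print(semitones_above(100.0, 4))       # ~126 Hz: a 4-ST separation from 100 Hz
rng = np.random.default_rng(2)
mix = mix_at_snr(rng.standard_normal(44100), rng.standard_normal(44100), 5.0)
```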
https://www.readbyqxmd.com/read/29395914/integration-of-visual-information-in-auditory-cortex-promotes-auditory-scene-analysis-through-multisensory-binding
#18
Huriye Atilgan, Stephen M Town, Katherine C Wood, Gareth P Jones, Ross K Maddox, Adrian K C Lee, Jennifer K Bizley
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound...
February 7, 2018: Neuron
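A toy illustration of the temporal-coherence manipulation: the luminance of the visual stimulus either tracks the amplitude envelope of one sound in the mixture or varies independently of both. The envelope shapes and the correlation measure are invented for the sketch; the actual stimuli and binding mechanism are the paper's subject.

```python
import numpy as np

def slow_envelope(rng, n, smooth=25):
    """Non-negative, slowly varying amplitude envelope from smoothed noise."""
    e = np.abs(rng.standard_normal(n))
    return np.convolve(e, np.ones(smooth) / smooth, mode="same")

rng = np.random.default_rng(0)
n = 1000
env_a, env_b = slow_envelope(rng, n), slow_envelope(rng, n)   # two sounds
luminance_coherent = env_a                   # visual luminance tracks sound A
luminance_independent = slow_envelope(rng, n)

print(np.corrcoef(luminance_coherent, env_a)[0, 1])     # 1.0 by construction
print(np.corrcoef(luminance_independent, env_a)[0, 1])  # near 0
```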
https://www.readbyqxmd.com/read/29390738/how-does-the-perceptual-organization-of-a-multi-tone-mixture-interact-with-partial-and-global-loudness-judgments
#19
Michaël Vannier, Nicolas Misdariis, Patrick Susini, Nicolas Grimault
Two experiments were conducted to investigate how the perceptual organization of a multi-tone mixture interacts with global and partial loudness judgments. Grouping (single-object) and segregating (two-object) conditions were created using frequency modulation by applying the same or different modulation frequencies to the odd- and even-rank harmonics. While in Experiment 1 (Exp. 1) the two objects had the same loudness, in Experiment 2 (Exp. 2), loudness level differences (LLD) were introduced (LLD = 6, 12, 18, or 24 phons)...
January 2018: Journal of the Acoustical Society of America
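A minimal sketch of the kind of stimulus manipulation described above (the f0, modulation rates, depth, and harmonic ranks below are invented, not the study's parameters): the odd- and even-rank harmonics receive either the same modulation frequency, tending to group into one object, or different ones, tending to segregate into two.

```python
import numpy as np

def fm_harmonic_complex(f0, ranks, fm_rate, fm_depth, dur, fs=44100):
    """Sum of harmonics `ranks` of f0, all frequency-modulated coherently:
    instantaneous f0(t) = f0 * (1 + fm_depth * sin(2*pi*fm_rate*t))."""
    t = np.arange(int(dur * fs)) / fs
    # Phase of harmonic k is 2*pi*k times the integral of the instantaneous f0.
    f0_int = f0 * (t - fm_depth / (2 * np.pi * fm_rate)
                   * (np.cos(2 * np.pi * fm_rate * t) - 1.0))
    return sum(np.sin(2 * np.pi * k * f0_int) for k in ranks)

odd, even = range(1, 10, 2), range(2, 11, 2)
grouped = (fm_harmonic_complex(200, odd, 5.0, 0.02, 1.0)
           + fm_harmonic_complex(200, even, 5.0, 0.02, 1.0))    # same FM rate
segregated = (fm_harmonic_complex(200, odd, 5.0, 0.02, 1.0)
              + fm_harmonic_complex(200, even, 7.0, 0.02, 1.0)) # different rates
```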
https://www.readbyqxmd.com/read/29380283/sound-changes-that-lead-to-seeing-longer-lasting-shapes
#20
Arthur G Samuel, Kavya Tangella
To survive, people must construct an accurate representation of the world around them. There is a body of research on visual scene analysis, and a largely separate literature on auditory scene analysis. The current study follows up research from the smaller literature on audiovisual scene analysis. Prior work demonstrated that when there is an abrupt size change to a moving object, observers tend to see two objects rather than one: the abrupt visual change enhances visible persistence of the briefly presented different-sized object...
January 29, 2018: Attention, Perception & Psychophysics