Read by QxMD

Neural Computation

Hiroaki Sasaki, Michael U Gutmann, Hayaru Shouno, Aapo Hyvärinen
The statistical dependencies that independent component analysis (ICA) cannot remove often provide rich information beyond the linear independent components. It would thus be very useful to estimate the dependency structure from data. While such models have been proposed, they have usually concentrated on higher-order correlations such as energy (square) correlations. Yet linear correlations are a fundamental and informative form of dependency in many real data sets. Linear correlations are usually completely removed by ICA and related methods, so they can only be analyzed by developing new methods that explicitly allow for linearly correlated components...
August 4, 2017: Neural Computation
Alex T Piet, Jeffrey C Erlich, Charles D Kopec, Carlos D Brody
Two-node attractor networks are flexible models for neural activity during decision making. Depending on the network configuration, these networks can model distinct aspects of decisions including evidence integration, evidence categorization, and decision memory. Here, we use attractor networks to model recent causal perturbations of the frontal orienting fields (FOF) in rat cortex during a perceptual decision-making task (Erlich, Brunton, Duan, Hanks, & Brody, 2015). We focus on a striking feature of the perturbation results...
August 4, 2017: Neural Computation
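As a toy illustration of this model class (the weights, inputs, and time constant below are illustrative choices, not the FOF model parameters of Piet et al.), a two-node mutual-inhibition rate network can be Euler-integrated and shown to settle into a winner-take-all attractor:

```python
import math

def simulate(w_self=6.0, w_cross=6.0, inputs=(0.55, 0.45),
             dt=0.001, tau=0.02, steps=2000):
    """Euler-integrate two mutually inhibiting rate units:
    tau * dr_i/dt = -r_i + sigmoid(w_self * r_i - w_cross * r_j + I_i)."""
    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    r1 = r2 = 0.0
    for _ in range(steps):
        new_r1 = r1 + dt / tau * (-r1 + sigmoid(w_self * r1 - w_cross * r2 + inputs[0]))
        new_r2 = r2 + dt / tau * (-r2 + sigmoid(w_self * r2 - w_cross * r1 + inputs[1]))
        r1, r2 = new_r1, new_r2
    return r1, r2

# Slightly stronger evidence for unit 1 drives the network into the
# attractor in which unit 1 wins the competition and unit 2 is suppressed.
r1, r2 = simulate()
```

Depending on how `w_self` and `w_cross` are configured, the same two-node circuit integrates evidence, categorizes it, or holds a decision in memory, which is the flexibility the abstract refers to.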
Liming Yang, Zhuo Ren, Yidan Wang, Hongwei Dong
This letter proposes a robust regression framework with nonconvex loss function. Two regression formulations are presented based on the Laplace kernel-induced loss (LK-loss). Moreover, we illustrate that the LK-loss function is a nice approximation for the zero-norm. However, nonconvexity of the LK-loss makes it difficult to optimize. A continuous optimization method is developed to solve the proposed framework. The problems are formulated as DC (difference of convex functions) programming. The corresponding DC algorithms (DCAs) converge linearly...
August 4, 2017: Neural Computation
Claudia Lainscsek, Jonathan Weyhenmeyer, Sydney S Cash, Terrence J Sejnowski
High-density electrocorticogram (ECoG) electrodes are capable of recording neurophysiological data with high temporal resolution with wide spatial coverage. These recordings are a window to understanding how the human brain processes information and subsequently behaves in healthy and pathologic states. Here, we describe and implement delay differential analysis (DDA) for the characterization of ECoG data obtained from human patients with intractable epilepsy. DDA is a time-domain analysis framework based on embedding theory in nonlinear dynamics that reveals the nonlinear invariant properties of an unknown dynamical system...
August 4, 2017: Neural Computation
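DDA itself fits sparse delay differential models, but its foundation in embedding theory is easy to sketch with a generic Takens-style time-delay embedding (not the specific DDA functional form of Lainscsek et al.):

```python
def delay_embed(series, dim, tau):
    """Map a scalar time series to delay vectors
    (x(t), x(t - tau), ..., x(t - (dim - 1) * tau))."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

# Six samples embedded in 3 dimensions with unit delay:
vectors = delay_embed([0.0, 0.1, 0.4, 0.9, 1.6, 2.5], dim=3, tau=1)
# vectors[0] == (0.4, 0.1, 0.0)
```

Such delay vectors reconstruct the state space of the unknown dynamical system, which is what lets a time-domain method recover nonlinear invariant properties from a single ECoG channel.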
Cengiz Pehlevan, Sreyas Mohan, Dmitri B Chklovskii
Blind source separation, the extraction of independent sources from a mixture, is an important problem for both artificial and natural signal processing. Here, we address a special case of this problem when sources (but not the mixing matrix) are known to be nonnegative, for example, due to the physical nature of the sources. We search for the solution to this problem that can be implemented using biologically plausible neural networks. Specifically, we consider the online setting where the data set is streamed to a neural network...
August 4, 2017: Neural Computation
Jia Cai, Hongwei Sun
Canonical correlation analysis (CCA) is a useful tool in detecting the latent relationship between two sets of multivariate variables. In theoretical analysis of CCA, a regularization technique is utilized to investigate the consistency of its analysis. This letter addresses the consistency property of CCA from a least squares view. We construct a constrained empirical risk minimization framework of CCA and apply a two-stage randomized Kaczmarz method to solve it. In the first stage, we remove the noise, and in the second stage, we compute the canonical weight vectors...
October 2017: Neural Computation
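The Kaczmarz iteration at the heart of the second stage can be sketched for a generic consistent system Ax = b (the CCA-specific constraints and the noise-removal first stage of Cai and Sun are omitted): at each step, the current iterate is projected onto the hyperplane of one randomly chosen row.

```python
import random

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Randomized Kaczmarz for a consistent linear system Ax = b.
    Each step projects x onto {z : A[i] . z = b[i]} for a random row i;
    for consistent systems the iterates converge linearly in expectation."""
    rng = random.Random(seed)
    x = [0.0] * len(A[0])
    row_norms = [sum(a * a for a in row) for row in A]
    for _ in range(iters):
        i = rng.randrange(len(A))
        row = A[i]
        resid = b[i] - sum(a * xj for a, xj in zip(row, x))
        scale = resid / row_norms[i]
        x = [xj + scale * a for xj, a in zip(x, row)]
    return x

# Consistent 3x2 system with exact solution x = (1, -2):
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, -2.0, -1.0]
x_hat = randomized_kaczmarz(A, b)
```

The per-step cost touches only one row of A, which is what makes the method attractive for the large empirical-risk systems that arise in CCA.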
P N Loxley
The two-dimensional Gabor function is adapted to natural image statistics, leading to a tractable probabilistic generative model that can be used to model simple cell receptive field profiles, or generate basis functions for sparse coding applications. Learning is found to be most pronounced in three Gabor function parameters representing the size and spatial frequency of the two-dimensional Gabor function and characterized by a nonuniform probability distribution with heavy tails. All three parameters are found to be strongly correlated, resulting in a basis of multiscale Gabor functions with similar aspect ratios and size-dependent spatial frequencies...
October 2017: Neural Computation
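For reference, a standard two-dimensional Gabor function is a gaussian envelope multiplied by a sinusoidal carrier; the size and spatial-frequency parameters the letter studies correspond to `sigma_x`, `sigma_y`, and `freq` in this common textbook parameterization (which may differ in detail from Loxley's):

```python
import math

def gabor_2d(x, y, sigma_x=2.0, sigma_y=1.0, freq=0.25,
             theta=0.0, phase=0.0):
    """Two-dimensional Gabor function: gaussian envelope times
    sinusoidal carrier, oriented at angle theta."""
    # Rotate coordinates into the filter's frame.
    xr = x * math.cos(theta) + y * math.sin(theta)
    yr = -x * math.sin(theta) + y * math.cos(theta)
    envelope = math.exp(-(xr ** 2 / (2 * sigma_x ** 2)
                          + yr ** 2 / (2 * sigma_y ** 2)))
    carrier = math.cos(2 * math.pi * freq * xr + phase)
    return envelope * carrier

# A cosine-phase filter peaks at the origin and decays away from it:
peak = gabor_2d(0.0, 0.0)          # 1.0
off_peak = gabor_2d(0.0, 3.0)      # attenuated by the envelope
```

Letting the data adapt `sigma_x`, `sigma_y`, and `freq`, as the letter does, is what produces the heavy-tailed, strongly correlated parameter distributions it reports.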
Yusen Zhan, Haitham Bou Ammar, Matthew E Taylor
Policy search is a class of reinforcement learning algorithms for finding optimal policies in control problems with limited feedback. These methods have been shown to be successful in high-dimensional problems such as robotics control. Though successful, current methods can lead to unsafe policy parameters that could potentially damage hardware units. Motivated by such constraints, projection-based methods have been proposed for safe policies. These methods, however, can handle only convex policy constraints. In this letter, we propose the first safe policy search reinforcement learner capable of operating under nonconvex policy constraints...
October 2017: Neural Computation
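The convex case the authors move beyond can be illustrated with the simplest safe set, a Euclidean ball around known-safe parameters; a projected-gradient learner clamps every update back into the set. This is purely illustrative of the prior convex approach, not the letter's nonconvex algorithm:

```python
import math

def project_to_ball(theta, center, radius):
    """Euclidean projection onto the ball ||theta - center|| <= radius,
    the prototypical convex safety constraint on policy parameters."""
    diff = [t - c for t, c in zip(theta, center)]
    norm = math.sqrt(sum(d * d for d in diff))
    if norm <= radius:
        return list(theta)
    return [c + radius * d / norm for c, d in zip(center, diff)]

# An update that leaves the safe ball is pulled back to its surface:
safe = project_to_ball([3.0, 4.0], center=[0.0, 0.0], radius=1.0)
# An update already inside the ball is left untouched:
unchanged = project_to_ball([0.1, 0.2], center=[0.0, 0.0], radius=1.0)
```

Projection is cheap and exact for convex sets like this one; for nonconvex constraint sets no such simple projection exists, which is the gap the letter addresses.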
Stefano Recanatesi, Mikhail Katkov, Misha Tsodyks
Human memory is capable of retrieving memories similar to the one just retrieved. This associative ability is at the base of our everyday processing of information. Current models of memory have not been able to pinpoint the mechanism that the brain could use to actively exploit similarities between memories. The prevailing idea is that to induce transitions in attractor neural networks, the current memory must be extinguished. We introduce a novel mechanism capable of inducing transitions between memories in which similarities between memories are actively exploited by the neural dynamics to retrieve a new memory...
October 2017: Neural Computation
Karl J Friston, Marco Lin, Christopher D Frith, Giovanni Pezzulo, J Allan Hobson, Sasha Ondobaka
This article offers a formal account of curiosity and insight in terms of active (Bayesian) inference. It deals with the dual problem of inferring states of the world and learning its statistical structure. In contrast to current trends in machine learning (e.g., deep learning), we focus on how people attain insight and understanding using just a handful of observations, which are solicited through curious behavior. We use simulations of abstract rule learning and approximate Bayesian inference to show that minimizing (expected) variational free energy leads to active sampling of novel contingencies...
October 2017: Neural Computation
Kiret Dhindsa, Dean Carcone, Suzanna Becker
Brain-computer interfaces (BCIs) allow users to control a device by interpreting their brain activity. For simplicity, these devices are designed to be operated by purposefully modulating specific predetermined neurophysiological signals, such as the sensorimotor rhythm. However, the ability to modulate a given neurophysiological signal is highly variable across individuals, contributing to the inconsistent performance of BCIs for different users. These differences suggest that individuals who experience poor BCI performance with one class of brain signals might have good results with another...
October 2017: Neural Computation
Rasmus E Røge, Kristoffer H Madsen, Mikkel N Schmidt, Morten Mørup
Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises-Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling...
October 2017: Neural Computation
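On the unit sphere in R^3 the vMF normalizing constant has a simple closed form, which makes the density easy to write down (higher-dimensional hyperspheres require modified Bessel functions, as does the full mixture model of the letter):

```python
import math

def vmf_pdf_3d(x, mu, kappa):
    """von Mises-Fisher density on the unit sphere in R^3, where the
    normalizing constant is kappa / (4 * pi * sinh(kappa)).
    Both x and mu must be unit vectors; kappa > 0 is the concentration."""
    c = kappa / (4.0 * math.pi * math.sinh(kappa))
    dot = sum(a * b for a, b in zip(x, mu))
    return c * math.exp(kappa * dot)

mu = (0.0, 0.0, 1.0)
p_mean = vmf_pdf_3d((0.0, 0.0, 1.0), mu, kappa=5.0)  # at the mean direction
p_side = vmf_pdf_3d((1.0, 0.0, 0.0), mu, kappa=5.0)  # 90 degrees away
```

Because the density depends only on the dot product with the mean direction, it is the natural analogue of an isotropic gaussian for standardized (unit-norm) fMRI time series, which is the modeling point of the letter.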
Tiger W Lin, Anup Das, Giri P Krishnan, Maxim Bazhenov, Terrence J Sejnowski
As our ability to record from more neurons simultaneously grows, making sense of these data becomes a challenge. Functional connectivity is one popular way to study the relationships among multiple neural signals. Correlation-based methods are a set of widely used techniques for functional connectivity estimation. However, due to explaining away and unobserved common inputs (Stevenson, Rebesco, Miller, & Körding, 2008), they produce spurious connections. The generalized linear model (GLM), which models spike trains as Poisson processes (Okatan, Wilson, & Brown, 2005; Truccolo, Eden, Fellows, Donoghue, & Brown, 2005; Pillow et al...
October 2017: Neural Computation
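A minimal Poisson GLM of the kind cited can be sketched with an exponential link and gradient ascent on the log-likelihood; real spike-train GLMs add spike-history and coupling filters, which are omitted here, and the toy counts below are generated deterministically rather than sampled:

```python
import math
import random

def fit_poisson_glm(X, y, lr=0.05, iters=3000):
    """Gradient ascent on the Poisson log-likelihood
    sum_t [ y_t * (x_t . w) - exp(x_t . w) ], i.e. rate = exp(x . w)."""
    w = [0.0] * len(X[0])
    for _ in range(iters):
        grad = [0.0] * len(w)
        for x_t, y_t in zip(X, y):
            rate = math.exp(sum(a * b for a, b in zip(x_t, w)))
            for j, x_j in enumerate(x_t):
                grad[j] += (y_t - rate) * x_j
        w = [wj + lr * g / len(X) for wj, g in zip(w, grad)]
    return w

# Toy data: a bias term plus one covariate with positive influence.
random.seed(0)
X = [[1.0, random.uniform(-1.0, 1.0)] for _ in range(200)]
# Deterministic "spike counts" derived from the rate, for reproducibility.
y = [round(math.exp(-0.5 + 1.2 * x[1])) for x in X]
w = fit_poisson_glm(X, y)
# w[1] recovers a clearly positive coupling to the covariate.
```

In the functional-connectivity setting, each covariate would be another neuron's recent spiking, and the sign and size of the fitted weights estimate directed connections.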
Ronghua Shang, Chiyang Liu, Yang Meng, Licheng Jiao, Rustam Stolkin
Nonnegative matrix factorization (NMF) is well known to be an effective tool for dimensionality reduction in problems involving big data. For this reason, it frequently appears in many areas of scientific and engineering literature. This letter proposes a novel semisupervised NMF algorithm for overcoming a variety of problems associated with NMF algorithms, including poor use of prior information, negative impact on manifold structure of the sparse constraint, and inaccurate graph construction. Our proposed algorithm, nonnegative matrix factorization with rank regularization and hard constraint (NMFRC), incorporates label information into data representation as a hard constraint, which makes full use of prior information...
September 2017: Neural Computation
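The baseline that NMFRC builds on is standard multiplicative-update NMF (Lee & Seung); the letter's label hard constraints, rank regularization, and graph construction sit on top of a plain factorization like this sketch:

```python
import random

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt]
            for row in A]

def nmf(V, rank, iters=500, seed=0):
    """Multiplicative updates minimizing ||V - W H||_F^2 with
    W, H >= 0 (plain Lee-Seung NMF, no label constraints)."""
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(rank)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(rank)]
    eps = 1e-12
    for _ in range(iters):
        WtV = matmul(transpose(W), V)
        WtWH = matmul(matmul(transpose(W), W), H)
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps)
              for j in range(n)] for i in range(rank)]
        VHt = matmul(V, transpose(H))
        WHHt = matmul(matmul(W, H), transpose(H))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps)
              for j in range(rank)] for i in range(m)]
    return W, H

# A rank-1 nonnegative matrix is recovered almost exactly:
V = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]
W, H = nmf(V, rank=1)
approx = matmul(W, H)
```

The multiplicative form keeps every entry of W and H nonnegative by construction, which is why it is the usual starting point for semisupervised NMF variants.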
Romain D Cazé, Sarah Jarvis, Amanda J Foust, Simon R Schultz
Hearing, vision, touch: underlying all of these senses is stimulus selectivity, a robust information processing operation in which cortical neurons respond more to some stimuli than to others. Previous models assume that these neurons receive the highest weighted input from an ensemble encoding the preferred stimulus, but dendrites enable other possibilities. Nonlinear dendritic processing can produce stimulus selectivity based on the spatial distribution of synapses, even if the total preferred stimulus weight does not exceed that of nonpreferred stimuli...
September 2017: Neural Computation
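The core claim, that selectivity can come from where synapses sit rather than from total synaptic weight, can be seen with a cartoon neuron whose dendrites saturate (an assumed sublinear nonlinearity for illustration, not the specific model of Cazé et al.):

```python
def somatic_response(dendrites, saturation=1.0):
    """Sum each dendrite's synaptic input, saturate it, then sum the
    dendritic outputs at the soma."""
    return sum(min(sum(synapses), saturation) for synapses in dendrites)

# Two stimuli deliver the same total synaptic weight (2.0), but the
# clustered arrangement saturates a single dendrite:
clustered = [[1.0, 1.0], [0.0]]     # both synapses on dendrite 1
distributed = [[1.0], [1.0]]        # one synapse per dendrite
r_clustered = somatic_response(clustered)      # 1.0
r_distributed = somatic_response(distributed)  # 2.0
```

With equal total weight, the distributed stimulus evokes twice the response, so the neuron is selective purely through the spatial distribution of its synapses.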
William Softky, Criscillia Benford
Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships...
September 2017: Neural Computation
Clemens Korndörfer, Ekkehard Ullner, Jordi García-Ojalvo, Gordon Pipa
Spike synchrony, which occurs in various cortical areas in response to specific perception, action, and memory tasks, has sparked a long-standing debate on the nature of temporal organization in cortex. One prominent view is that this type of synchrony facilitates the binding or grouping of separate stimulus components. We argue instead for a more general function: a measure of the prior probability of incoming stimuli, implemented by long-range, horizontal, intracortical connections. We show that networks of this kind (pulse-coupled excitatory spiking networks in a noisy environment) can provide a sufficient substrate for stimulus-dependent spike synchrony...
September 2017: Neural Computation
Sensen Liu, ShiNung Ching
We consider the problem of optimizing information-theoretic quantities in recurrent networks via synaptic learning. In contrast to feedforward networks, the recurrence presents a key challenge insofar as an optimal learning rule must aggregate the joint distribution of the whole network. This challenge, in particular, makes a local policy (i.e., one that depends on only pairwise interactions) difficult. Here, we report a local metaplastic learning rule that performs approximate optimization by estimating whole-network statistics through the use of several slow, nested dynamical variables...
September 2017: Neural Computation
Sacha Sokoloski
In order to interact intelligently with objects in the world, animals must first transform neural population responses into estimates of the dynamic, unknown stimuli that caused them. The Bayesian solution to this problem is known as a Bayes filter, which applies Bayes' rule to combine population responses with the predictions of an internal model. The internal model of the Bayes filter is based on the true stimulus dynamics, and in this note, we present a method for training a theoretical neural circuit to approximately implement a Bayes filter when the stimulus dynamics are unknown...
September 2017: Neural Computation
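A discrete-state Bayes filter makes the predict/correct structure concrete (a generic two-state example with made-up numbers, not the neural-circuit implementation of the note):

```python
def bayes_filter_step(belief, transition, likelihood):
    """One step of a discrete Bayes filter: predict with the stimulus
    dynamics model, correct with the observation likelihood, renormalize."""
    n = len(belief)
    # Predict: p(x_t) = sum_{x'} p(x_t | x') p(x').
    predicted = [sum(transition[j][i] * belief[j] for j in range(n))
                 for i in range(n)]
    # Correct: weight each state by the likelihood of the new observation.
    posterior = [predicted[i] * likelihood[i] for i in range(n)]
    z = sum(posterior)
    return [p / z for p in posterior]

# Two-state example: sticky dynamics, observation favoring state 0.
belief = [0.5, 0.5]
transition = [[0.9, 0.1], [0.1, 0.9]]   # rows index the "from" state
likelihood = [0.8, 0.2]                  # p(observation | state)
belief = bayes_filter_step(belief, transition, likelihood)
# belief is now [0.8, 0.2]: the observation has shifted mass to state 0.
```

The note's contribution is learning both steps with neural population codes when the transition model is unknown; the filter above assumes the dynamics are given.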
Waseem Rawat, Zenghui Wang
Convolutional neural networks (CNNs) have been applied to visual tasks since the late 1980s. However, despite a few scattered applications, they were dormant until the mid-2000s when developments in computing power and the advent of large amounts of labeled data, supplemented by improved algorithms, contributed to their advancement and brought them to the forefront of a neural network renaissance that has seen rapid progression since 2012. In this review, which focuses on the application of CNNs to image classification tasks, we cover their development, from their predecessors up to recent state-of-the-art deep learning systems...
September 2017: Neural Computation