
Neural Computation

Yingzhuo Zhang, Noa Malem-Shinitski, Stephen A Allsop, Kay Tye, Demba Ba
A fundamental problem in neuroscience is to characterize the dynamics of spiking from the neurons in a circuit that is involved in learning about a stimulus or a contingency. A key limitation of current methods to analyze neural spiking data is the need to collapse neural activity over time or trials, which may cause the loss of information pertinent to understanding the function of a neuron or circuit. We introduce a new method that can determine not only the trial-to-trial dynamics that accompany the learning of a contingency by a neuron, but also the latency of this learning with respect to the onset of a conditioned stimulus...
January 30, 2018: Neural Computation
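A generic sketch of this kind of trial-by-trial analysis: a latent learning state follows a random walk across trials, and each trial's spike count is a binomial observation through a logistic link (the classic state-space learning-curve setup). This is a simplified stand-in for the authors' model; the function and parameter names below are illustrative assumptions.

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def learning_curve_filter(spike_counts, n_bins, sigma2=0.05, n_newton=10):
    """Forward filter for a random-walk latent state x_k with binomial
    observations n_k ~ Binomial(n_bins, logistic(x_k)). Returns the filtered
    mean and variance of x_k per trial (a sketch, not the paper's model)."""
    means, variances = [], []
    x_prev, v_prev = 0.0, 1.0
    for n_k in spike_counts:
        x_pred, v_pred = x_prev, v_prev + sigma2      # random-walk prediction
        x_post = x_pred
        for _ in range(n_newton):                     # Newton solve for the mode
            p = logistic(x_post)
            f = x_post - x_pred - v_pred * (n_k - n_bins * p)
            x_post -= f / (1.0 + v_pred * n_bins * p * (1.0 - p))
        p = logistic(x_post)
        v_post = 1.0 / (1.0 / v_pred + n_bins * p * (1.0 - p))
        means.append(x_post)
        variances.append(v_post)
        x_prev, v_prev = x_post, v_post
    return np.array(means), np.array(variances)

# Simulated learning: firing probability jumps halfway through 40 trials.
rng = np.random.default_rng(0)
counts = np.concatenate([rng.binomial(20, 0.2, 20), rng.binomial(20, 0.7, 20)])
m, v = learning_curve_filter(counts, n_bins=20)
print(np.round(logistic(m), 2))   # filtered firing probability per trial
```

The first trial at which the filtered probability reliably clears baseline gives a crude learning latency in trials; the paper's method additionally resolves latency within a trial relative to stimulus onset.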
Tomas Van Pottelbergh, Guillaume Drion, Rodolphe Sepulchre
By controlling the state of neuronal populations, neuromodulators ultimately affect behavior. A key neuromodulation mechanism is the alteration of neuronal excitability via the modulation of ion channel expression. This type of neuromodulation is normally studied with conductance-based models, but those models are computationally challenging for large-scale network simulations needed in population studies. This article studies the modulation properties of the multiquadratic integrate-and-fire model, a generalization of the classical quadratic integrate-and-fire model...
January 30, 2018: Neural Computation
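For reference, the classical quadratic integrate-and-fire model that this work generalizes takes one ODE and a reset rule; a minimal simulation (with arbitrary illustrative parameters) looks like this:

```python
import numpy as np

def simulate_qif(I, dt=0.01, t_max=100.0, v_reset=-10.0, v_peak=10.0):
    """Euler simulation of the classical quadratic integrate-and-fire model,
    dV/dt = V**2 + I, with a hard reset on threshold crossing. The paper's
    multiquadratic model generalizes this nonlinearity; only the classical
    case is sketched here."""
    v = v_reset
    spike_times = []
    for step in range(int(t_max / dt)):
        v += dt * (v * v + I)        # quadratic membrane nonlinearity
        if v >= v_peak:              # spike: record time and reset
            spike_times.append(step * dt)
            v = v_reset
    return np.array(spike_times)

# For I > 0 the model fires periodically; for I < 0 it is excitable, resting
# near -sqrt(-I) and firing only if pushed past +sqrt(-I).
print({i: len(simulate_qif(i)) / 100.0 for i in (0.5, 1.0, 2.0)})  # firing rates
```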
Shoubo Hu, Zhitang Chen, Laiwan Chan
Although nonstationary data are more common in the real world, most existing causal discovery methods do not take nonstationarity into consideration. In this article, we propose a kernel embedding-based approach, ENCI, for nonstationary causal model inference where data are collected from multiple domains with varying distributions. In ENCI, we transform the complicated relation of a cause-effect pair into a linear model of variables of which observations correspond to the kernel embeddings of the cause-and-effect distributions in different domains...
January 30, 2018: Neural Computation
Kyuengbo Min, Masami Iwamoto, Shinji Kakei, Hideyuki Kimpara
Humans are able to robustly maintain desired motion and posture under dynamically changing circumstances, including novel conditions. To accomplish this, the brain needs to optimize the synergistic control between muscles against external dynamic factors. However, previous related studies have usually simplified the control of multiple muscles using two opposing muscles, which are minimum actuators to simulate linear feedback control. As a result, they have been unable to analyze how muscle synergy contributes to motion control robustness in a biological system...
January 30, 2018: Neural Computation
Adam S Charles, Mijung Park, J Patrick Weller, Gregory D Horwitz, Jonathan W Pillow
Neurons in many brain areas exhibit high trial-to-trial variability, with spike counts that are overdispersed relative to a Poisson distribution. Recent work (Goris, Movshon, & Simoncelli, 2014) has proposed to explain this variability in terms of a multiplicative interaction between a stochastic gain variable and a stimulus-dependent Poisson firing rate, which produces quadratic relationships between spike count mean and variance. Here we examine this quadratic assumption and propose a more flexible family of models that can account for a more diverse set of mean-variance relationships...
January 30, 2018: Neural Computation
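The multiplicative-gain model referenced above has a convenient closed form: mixing a Poisson count with a gamma-distributed gain yields a negative binomial, whose variance is quadratic in the mean. A quick simulation (illustrative parameters, not the paper's code) confirms the relationship that the proposed models generalize:

```python
import numpy as np

rng = np.random.default_rng(0)

def gain_modulated_counts(lam, r, n_trials=200_000):
    """Counts from a multiplicative-gain Poisson model: per trial, a gain
    g ~ Gamma(shape=r, scale=1/r) (mean 1, variance 1/r) scales the
    stimulus-driven rate lam, and the count is Poisson(g * lam). Marginally
    Var = lam + lam**2 / r, i.e. quadratic in the mean."""
    g = rng.gamma(shape=r, scale=1.0 / r, size=n_trials)
    return rng.poisson(g * lam)

for lam in (1.0, 5.0, 20.0):
    n = gain_modulated_counts(lam, r=2.0)
    print(f"rate {lam:5.1f}: mean {n.mean():6.2f}, var {n.var():7.2f}, "
          f"predicted var {lam + lam**2 / 2.0:7.2f}")
```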
Alireza Saeedi, Mostafa Jannesari, Shahriar Gharibzadeh, Fatemeh Bakouie
Self-organized criticality (SOC) and stochastic oscillations (SOs) are two theoretically contradictory phenomena that are suggested to coexist in the brain. Recently it has been shown that an accumulation-release process like sandpile dynamics can generate SOC and SOs simultaneously. We considered the effect of the network structure on this coexistence and showed that sandpile dynamics on a small-world network can produce two power-law regimes along with two groups of SOs, that is, two peaks in the power spectrum of the generated signal, simultaneously...
January 30, 2018: Neural Computation
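The accumulation-release dynamics in question can be sketched as a sandpile on a Watts-Strogatz small-world graph. The implementation below is an illustrative reconstruction, not the authors' code: a node topples when its load reaches its degree, and a small dissipation probability keeps the system in a steady state.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

def sandpile_avalanches(n=1000, k=4, p_rewire=0.1, n_grains=20_000, f_loss=0.01):
    """Sandpile dynamics on a small-world graph. Each added grain may trigger
    an avalanche of topplings; a toppling node sheds one grain to each
    neighbor, and each transferred grain is lost with probability f_loss.
    Returns the avalanche size (number of topplings) per added grain."""
    g = nx.watts_strogatz_graph(n, k, p_rewire, seed=1)
    nbrs = [list(g.neighbors(i)) for i in range(n)]
    load = np.zeros(n, dtype=int)
    sizes = np.empty(n_grains, dtype=int)
    for t in range(n_grains):
        s = int(rng.integers(n))
        load[s] += 1
        size, stack = 0, [s]
        while stack:
            i = stack.pop()
            if load[i] < len(nbrs[i]):
                continue                        # already relaxed
            load[i] -= len(nbrs[i])             # topple: one grain per neighbor
            size += 1
            for j in nbrs[i]:
                if rng.random() >= f_loss:      # grain survives dissipation
                    load[j] += 1
                    if load[j] >= len(nbrs[j]):
                        stack.append(j)
            if load[i] >= len(nbrs[i]):         # may need to topple again
                stack.append(i)
        sizes[t] = size
    return sizes

sizes = sandpile_avalanches()
print("mean avalanche size:", sizes.mean(), " largest:", sizes.max())
```

Avalanche-size histograms from such runs are what yield the power-law regimes, and the power spectrum of the activity signal is where the stochastic-oscillation peaks appear.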
Caglar Gulcehre, Sarath Chandar, Kyunghyun Cho, Yoshua Bengio
We extend the neural Turing machine (NTM) model into a dynamic neural Turing machine (D-NTM) by introducing trainable address vectors. This addressing scheme maintains for each memory cell two separate vectors, content and address vectors. This allows the D-NTM to learn a wide variety of location-based addressing strategies, including both linear and nonlinear ones. We implement the D-NTM with both continuous and discrete read and write mechanisms. We investigate the mechanisms and effects of learning to read and write into a memory through experiments on Facebook bAbI tasks using both a feedforward and a GRU controller...
January 30, 2018: Neural Computation
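The addressing scheme can be pictured as follows: every memory cell carries a writable content vector plus a trainable address vector, and attention weights come from a softmax over similarities between a query key and the concatenated pair. The cosine similarity and names below are assumptions for illustration, not the paper's exact parameterization.

```python
import numpy as np

def dntm_address(memory_content, memory_address, key, beta=1.0):
    """Read weights over N memory cells: each cell is the concatenation of
    its content vector and its trainable address vector; weights are a
    softmax of sharpened (beta-scaled) cosine similarities with the key."""
    cells = np.concatenate([memory_content, memory_address], axis=1)  # (N, dc+da)
    sim = cells @ key / (np.linalg.norm(cells, axis=1) * np.linalg.norm(key) + 1e-9)
    w = np.exp(beta * sim)
    return w / w.sum()

rng = np.random.default_rng(0)
content, address = rng.normal(size=(8, 16)), rng.normal(size=(8, 4))
key = rng.normal(size=20)            # query in the joint [content; address] space
print(np.round(dntm_address(content, address, key, beta=5.0), 3))
```

Because the address vectors are learned rather than fixed, gradient descent can place them to realize linear or nonlinear location-based access patterns; the discrete variant replaces the softmax read with sampling.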
Johannes Leugering, Gordon Pipa
A neuronal population is a computational unit that receives a multivariate, time-varying input signal and creates a related multivariate output. These neural signals are modeled as stochastic processes that transmit information in real time, subject to stochastic noise. In a stationary environment, where the input signals can be characterized by constant statistical properties, the systematic relationship between its input and output processes determines the computation carried out by a population. When these statistical characteristics unexpectedly change, the population needs to adapt to its new environment if it is to maintain stable operation...
January 17, 2018: Neural Computation
Wentao Huang, Kechen Zhang
While Shannon's mutual information has widespread applications in many disciplines, for practical applications it is often difficult to calculate its value accurately for high-dimensional variables because of the curse of dimensionality. This article focuses on effective approximation methods for evaluating mutual information in the context of neural population coding. For large but finite neural populations, we derive several information-theoretic asymptotic bounds and approximation formulas that remain valid in high-dimensional spaces...
January 17, 2018: Neural Computation
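For intuition, the best-known asymptotic of this flavor relates mutual information to Fisher information for large populations: I(theta; r) ~ H(theta) + (1/2) E_theta[log(J(theta) / (2*pi*e))]. The sketch below evaluates this approximation for independent Poisson neurons with gaussian tuning curves; all parameter choices are illustrative, and the article's own bounds are sharper than this textbook formula.

```python
import numpy as np

def fisher_info_poisson(theta, centers, amp=10.0, width=0.3):
    """Fisher information of independent Poisson neurons with gaussian tuning
    f_i(theta) = amp * exp(-(theta - c_i)**2 / (2 * width**2)):
    J(theta) = sum_i f_i'(theta)**2 / f_i(theta)."""
    d = theta[:, None] - centers[None, :]
    f = amp * np.exp(-d**2 / (2 * width**2))
    fprime = -d / width**2 * f
    return (fprime**2 / f).sum(axis=1)

theta = np.linspace(0.01, 0.99, 500)    # uniform stimulus on [0, 1]: H(theta) = 0
for n_neurons in (10, 100, 1000):
    centers = np.linspace(0.0, 1.0, n_neurons)
    J = fisher_info_poisson(theta, centers)
    mi_bits = 0.5 * np.mean(np.log2(J / (2 * np.pi * np.e)))
    print(f"N={n_neurons:5d}: approximate MI ~ {mi_bits:5.2f} bits")
```

The log-det generalization of the same formula handles multidimensional stimuli, which is exactly where the curse of dimensionality makes direct evaluation of Shannon information impractical.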
Kun Zhan, Jinhui Shi, Jing Wang, Haibo Wang, Yuange Xie
Most existing multiview clustering methods require that graph matrices in different views are computed beforehand and that each graph is obtained independently. However, this requirement ignores the correlation between multiple views. In this letter, we tackle the problem of multiview clustering by jointly optimizing the graph matrix to make full use of the data correlation between views. With the inter-view correlation, a concept factorization-based multiview clustering method is developed for data integration, and the adaptive method correlates the affinity weights of all views...
January 17, 2018: Neural Computation
I Tal, M Abeles
This letter presents a noninvasive imaging technique that captures the exact timing and locations of cortical activity sequences that are specific to a cognitive process. These precise spatiotemporal sequences can be detected in the human brain as a specific time-position pattern associated with a cognitive task. They are consistent with direct measurements of population activity recorded in nonhuman primates, thus suggesting that specific time-position patterns associated with a cognitive task can be identified...
January 17, 2018: Neural Computation
Wei Wang, Hao Wang, Chen Zhang, Yang Gao
Learning an appropriate distance metric plays a substantial role in the success of many learning machines. Conventional metric learning algorithms have limited utility when the training and test samples are drawn from related but different domains (i.e., source domain and target domain). In this letter, we propose two novel metric learning algorithms for domain adaptation in an information-theoretic setting, allowing for discriminating power transfer and standard learning machine propagation across two domains...
January 17, 2018: Neural Computation
Joseph Snider
Neurons integrate information from many neighbors when they process information. Inputs to a given neuron are thus indistinguishable from one another. Under the assumption that neurons maximize their information storage, indistinguishability is shown to place a strong constraint on the distribution of strengths between neurons. The distribution of individual synapse strengths is found to follow a modified Boltzmann distribution with strength proportional to [Formula: see text]. The model is shown to be consistent with experimental data from Caenorhabditis elegans connectivity and in vivo synaptic strength measurements...
January 17, 2018: Neural Computation
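The exact modified form of the distribution is elided above ([Formula: see text]), so the sketch below shows only the generic maximum-entropy argument behind Boltzmann-type results: among distributions over synaptic strengths with a fixed mean, entropy is maximized by p(w) proportional to exp(-beta * w). All names and the grid are illustrative assumptions.

```python
import numpy as np

def maxent_boltzmann(strengths, mean_strength):
    """Maximum-entropy distribution over discrete strengths subject to a fixed
    mean: p(w) ~ exp(-beta * w), with beta chosen (by grid search here) so
    the distribution matches the target mean."""
    best = None
    for beta in np.linspace(0.01, 20.0, 2000):
        p = np.exp(-beta * strengths)
        p /= p.sum()
        err = abs(p @ strengths - mean_strength)
        if best is None or err < best[0]:
            best = (err, beta, p)
    return best[1], best[2]

w = np.linspace(0.0, 5.0, 100)          # candidate strengths (arbitrary units)
beta, p = maxent_boltzmann(w, mean_strength=1.0)
print(f"beta ~ {beta:.2f}; entropy = {-(p * np.log(p)).sum():.3f} nats")
```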
Dorian Florescu, Daniel Coca
Inferring mathematical models of sensory processing systems directly from input-output observations, while making the fewest assumptions about the model equations and the types of measurements available, is still a major issue in computational neuroscience. This letter introduces two new approaches for identifying sensory circuit models consisting of linear and nonlinear filters in series with spiking neuron models, based only on the sampled analog input to the filter and the recorded spike train output of the spiking neuron...
January 17, 2018: Neural Computation
Takashi Kanamaru
In this study, I considered quantifying the strength of chaos in the population firing rate of a pulse-coupled neural network. In particular, I considered the dynamics where the population firing rate is chaotic and the firing of each neuron is stochastic. I calculated a time histogram of firings to show the variation in the population firing rate over time. To smooth this histogram, I used Bayesian adaptive regression splines and a gaussian filter. The nonlinear prediction method, based on reconstruction, was applied to a sequence of interpeak intervals in the smoothed time histogram of firings...
March 2018: Neural Computation
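The prediction step named here (nonlinear forecasting on a reconstructed attractor) can be sketched generically: delay-embed the sequence, then predict each point's future from the future of its nearest neighbor in embedding space. Forecast skill that is high at short horizons and decays with horizon is the signature of deterministic chaos. The logistic-map example and all parameters are illustrative.

```python
import numpy as np

def nn_forecast(series, dim=3, horizon=1):
    """Delay-coordinate nearest-neighbor prediction: embed in `dim`
    dimensions, predict `horizon` steps ahead from the nearest neighbor's
    future, and return the correlation between predictions and truth."""
    x = np.asarray(series, dtype=float)
    n = len(x) - dim + 1 - horizon
    emb = np.stack([x[i:i + n] for i in range(dim)], axis=1)  # delay vectors
    targets = x[dim - 1 + horizon : dim - 1 + horizon + n]
    preds = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                                         # exclude self-match
        preds[i] = targets[np.argmin(d)]
    return np.corrcoef(preds, targets)[0, 1]

# Chaotic logistic map x_{t+1} = 4 x_t (1 - x_t): skill decays with horizon.
x = [0.3]
for _ in range(600):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))
print([round(nn_forecast(x, horizon=h), 3) for h in (1, 2, 4, 8)])
```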
Wiktor Młynarski, Josh H McDermott
Interaction with the world requires an organism to transform sensory signals into representations in which behaviorally meaningful properties of the environment are made explicit. These representations are derived through cascades of neuronal processing stages in which neurons at each stage recode the output of preceding stages. Explanations of sensory coding may thus involve understanding how low-level patterns are combined into more complex structures. To gain insight into such midlevel representations for sound, we designed a hierarchical generative model of natural sounds that learns combinations of spectrotemporal features from natural stimulus statistics...
March 2018: Neural Computation
Melika Payvand, Luke Theogarajan
In this letter, we have implemented and compared two neural coding algorithms in networks of spiking neurons: winner-takes-all (WTA) and winners-share-all (WSA). WSA exploits the code space provided by the temporal code by training a different combination of [Formula: see text] out of [Formula: see text] neurons to fire together in response to different patterns, while WTA uses one-hot coding to respond to distinguished patterns. Using WSA, the maximum value of [Formula: see text] that maximizes information capacity using [Formula: see text] output neurons was theoretically determined and utilized...
March 2018: Neural Computation
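The code-space argument is easy to make concrete as a count of distinguishable patterns: one-hot (WTA) coding over N neurons distinguishes N patterns, while firing exactly k of N together distinguishes C(N, k). The combinatorial count alone peaks at k = N // 2; the paper's optimal k follows from additional constraints not reproduced in this sketch.

```python
from math import comb, log2

def code_capacities(n_neurons):
    """Bits carried by one-hot (WTA) versus k-of-N (WSA) codes over
    n_neurons output neurons, counting distinguishable firing patterns."""
    wta_bits = log2(n_neurons)
    wsa_bits = {k: log2(comb(n_neurons, k)) for k in range(1, n_neurons)}
    return wta_bits, max(wsa_bits, key=wsa_bits.get), wsa_bits

wta, best_k, wsa = code_capacities(10)
print(f"WTA: {wta:.2f} bits; WSA best k = {best_k} ({wsa[best_k]:.2f} bits)")
```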
Aaron R Voelker, Chris Eliasmith
Researchers building spiking neural networks face the challenge of improving the biological plausibility of their model networks while maintaining the ability to quantitatively characterize network behavior. In this work, we extend the theory behind the neural engineering framework (NEF), a method of building spiking dynamical networks, to permit the use of a broad class of synapse models while maintaining prescribed dynamics up to a given order. This theory improves our understanding of how low-level synaptic properties alter the accuracy of high-level computations in spiking dynamical networks...
March 2018: Neural Computation
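The classical first-order case that this work extends is compact enough to state outright: to realize dx/dt = A x + B u through a lowpass synapse with time constant tau, the NEF uses recurrent transform A' = tau*A + I and input transform B' = tau*B. The sketch below checks that mapping on a linear 2D oscillator (no spiking, illustrative parameters); the paper generalizes the mapping to higher-order synapse models.

```python
import numpy as np

def nef_transform(A, B, tau):
    """First-order NEF mapping: dynamics dx/dt = A x + B u realized through a
    synapse tau * dx/dt = -x + (A' x + B' u) require A' = tau*A + I and
    B' = tau*B."""
    return tau * A + np.eye(A.shape[0]), tau * B

w, tau, dt = 2.0, 0.1, 1e-3
A = np.array([[0.0, -w], [w, 0.0]])       # 2D oscillator at w rad/s
B = np.zeros((2, 1))
Ap, Bp = nef_transform(A, B, tau)
x, u = np.array([1.0, 0.0]), np.zeros(1)
for _ in range(int(1.0 / dt)):            # Euler-integrate the synapse for 1 s
    x += dt / tau * (Ap @ x + Bp @ u - x)
print(x, "vs ideal", np.array([np.cos(w), np.sin(w)]))
```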
Aritra Bhaduri, Amitava Banerjee, Subhrajit Roy, Sougata Kar, Arindam Basu
We present a neuromorphic current mode implementation of a spiking neural classifier with lumped square law dendritic nonlinearity. It has been shown previously in software simulations that such a system with binary synapses can be trained with structural plasticity algorithms to achieve comparable classification accuracy with fewer synaptic resources than conventional algorithms. We show that even in real analog systems with manufacturing imperfections (CV of 23.5% and 14.4% for dendritic branch gains and leaks respectively), this network is able to produce comparable results with fewer synaptic resources...
March 2018: Neural Computation
Jing Fang, Naima Rüther, Christian Bellebaum, Laurenz Wiskott, Sen Cheng
The experimental evidence on the interrelation between episodic memory and semantic memory is inconclusive. Are they independent systems, different aspects of a single system, or separate but strongly interacting systems? Here, we propose a computational role for the interaction between the semantic and episodic systems that might help resolve this debate. We hypothesize that episodic memories are represented as sequences of activation patterns. These patterns are the output of a semantic representational network that compresses the high-dimensional sensory input...
February 2018: Neural Computation