Neural Computation

https://www.readbyqxmd.com/read/28333592/dc-algorithm-for-extended-robust-support-vector-machine
#1
Shuhei Fujiwara, Akiko Takeda, Takafumi Kanamori
Nonconvex variants of support vector machines (SVMs) have been developed for various purposes. For example, robust SVMs attain robustness to outliers by using a nonconvex loss function, while extended ν-SVM (Eν-SVM) extends the range of the hyperparameter by introducing a nonconvex constraint. Here, we consider an extended robust support vector machine (ER-SVM), a robust variant of Eν-SVM. ER-SVM combines two types of nonconvexity from robust SVMs and Eν-SVM...
March 23, 2017: Neural Computation
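One common choice for the nonconvex loss that gives robust SVMs their outlier robustness is a truncated ("ramp") version of the hinge loss. A minimal sketch of that idea, assuming the usual ramp form with truncation level s; it only illustrates why a capped loss limits the influence of outliers and is not the ER-SVM formulation or the DC algorithm from the paper.

```python
import numpy as np

def hinge(margin):
    """Convex hinge loss used by the standard SVM."""
    return np.maximum(0.0, 1.0 - margin)

def ramp(margin, s=-1.0):
    """Truncated ("ramp") hinge loss: the penalty is capped at 1 - s, so a
    grossly misclassified outlier cannot dominate the objective."""
    return np.minimum(hinge(margin), 1.0 - s)

margins = np.array([2.0, 0.5, -1.0, -10.0])   # y * f(x) for four training points
print("hinge:", hinge(margins))   # the outlier at -10 contributes 11.0
print("ramp :", ramp(margins))    # its contribution is capped at 2.0
```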
https://www.readbyqxmd.com/read/28333591/evidence-accumulation-and-change-rate-inference-in-dynamic-environments
#2
Adrian E Radillo, Alan Veliz-Cuba, Krešimir Josić, Zachary P Kilpatrick
In a constantly changing world, animals must account for environmental volatility when making decisions. To appropriately discount older, irrelevant information, they need to learn the rate at which the environment changes. We develop an ideal observer model capable of inferring the present state of the environment along with its rate of change. Key to this computation is an update of the posterior probability of all possible change point counts. This computation can be challenging, as the number of possibilities grows rapidly with time...
March 23, 2017: Neural Computation
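The update described above maintains a joint posterior over the current environmental state and the number of change points seen so far, since the change-point count carries the information about the rate of change. A minimal sketch under simplifying assumptions (two states, Gaussian observations, and a Laplace-rule hazard estimate (a + 1)/(t + 2) for a change points in t steps); the constants and the pruning-free update are illustrative, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
states = np.array([-1.0, 1.0])            # the two possible environment states
T, sigma, true_h = 200, 1.0, 0.05         # steps, observation noise, true hazard

# Simulate a switching environment and noisy observations of its state.
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = 1 - s[t - 1] if rng.random() < true_h else s[t - 1]
obs = states[s] + sigma * rng.standard_normal(T)

# Joint posterior p[a, i] over the change-point count a and the current state i.
p = np.zeros((T + 1, 2))
p[0, :] = 0.5
a = np.arange(T + 1)
for t in range(T):
    lik = np.exp(-(obs[t] - states) ** 2 / (2 * sigma ** 2))
    post = p * lik                         # condition on the new observation
    post /= post.sum()
    # Hazard implied by each change-point count (Laplace rule, an assumption).
    h_a = (a + 1) / (t + 2)
    # Propagate: either no change (same count) or a change (count + 1, state flips).
    p = (1 - h_a)[:, None] * post
    p[1:] += (h_a[:, None] * post)[:-1, ::-1]

h_hat = np.sum(post.sum(axis=1) * (a + 1) / (T + 1))   # posterior mean hazard
print("inferred hazard:", round(h_hat, 3), " true hazard:", true_h)
print("P(state = +1):", round(post[:, 1].sum(), 3), " true final state:", states[s[-1]])
```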
https://www.readbyqxmd.com/read/28333590/mean-first-passage-memory-lifetimes-by-reducing-complex-synapses-to-simple-synapses
#3
Terry Elliott
Memory models that store new memories by forgetting old ones have memory lifetimes that are rather short and grow only logarithmically in the number of synapses. Attempts to overcome these deficits include "complex" models of synaptic plasticity in which synapses possess internal states governing the expression of synaptic plasticity. Integrate-and-express, filter-based models of synaptic plasticity propose that synapses act as low-pass filters, integrating plasticity induction signals before expressing synaptic plasticity...
March 23, 2017: Neural Computation
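The integrate-and-express picture above can be caricatured with a single synapse whose internal low-pass filter accumulates plasticity induction signals and triggers an expressed change in strength only when a threshold is crossed. A toy sketch; the threshold, leak, and induction statistics are illustrative choices, not the parameters analyzed in the letter.

```python
import numpy as np

rng = np.random.default_rng(1)

theta = 5          # filter threshold for expressing plasticity (illustrative)
leak = 0.95        # low-pass leak factor per induction event (illustrative)
filt, weight = 0.0, 0.0
expressed = []

# Stream of potentiating (+1) and depressing (-1) induction signals.
for step, signal in enumerate(rng.choice([+1, -1], size=500, p=[0.55, 0.45])):
    filt = leak * filt + signal          # integrate inductions (low-pass filter)
    if filt >= theta:                    # enough net potentiation accumulated
        weight += 1.0
        filt = 0.0
        expressed.append((step, +1))
    elif filt <= -theta:                 # enough net depression accumulated
        weight -= 1.0
        filt = 0.0
        expressed.append((step, -1))

print("expressed plasticity events:", len(expressed), " final weight:", weight)
```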
https://www.readbyqxmd.com/read/28333589/a-unified-theory-of-neuro-mri-data-shows-scale-free-nature-of-connectivity-modes
#4
Vitaly L Galinsky, Lawrence R Frank
A primary goal of many neuroimaging studies that use magnetic resonance imaging (MRI) is to deduce the structure-function relationships in the human brain using data from the three major neuro-MRI modalities: high-resolution anatomical, diffusion tensor imaging, and functional MRI. To date, the general procedure for analyzing these data is to combine the results derived independently from each of these modalities. In this article, we develop a new theoretical and computational approach for combining these different MRI modalities into a powerful and versatile framework that combines our recently developed methods for morphological shape analysis and segmentation, simultaneous local diffusion estimation and global tractography, and nonlinear and nongaussian spatial-temporal activation pattern classification and ranking, as well as our fast and accurate approach for nonlinear registration between modalities...
March 23, 2017: Neural Computation
https://www.readbyqxmd.com/read/28333588/multiassociative-memory-recurrent-synapses-increase-storage-capacity
#5
Marcelo Matheus Gauy, Florian Meier, Angelika Steger
The connection density of nearby neurons in the cortex has been observed to be around 0.1, whereas the longer-range connections are present with much sparser density (Kalisman, Silberberg, & Markram, 2005). We propose a memory association model that qualitatively explains these empirical observations. The model we consider is a multiassociative, sparse, Willshaw-like model consisting of binary threshold neurons and binary synapses. It uses recurrent synapses for iterative retrieval of stored memories. We quantify the usefulness of recurrent synapses by simulating the model for small network sizes and by doing a precise mathematical analysis for large network sizes...
March 23, 2017: Neural Computation
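A minimal sketch of a Willshaw-style binary associative memory with iterative retrieval through recurrent synapses, in the spirit of the model described above. It assumes an autoassociative variant with clipped Hebbian (binary) synapses and k-winners-take-all thresholding; the multiassociative setting and the capacity analysis from the letter are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, n_patterns = 1000, 25, 40                 # neurons, active units/pattern, memories

# Store sparse binary patterns in a binary (clipped Hebbian) weight matrix.
patterns = np.zeros((n_patterns, n), dtype=int)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1
W = (patterns.T @ patterns > 0).astype(int)     # binary synapses
np.fill_diagonal(W, 0)

def retrieve(cue, steps=5):
    """Iterative retrieval: keep the k units with the largest recurrent input."""
    x = cue.copy()
    for _ in range(steps):
        drive = W @ x
        x = np.zeros(n, dtype=int)
        x[np.argsort(drive)[-k:]] = 1           # k-winners-take-all threshold
    return x

# Cue with half of the active units of a stored pattern missing.
target = patterns[0]
cue = target.copy()
cue[np.flatnonzero(target)[: k // 2]] = 0
recalled = retrieve(cue)
print("overlap with stored pattern:", (recalled & target).sum(), "of", k)
```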
https://www.readbyqxmd.com/read/28333587/variational-latent-gaussian-process-for-recovering-single-trial-dynamics-from-population-spike-trains
#6
Yuan Zhao, Il Memming Park
When governed by underlying low-dimensional dynamics, the interdependence of simultaneously recorded populations of neurons can be explained by a small number of shared factors, or a low-dimensional trajectory. Recovering these latent trajectories, particularly from single-trial population recordings, may help us understand the dynamics that drive neural computation. However, due to the biophysical constraints and noise in the spike trains, inferring trajectories from data is a challenging statistical problem in general...
March 23, 2017: Neural Computation
https://www.readbyqxmd.com/read/28333586/modulation-of-context-dependent-spatiotemporal-patterns-within-packets-of-spiking-activity
#7
Miho Itoh, Timothée Leleu
Recent experiments have shown that stereotypical spatiotemporal patterns occur during brief packets of spiking activity in the cortex, and it has been suggested that top-down inputs can modulate these patterns according to the context. We propose a simple model that may explain important features of these experimental observations and is analytically tractable. The key mechanism underlying this model is that context-dependent top-down inputs can modulate the effective connection strengths between neurons because of short-term synaptic depression...
March 23, 2017: Neural Computation
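The mechanism highlighted above, context-dependent top-down input changing effective connection strengths through short-term synaptic depression, can be caricatured with the steady state of a Tsodyks-Markram-style depression variable. A toy sketch; the utilization, recovery time constant, and rates are illustrative, not taken from the model in the letter.

```python
import numpy as np

def effective_weight(w, rate, U=0.5, tau_rec=0.8):
    """Steady-state effective synaptic strength under short-term depression
    (Tsodyks-Markram-style resource model; constants are illustrative)."""
    x_ss = 1.0 / (1.0 + U * rate * tau_rec)   # steady-state available resources
    return w * U * x_ss

w = 1.0
base_rate = 5.0
for context, top_down_rate in [("context A (weak top-down)", 2.0),
                               ("context B (strong top-down)", 20.0)]:
    # Top-down input adds to the presynaptic firing rate, which depresses the
    # synapse and so reshapes the effective connectivity within the packet.
    print(context, "->", round(effective_weight(w, base_rate + top_down_rate), 3))
```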
https://www.readbyqxmd.com/read/28333585/fast-estimation-of-approximate-matrix-ranks-using-spectral-densities
#8
Shashanka Ubaru, Yousef Saad, Abd-Krim Seghouane
Many machine learning and data-related applications require the knowledge of approximate ranks of large data matrices at hand. This letter presents two computationally inexpensive techniques to estimate the approximate ranks of such matrices. These techniques exploit approximate spectral densities, popular in physics, which are probability density distributions that measure the likelihood of finding eigenvalues of the matrix at a given point on the real line. Integrating the spectral density over an interval gives the eigenvalue count of the matrix in that interval...
March 23, 2017: Neural Computation
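The quoted statement, that integrating the spectral density over an interval yields the eigenvalue count in that interval, is easy to check directly on a small symmetric matrix. A minimal sketch that forms a Gaussian-smoothed density from the exact eigenvalues; the point of the letter is to estimate such densities cheaply without an eigendecomposition (for example with stochastic trace estimators), which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(3)

# Symmetric test matrix with 20 "signal" eigenvalues well above a noise floor,
# so its approximate rank is 20 (sizes and scales are illustrative).
n, r = 400, 20
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
S = U @ np.diag(rng.uniform(5.0, 10.0, r)) @ U.T
noise = 0.01 * rng.standard_normal((n, n))
S += (noise + noise.T) / 2

eigs = np.linalg.eigvalsh(S)
sigma = 0.1                                    # width of the Gaussian blur

def spectral_density(t):
    """Gaussian-smoothed density phi(t) = (1/n) * sum_i delta(t - lambda_i)."""
    return np.mean(np.exp(-(t - eigs) ** 2 / (2 * sigma ** 2))
                   / (sigma * np.sqrt(2 * np.pi)))

# Approximate rank = eigenvalue count above a threshold
#                  = n * integral of the spectral density over [threshold, max].
threshold = 1.0
grid = np.linspace(threshold, eigs.max() + 1.0, 2000)
count = n * np.sum([spectral_density(t) for t in grid]) * (grid[1] - grid[0])
print("estimated approximate rank:", round(count),
      " exact count:", int((eigs > threshold).sum()))
```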
https://www.readbyqxmd.com/read/28333584/unsupervised-2d-dimensionality-reduction-with-adaptive-structure-learning
#9
Xiaowei Zhao, Feiping Nie, Sen Wang, Jun Guo, Pengfei Xu, Xiaojiang Chen
In recent years, unsupervised two-dimensional (2D) dimensionality reduction methods for unlabeled large-scale data have made considerable progress. However, the performance of these methods degrades when the similarity matrix is learned at the beginning of the dimensionality reduction process. A similarity matrix is used to reveal the underlying geometric structure of the data in unsupervised dimensionality reduction methods. Because of noisy data, it is difficult to learn the optimal similarity matrix. In this letter, we propose a new dimensionality reduction model for 2D image matrices: unsupervised 2D dimensionality reduction with adaptive structure learning (DRASL)...
March 23, 2017: Neural Computation
https://www.readbyqxmd.com/read/28333583/an-approximation-of-the-error-backpropagation-algorithm-in-a-predictive-coding-network-with-local-hebbian-synaptic-plasticity
#10
James C R Whittington, Rafal Bogacz
To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of presynaptic and postsynaptic neurons...
March 23, 2017: Neural Computation
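The non-locality that the excerpt above contrasts with biological plasticity is visible in even a two-layer network: the backpropagation update for a first-layer weight contains the second-layer weights. A minimal numpy illustration of that point only; the predictive-coding approximation proposed in the paper is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(3)          # input
t = np.array([1.0])                 # target
W1 = rng.standard_normal((4, 3))    # first-layer weights
W2 = rng.standard_normal((1, 4))    # second-layer weights

h = np.tanh(W1 @ x)                 # hidden activity
y = W2 @ h                          # output
e = y - t                           # output error

# Backprop gradient for W1: it involves W2, so the update of a first-layer
# "synapse" is not a function of purely local pre/postsynaptic activity.
delta_hidden = (W2.T @ e) * (1.0 - h ** 2)
grad_W1 = np.outer(delta_hidden, x)

# A purely Hebbian update at the same synapses would use only x and h.
hebbian_W1 = np.outer(h, x)

print("backprop gradient (depends on W2):", grad_W1[0, :])
print("local Hebbian term:", hebbian_W1[0, :])
```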
https://www.readbyqxmd.com/read/28333582/erratum-to-a-note-on-divergences
#11
Xiao Liang
No abstract text is available yet for this article.
March 23, 2017: Neural Computation
https://www.readbyqxmd.com/read/28181880/parameter-identifiability-in-statistical-machine-learning-a-review
#12
Zhi-Yong Ran, Bao-Gang Hu
This review examines the relevance of parameter identifiability for statistical models used in machine learning. In addition to defining main concepts, we address several issues of identifiability closely related to machine learning, showing the advantages and disadvantages of state-of-the-art research and demonstrating recent progress. First, we review criteria for determining the parameter structure of models from the literature. This has three related issues: parameter identifiability, parameter redundancy, and reparameterization...
February 9, 2017: Neural Computation
https://www.readbyqxmd.com/read/28181878/using-inspiration-from-synaptic-plasticity-rules-to-optimize-traffic-flow-in-distributed-engineered-networks
#13
Jonathan Y Suen, Saket Navlakha
Controlling the flow and routing of data is a fundamental problem in many distributed networks, including transportation systems, integrated circuits, and the Internet. In the brain, synaptic plasticity rules have been discovered that regulate network activity in response to environmental inputs, which enable circuits to be stable yet flexible. Here, we develop a new neuro-inspired model for network flow control that depends only on modifying edge weights in an activity-dependent manner. We show how two fundamental plasticity rules, long-term potentiation and long-term depression, can be cast as a distributed gradient descent algorithm for regulating traffic flow in engineered networks...
February 9, 2017: Neural Computation
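One toy reading of the idea above, casting activity-dependent edge-weight updates as distributed gradient descent on a local congestion-plus-demand cost, is sketched below. The cost, the single shared link, and the update rule are illustrative assumptions, not the algorithm developed in the paper.

```python
import numpy as np

# Two flows share one link of capacity C; flow i has demand d[i] and sending
# rate w[i] (its "edge weight"). Each rate follows gradient descent on a local
# cost: a soft congestion penalty (decrease the rate when the link is
# overloaded, LTD-like) plus an unmet-demand term (increase it otherwise, LTP-like).
C, alpha, eta = 10.0, 5.0, 0.05
d = np.array([8.0, 6.0])
w = np.array([1.0, 1.0])

def local_grad(w, i):
    overload = max(w.sum() - C, 0.0)     # congestion signal seen on the link
    unmet = d[i] - w[i]                  # this flow's own unmet demand
    return alpha * overload - unmet      # d(cost)/d(w_i) for the toy cost

for _ in range(500):
    w = np.maximum(w - eta * np.array([local_grad(w, i) for i in (0, 1)]), 0.0)

# The penalty is soft, so the rates settle slightly above capacity.
print("final rates:", np.round(w, 2), " total:", round(w.sum(), 2), " capacity:", C)
```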
https://www.readbyqxmd.com/read/28181879/parameter-estimation-of-nonlinear-systems-by-dynamic-cuckoo-search
#14
Qixiang Liao, Shudao Zhou, Hanqing Shi, Weilai Shi
To address the problems of the traditional and improved cuckoo search (CS) algorithms, we propose a dynamic adaptive cuckoo search with crossover operator (DACS-CO) algorithm. Normally, the parameters of the CS algorithm are kept constant or adapted by an empirical equation, which may reduce the efficiency of the algorithm. To solve this problem, a feedback control scheme for the algorithm parameters is adopted in cuckoo search; Rechenberg's 1/5 criterion, combined with a learning strategy, is used to evaluate the evolution process...
April 2017: Neural Computation
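Rechenberg's 1/5 criterion mentioned above is a classic feedback rule: measure the fraction of recent mutations that improved the objective, then enlarge the step size if that fraction exceeds 1/5 and shrink it otherwise. A minimal sketch of the rule on a simple (1+1) random search over a sphere function; it illustrates the criterion itself, not the DACS-CO algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

def sphere(x):
    return float(np.sum(x ** 2))

x = rng.uniform(-5, 5, size=10)      # current solution
step = 1.0                           # mutation step size adapted by the 1/5 rule
best = sphere(x)
window, successes = 50, 0

for it in range(1, 5001):
    trial = x + step * rng.standard_normal(x.size)
    if sphere(trial) < best:         # successful mutation
        x, best = trial, sphere(trial)
        successes += 1
    if it % window == 0:
        # Rechenberg's 1/5 criterion: if more than 1/5 of recent mutations
        # succeeded, enlarge the step size; if fewer, shrink it.
        rate = successes / window
        step *= 1.2 if rate > 0.2 else 0.8
        successes = 0

print("best value:", best, " final step size:", step)
```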
https://www.readbyqxmd.com/read/28181877/an-in-silico-biomarker-based-method-for-the-evaluation-of-virtual-neuropsychiatric-drug-effects
#15
Peter J Siekmeier
The recent explosion in neuroscience research has markedly increased our understanding of the neurobiological correlates of many psychiatric illnesses, but this has unfortunately not translated into more effective pharmacologic treatments for these conditions. At the same time, researchers have increasingly sought out biological markers, or biomarkers, as a way to categorize psychiatric illness, as these are felt to be closer to underlying genetic and neurobiological vulnerabilities. While biomarker-based drug discovery approaches have tended to employ in vivo (e...
April 2017: Neural Computation
https://www.readbyqxmd.com/read/28181876/information-maximization-explains-the-sparseness-of-presynaptic-neural-response
#16
Minjoon Kouh
In a sensory neural network, where a population of presynaptic neurons sends information to a downstream neuron, maximizing information transmission depends on utilizing the full operating range of the output of the postsynaptic neuron. Because the convergence of presynaptic inputs naturally biases the output toward higher values, a sparse input distribution counters this bias and optimizes information transmission.
April 2017: Neural Computation
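The argument above can be illustrated numerically: when many presynaptic inputs converge on a saturating output neuron, a dense input distribution drives the output into saturation and wastes its operating range, while a sparse one keeps the output spread out. A toy sketch; the tanh neuron, its gain, and the input distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n_inputs, n_samples = 100, 20_000

def output_entropy(inputs, gain=10.0, n_bins=32):
    """Entropy (bits) of a saturating postsynaptic response to the summed input.
    The gain is chosen so that dense input saturates the neuron."""
    response = np.tanh(inputs.sum(axis=1) / gain)
    counts, _ = np.histogram(response, bins=n_bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

dense = rng.uniform(0.0, 1.0, (n_samples, n_inputs))            # all inputs active
sparse = dense * (rng.random((n_samples, n_inputs)) < 0.1)      # ~90% of inputs silent

print("output entropy, dense inputs :", round(output_entropy(dense), 2))
print("output entropy, sparse inputs:", round(output_entropy(sparse), 2))
```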
https://www.readbyqxmd.com/read/28181875/on-the-dynamical-interplay-of-positive-and-negative-affects
#17
Jonathan Touboul, Alberto Romagnoni, Robert Schwartz
Emotional disorders and psychological flourishing are the result of complex interactions between positive and negative affects that depend on external events and the subject's internal representations. Based on psychological data, we mathematically model the dynamical balance between positive and negative affects as a function of the response to external positive and negative events. This modeling allows the investigation of the relative impact of two leading forms of therapy on affect balance. The model uses a delay differential equation to analytically study the bifurcation diagram of the system...
April 2017: Neural Computation
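The delay differential equation mentioned above is the main technical ingredient, so a generic example of how a delayed term enters the dynamics may help. A sketch of Euler integration of the scalar delayed-feedback equation dx/dt = -a x(t - tau), whose fixed point loses stability through oscillations once a*tau exceeds pi/2 (the kind of bifurcation such models are analyzed for); this is not the authors' affect-balance model.

```python
import numpy as np

# dx/dt = -a * x(t - tau): delayed negative feedback. The zero fixed point is
# stable for a * tau < pi/2 and oscillates with growing amplitude beyond that.
a, tau, dt, T = 1.0, 2.0, 0.01, 60.0
n, lag = int(T / dt), int(tau / dt)

x = np.ones(n)                       # constant history x(t) = 1 on [-tau, 0]
for t in range(lag, n - 1):
    x[t + 1] = x[t] + dt * (-a * x[t - lag])

tail = x[-int(20 / dt):]
print("oscillation range over the last 20 time units:",
      round(tail.min(), 2), "to", round(tail.max(), 2))
```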
https://www.readbyqxmd.com/read/28181874/evolving-network-model-that-almost-regenerates-epileptic-data
#18
G Manjunath
In many realistic networks, the edges representing the interactions between nodes are time varying. Evidence is growing that the complex network that models the dynamics of the human brain has time-varying interconnections, that is, the network is evolving. Based on this evidence, we construct a patient- and data-specific evolving network model (comprising discrete-time dynamical systems) in which epileptic seizures or their terminations in the brain are also determined by the nature of the time-varying interconnections between the nodes...
April 2017: Neural Computation
https://www.readbyqxmd.com/read/28095203/avoiding-optimal-mean-%C3%A2-2-1-norm-maximization-based-robust-pca-for-reconstruction
#19
Minnan Luo, Feiping Nie, Xiaojun Chang, Yi Yang, Alexander G Hauptmann, Qinghua Zheng
Robust principal component analysis (PCA) is one of the most important dimension-reduction techniques for handling high-dimensional data with outliers. However, most existing robust PCA methods presuppose that the mean of the data is zero and incorrectly use the average of the data as the optimal mean of robust PCA. In fact, this assumption holds only for the squared ℓ2-norm-based traditional PCA. In this letter, we equivalently reformulate the objective of conventional PCA and learn the optimal projection directions by maximizing the sum of projected differences between each pair of instances based on the ℓ2,1-norm...
April 2017: Neural Computation
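The reformulated objective above sums, over all instance pairs, the (non-squared) ℓ2 norm of the projected differences, which removes the need to estimate an optimal mean. A small sketch that merely evaluates such a pairwise objective for an orthonormal projection and compares it with the classical PCA directions; the optimization algorithm from the letter is not implemented, and this form of the objective is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((100, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.2])  # toy data
k = 2

def pairwise_objective(W, X):
    """Sum over all pairs (i, j) of || W^T (x_i - x_j) ||_2 (an L2,1-type sum),
    which needs no estimate of the data mean."""
    proj = X @ W                                   # n x k projected data
    diffs = proj[:, None, :] - proj[None, :, :]    # pairwise projected differences
    return np.sqrt((diffs ** 2).sum(-1)).sum() / 2

# Compare a random orthonormal projection with the top-k PCA directions.
Q, _ = np.linalg.qr(rng.standard_normal((X.shape[1], k)))
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W_pca = Vt[:k].T

print("pairwise objective, random projection:", round(pairwise_objective(Q, X), 1))
print("pairwise objective, PCA directions  :", round(pairwise_objective(W_pca, X), 1))
```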
https://www.readbyqxmd.com/read/28095199/multiway-array-decomposition-of-eeg-spectrum-implications-of-its-stability-for-the-exploration-of-large-scale-brain-networks
#20
Radek Mareček, Martin Lamoš, René Labounek, Marek Bartoň, Tomáš Slavíček, Michal Mikl, Ivan Rektor, Milan Brázdil
Multiway array decomposition methods have been shown to be promising statistical tools for identifying neural activity in the EEG spectrum. They blindly decompose the EEG spectrum into spatial-temporal-spectral patterns by taking into account inherent relationships among signals acquired at different frequencies and sensors. Our study evaluates the stability of spatial-temporal-spectral patterns derived by one particular method, parallel factor analysis (PARAFAC). We focused on patterns' stability over time and in population and divided the complete data set containing data from 50 healthy subjects into several subsets...
April 2017: Neural Computation
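A minimal sketch of a PARAFAC (CP) decomposition of a synthetic space x time x frequency array, assuming the tensorly library is available; the stability analysis across subject subsets described in the study is not reproduced, and the tensor sizes and rank are illustrative.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(8)

# Synthetic "EEG spectrum" tensor: channels x time windows x frequency bins,
# built from 3 spatial-temporal-spectral patterns plus a little noise.
n_chan, n_time, n_freq, R = 32, 120, 40, 3
A = rng.random((n_chan, R))          # spatial signatures
B = rng.random((n_time, R))          # temporal signatures
C = rng.random((n_freq, R))          # spectral signatures
X = np.einsum('ir,jr,kr->ijk', A, B, C) + 0.01 * rng.random((n_chan, n_time, n_freq))

# Blind trilinear decomposition into R spatial-temporal-spectral components.
weights, factors = parafac(tl.tensor(X), rank=R, normalize_factors=True)
spatial, temporal, spectral = factors
print([f.shape for f in factors])    # [(32, 3), (120, 3), (40, 3)]
```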