Neural Computation

https://www.readbyqxmd.com/read/28181880/parameter-identifiability-in-statistical-machine-learning-a-review
#1
Zhi-Yong Ran, Bao-Gang Hu
This review examines the relevance of parameter identifiability for statistical models used in machine learning. In addition to defining main concepts, we address several issues of identifiability closely related to machine learning, showing the advantages and disadvantages of state-of-the-art research and demonstrating recent progress. First, we review criteria for determining the parameter structure of models from the literature. This has three related issues: parameter identifiability, parameter redundancy, and reparameterization...
February 9, 2017: Neural Computation
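A minimal illustration of the identifiability issue discussed above (not taken from the review): when two parameters enter a model only through their product, distinct parameter settings produce identical predictions, and only a reparameterization restores identifiability. The model and values below are hypothetical.

```python
# Minimal illustration (not from the review): a model in which two parameters
# enter only through their product is not identifiable, because distinct
# parameter pairs yield exactly the same predictions for every input.
import numpy as np

def predict(x, a, b):
    # Hypothetical model y = (a * b) * x; only the product a*b is identifiable.
    return a * b * x

x = np.linspace(0.0, 1.0, 5)
y1 = predict(x, a=2.0, b=3.0)   # product = 6
y2 = predict(x, a=1.5, b=4.0)   # product = 6, different (a, b)
print(np.allclose(y1, y2))      # True: the data cannot distinguish the two settings

# A reparameterization c = a * b removes the redundancy and restores identifiability.
def predict_reparam(x, c):
    return c * x
```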
https://www.readbyqxmd.com/read/28181879/parameter-estimation-of-nonlinear-systems-by-dynamic-cuckoo-search
#2
Qixiang Liao, Shudao Zhou, Hanqing Shi, Weilai Shi
To address the shortcomings of the traditional and improved cuckoo search (CS) algorithms, we propose a dynamic adaptive cuckoo search with crossover operator (DACS-CO) algorithm. Normally, the parameters of the CS algorithm are kept constant or adapted by an empirical equation, which may reduce the efficiency of the algorithm. To solve this problem, a feedback control scheme for the algorithm parameters is adopted in cuckoo search; Rechenberg's 1/5 criterion, combined with a learning strategy, is used to evaluate the evolution process...
February 9, 2017: Neural Computation
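The abstract names Rechenberg's 1/5 criterion as the feedback mechanism; the sketch below shows only that rule adapting a mutation step size on a toy objective. The cuckoo-search machinery (Lévy flights, nest abandonment, crossover) is omitted, and the constants and objective are illustrative.

```python
# Sketch of Rechenberg's 1/5 success rule adapting a step size on a toy
# objective. The cuckoo-search specifics (Levy flights, nest abandonment,
# crossover) from the paper are omitted; constants and the objective are
# illustrative.
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2)          # toy objective to minimize

x, sigma = rng.normal(size=5), 1.0    # current solution and step size
successes, window = 0, 20

for t in range(1, 501):
    candidate = x + sigma * rng.normal(size=x.size)
    if f(candidate) < f(x):           # success: the mutation improved the solution
        x, successes = candidate, successes + 1
    if t % window == 0:               # every `window` trials, apply the 1/5 rule
        rate = successes / window
        sigma *= 1.22 if rate > 0.2 else 0.82   # expand if succeeding too often, shrink otherwise
        successes = 0

print(f(x), sigma)
```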
https://www.readbyqxmd.com/read/28181878/using-inspiration-from-synaptic-plasticity-rules-to-optimize-traffic-flow-in-distributed-engineered-networks
#3
Jonathan Y Suen, Saket Navlakha
Controlling the flow and routing of data is a fundamental problem in many distributed networks, including transportation systems, integrated circuits, and the Internet. In the brain, synaptic plasticity rules have been discovered that regulate network activity in response to environmental inputs, which enable circuits to be stable yet flexible. Here, we develop a new neuro-inspired model for network flow control that depends only on modifying edge weights in an activity-dependent manner. We show how two fundamental plasticity rules, long-term potentiation and long-term depression, can be cast as a distributed gradient descent algorithm for regulating traffic flow in engineered networks...
February 9, 2017: Neural Computation
https://www.readbyqxmd.com/read/28181877/an-in-silico-biomarker-based-method-for-the-evaluation-of-virtual-neuropsychiatric-drug-effects
#4
Peter J Siekmeier
The recent explosion in neuroscience research has markedly increased our understanding of the neurobiological correlates of many psychiatric illnesses, but this has unfortunately not translated into more effective pharmacologic treatments for these conditions. At the same time, researchers have increasingly sought out biological markers, or biomarkers, as a way to categorize psychiatric illness, as these are felt to be closer to underlying genetic and neurobiological vulnerabilities. While biomarker-based drug discovery approaches have tended to employ in vivo (e...
February 9, 2017: Neural Computation
https://www.readbyqxmd.com/read/28181876/information-maximization-explains-the-sparseness-of-presynaptic-neural-response
#5
Minjoon Kouh
In a sensory neural network, where a population of presynaptic neurons sends information to a downstream neuron, maximizing information transmission depends on utilizing the full operating range of the output of the postsynaptic neuron. Because the convergence of presynaptic inputs naturally biases the response toward higher outputs, a sparse input distribution would counter such bias and optimize information transmission.
February 9, 2017: Neural Computation
https://www.readbyqxmd.com/read/28181875/on-the-dynamical-interplay-of-positive-and-negative-affects
#6
Jonathan Touboul, Alberto Romagnoni, Robert Schwartz
Emotional disorders and psychological flourishing are the result of complex interactions between positive and negative affects that depend on external events and the subject's internal representations. Based on psychological data, we mathematically model the dynamical balance between positive and negative affects as a function of the response to external positive and negative events. This modeling allows the investigation of the relative impact of two leading forms of therapy on affect balance. The model is a delay differential equation, which we use to analytically study the bifurcation diagram of the system...
February 9, 2017: Neural Computation
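The abstract states that the model is a delay differential equation; the sketch below only shows the generic numerical pattern for such models (Euler stepping with a history buffer holding the delayed state). The interaction terms, variables, and constants are hypothetical placeholders, not the authors' equations.

```python
# Generic Euler integration of a delay differential equation with a history
# buffer. The specific interaction terms below are hypothetical placeholders,
# not the equations of the paper.
import numpy as np

dt, tau, T = 0.01, 1.0, 40.0          # step, delay, horizon
lag = int(tau / dt)
n = int(T / dt)

p = np.full(n + 1, 0.5)               # positive affect (hypothetical variable)
q = np.full(n + 1, 0.5)               # negative affect (hypothetical variable)

for t in range(lag, n):
    p_delayed, q_delayed = p[t - lag], q[t - lag]
    # Hypothetical dynamics: each affect decays and is inhibited by the
    # other's delayed value.
    dp = -p[t] + 1.0 / (1.0 + q_delayed)
    dq = -q[t] + 1.0 / (1.0 + p_delayed)
    p[t + 1] = p[t] + dt * dp
    q[t + 1] = q[t] + dt * dq

print(p[-1], q[-1])
```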
https://www.readbyqxmd.com/read/28181874/evolving-network-model-that-almost-regenerates-epileptic-data
#7
G Manjunath
In many realistic networks, the edges representing the interactions between nodes are time varying. Evidence is growing that the complex network that models the dynamics of the human brain has time-varying interconnections, that is, the network is evolving. Based on this evidence, we construct a patient- and data-specific evolving network model (comprising discrete-time dynamical systems) in which epileptic seizures or their terminations in the brain are also determined by the nature of the time-varying interconnections between the nodes...
February 9, 2017: Neural Computation
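A minimal sketch of the generic idea of an evolving network of discrete-time dynamical systems, in which the coupling matrix changes over time. The tanh node dynamics, the two coupling regimes, and the switching schedule are illustrative assumptions, not the patient- and data-specific model of the paper.

```python
# Minimal sketch of an "evolving" network: discrete-time node dynamics whose
# coupling matrix A_t changes over time. The tanh units and the switching
# schedule are illustrative, not the patient-specific model from the paper.
import numpy as np

rng = np.random.default_rng(1)
n_nodes, T = 10, 200

A_weak = 0.1 * rng.standard_normal((n_nodes, n_nodes))
A_strong = 1.5 * rng.standard_normal((n_nodes, n_nodes))

x = rng.standard_normal(n_nodes)
trajectory = []
for t in range(T):
    A_t = A_strong if 80 <= t < 120 else A_weak   # time-varying interconnections
    x = np.tanh(A_t @ x)                          # discrete-time node update
    trajectory.append(x.copy())

trajectory = np.array(trajectory)
# Activity collapses under weak coupling and is sustained under strong coupling.
print(trajectory.std(axis=1)[60:65], trajectory.std(axis=1)[100:105])
```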
https://www.readbyqxmd.com/read/28095203/avoiding-optimal-mean-%C3%A2-2-1-norm-maximization-based-robust-pca-for-reconstruction
#8
Minnan Luo, Feiping Nie, Xiaojun Chang, Yi Yang, Alexander G Hauptmann, Qinghua Zhang
Robust principal component analysis (PCA) is one of the most important dimension-reduction techniques for handling high-dimensional data with outliers. However, most existing robust PCA methods presuppose that the mean of the data is zero and incorrectly use the average of the data as the optimal mean of robust PCA. In fact, this assumption holds only for the squared ℓ2-norm-based traditional PCA. In this letter, we equivalently reformulate the objective of conventional PCA and learn the optimal projection directions by maximizing the sum of projected differences between each pair of instances based on the ℓ2,1-norm...
January 17, 2017: Neural Computation
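A quick numerical check, in the squared ℓ2-norm case the abstract mentions, that the sum of squared projected differences over all pairs of instances equals (up to a constant factor) the usual mean-centered projection objective, which is why the pairwise reformulation needs no explicit mean. The paper's ℓ2,1-norm objective is not reproduced here.

```python
# Numerical check of the identity behind the pairwise reformulation in the
# squared-L2 case: the sum of squared projected differences over all ordered
# pairs equals 2n times the mean-centered projected variance term, so no
# explicit mean estimate is needed. (The paper's L2,1-norm objective is not
# reproduced here.)
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))          # 50 instances, 8 features
w = rng.standard_normal(8)
w /= np.linalg.norm(w)                    # unit projection direction

a = X @ w                                 # projected instances
pairwise = np.sum((a[:, None] - a[None, :]) ** 2)      # over all ordered pairs
centered = 2 * len(a) * np.sum((a - a.mean()) ** 2)    # classic PCA-style term

print(np.allclose(pairwise, centered))    # True
```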
https://www.readbyqxmd.com/read/28095199/multiway-array-decomposition-of-eeg-spectrum-implications-of-its-stability-for-the-exploration-of-large-scale-brain-networks
#9
Radek Mareček, Martin Lamoš, René Labounek, Marek Bartoň, Tomáš Slavíček, Michal Mikl, Ivan Rektor, Milan Brázdil
Multiway array decomposition methods have been shown to be promising statistical tools for identifying neural activity in the EEG spectrum. They blindly decompose the EEG spectrum into spatial-temporal-spectral patterns by taking into account inherent relationships among signals acquired at different frequencies and sensors. Our study evaluates the stability of spatial-temporal-spectral patterns derived by one particular method, parallel factor analysis (PARAFAC). We focused on the patterns' stability over time and across the population and divided the complete data set, containing data from 50 healthy subjects, into several subsets...
January 17, 2017: Neural Computation
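A compact alternating-least-squares sketch of the PARAFAC (CP) model on a synthetic three-way array, to show the kind of trilinear factorization the abstract refers to (e.g., space x time x frequency). The EEG preprocessing and the stability analysis of the study are not reproduced, and the rank and dimensions are arbitrary.

```python
import numpy as np

def parafac_als(X, rank, n_iter=200, seed=0):
    """Plain alternating least squares for a rank-`rank` CP/PARAFAC model of a
    3-way array X (e.g., space x time x frequency). Illustrative only."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each factor solves a linear least-squares problem given the other two.
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Synthetic low-rank tensor plus noise, then recover a rank-3 model.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 3)) for d in (20, 30, 25))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0) + 0.01 * rng.standard_normal((20, 30, 25))
A, B, C = parafac_als(X, rank=3)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))   # small relative error
```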
https://www.readbyqxmd.com/read/28095194/unifying-adversarial-training-algorithms-with-data-gradient-regularization
#10
Alexander G Ororbia Ii, Daniel Kifer, C Lee Giles
Many previous proposals for adversarial training of deep neural nets have included directly modifying the gradient, training on a mix of original and adversarial examples, using contractive penalties, and approximately optimizing constrained adversarial objective functions. In this article, we show that these proposals are actually all instances of optimizing a general, regularized objective we call DataGrad. Our proposed DataGrad framework, which can be viewed as a deep extension of the layerwise contractive autoencoder penalty, cleanly simplifies prior work and easily allows extensions such as adversarial training with multitask cues...
January 17, 2017: Neural Computation
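A minimal sketch of the general idea of penalizing the gradient of the loss with respect to the inputs (double backpropagation), the family into which the abstract places these methods. The model, data, and penalty weight are illustrative, and this is not the exact DataGrad objective from the article.

```python
# Minimal sketch of penalizing the loss gradient with respect to the inputs
# ("double backpropagation"). The exact DataGrad objective from the paper is
# not reproduced; model, data, and the penalty weight are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
lam = 0.01                                        # penalty weight (illustrative)

x = torch.randn(64, 10, requires_grad=True)       # toy batch
y = torch.randint(0, 2, (64,))

loss = criterion(model(x), y)
grad_x, = torch.autograd.grad(loss, x, create_graph=True)   # dLoss/dInput
total = loss + lam * grad_x.pow(2).sum(dim=1).mean()        # gradient-norm penalty

optimizer.zero_grad()
total.backward()          # second backward pass differentiates the penalty too
optimizer.step()
print(float(loss), float(total))
```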
https://www.readbyqxmd.com/read/28095193/semisupervised-multilabel-multi-instance-learning-for-structured-data
#11
Hossein Soleimani, David J Miller
Many classification tasks require both labeling objects and determining label associations for parts of each object. Example applications include labeling segments of images or determining relevant parts of a text document when the training labels are available only at the image or document level. This task is usually referred to as multi-instance (MI) learning, where the learner typically receives a collection of labeled (or sometimes unlabeled) bags, each containing several segments (instances). We propose a semisupervised MI learning method for multilabel classification...
January 17, 2017: Neural Computation
https://www.readbyqxmd.com/read/28095191/maximum-pseudolikelihood-estimation-for-model-based-clustering-of-time-series-data
#12
Hien D Nguyen, Geoffrey J McLachlan, Pierre Orban, Pierre Bellec, Andrew L Janke
Mixture of autoregressions (MoAR) models provide a model-based approach to the clustering of time series data. The maximum likelihood (ML) estimation of MoAR models requires evaluating products of large numbers of densities of normal random variables. In practical scenarios, these products converge to zero as the length of the time series increases, and thus the ML estimation of MoAR models becomes infeasible without the use of numerical tricks. We propose a maximum pseudolikelihood (MPL) estimation approach as an alternative to the use of numerical tricks...
January 17, 2017: Neural Computation
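A small numerical illustration of the underflow problem the abstract describes: the product of many normal densities reaches zero in double precision, while the corresponding sum of log-densities stays finite. This shows only the standard log-domain workaround, not the authors' maximum pseudolikelihood estimator.

```python
# Illustration of the underflow issue the abstract describes: the product of
# many normal densities reaches 0.0 in double precision, whereas summing
# log-densities stays finite. (This shows the standard log-domain workaround,
# not the authors' maximum pseudolikelihood estimator.)
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)                        # a "time series" of 2000 points

densities = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)   # individual N(0,1) densities
print(np.prod(densities))                            # 0.0 -- underflow
print(np.sum(-0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)))   # finite log-likelihood
```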
https://www.readbyqxmd.com/read/28095202/interpretation-of-the-precision-matrix-and-its-application-in-estimating-sparse-brain-connectivity-during-sleep-spindles-from-human-electrocorticography-recordings
#13
Anup Das, Aaron L Sampson, Claudia Lainscsek, Lyle Muller, Wutu Lin, John C Doyle, Sydney S Cash, Eric Halgren, Terrence J Sejnowski
The correlation method from brain imaging has been used to estimate functional connectivity in the human brain. However, two brain regions might show very high correlation even when they are not directly connected, because both interact strongly with common input from a third region. One previously proposed solution to this problem is to use a sparse regularized inverse covariance matrix, or precision matrix (SRPM), assuming that the connectivity structure is sparse. This method yields partial correlations that measure strong direct interactions between pairs of regions while simultaneously removing the influence of the remaining regions, thus identifying regions that are conditionally independent...
March 2017: Neural Computation
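A minimal illustration of the interpretation the abstract relies on: the precision (inverse covariance) matrix yields partial correlations, so two regions that are correlated only through common input from a third region show a high marginal but near-zero partial correlation. The sparse regularized estimation (SRPM) itself is not shown, and the three-variable example is hypothetical.

```python
# Illustration of the precision-matrix interpretation: partial correlations
# come from the (inverse covariance) precision matrix, so two regions driven
# only by a common third region show high marginal but near-zero partial
# correlation. The sparse regularized estimation from the paper is not shown.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
z = rng.standard_normal(n)                   # "region 3": common input
x = z + 0.5 * rng.standard_normal(n)         # "region 1": driven by region 3
y = z + 0.5 * rng.standard_normal(n)         # "region 2": driven by region 3

data = np.vstack([x, y, z])
cov = np.cov(data)
prec = np.linalg.inv(cov)                    # precision matrix

d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)        # partial correlation formula
np.fill_diagonal(partial_corr, 1.0)

print(np.corrcoef(data)[0, 1])               # marginal corr(x, y): high (~0.8)
print(partial_corr[0, 1])                    # partial corr(x, y): near 0
```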
https://www.readbyqxmd.com/read/28095201/multisensory-bayesian-inference-depends-on-synapse-maturation-during-training-theoretical-analysis-and-neural-modeling-implementation
#14
Mauro Ursino, Cristiano Cuppini, Elisa Magosso
Recent theoretical and experimental studies suggest that in multisensory conditions, the brain performs a near-optimal Bayesian estimate of external events, giving more weight to the more reliable stimuli. However, the neural mechanisms responsible for this behavior, and its progressive maturation in a multisensory environment, are still insufficiently understood. The aim of this letter is to analyze this problem with a neural network model of audiovisual integration, based on probabilistic population coding-the idea that a population of neurons can encode probability functions to perform Bayesian inference...
March 2017: Neural Computation
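The textbook form of reliability-weighted cue combination that the abstract alludes to ("giving more weight to the more reliable stimuli"), assuming independent Gaussian noise on each modality. The numbers are illustrative, and the probabilistic-population-coding network of the letter is not reproduced.

```python
# Textbook reliability-weighted (maximum-likelihood) cue combination, the
# Bayesian behavior the abstract refers to. Assumes independent Gaussian noise
# on each modality; the probabilistic-population-coding network of the letter
# is not reproduced here.
import numpy as np

x_aud, sigma_aud = 10.0, 4.0        # auditory estimate of (say) source location
x_vis, sigma_vis = 2.0, 1.0         # visual estimate: more reliable

w_aud = 1.0 / sigma_aud ** 2        # weights are inverse variances
w_vis = 1.0 / sigma_vis ** 2

x_opt = (w_aud * x_aud + w_vis * x_vis) / (w_aud + w_vis)
sigma_opt = np.sqrt(1.0 / (w_aud + w_vis))

print(x_opt)      # 2.47...: pulled toward the reliable visual cue
print(sigma_opt)  # 0.97...: combined estimate is more precise than either cue
```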
https://www.readbyqxmd.com/read/28095200/stdp-compatible-approximation-of-backpropagation-in-an-energy-based-model
#15
Yoshua Bengio, Thomas Mesnard, Asja Fischer, Saizheng Zhang, Yuhuai Wu
We show that Langevin Markov chain Monte Carlo inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similar to backpropagation. The backpropagated error is with respect to output units that have received an outside driving force pushing them away from the stationary point. Backpropagated error gradients correspond to temporal derivatives with respect to the activation of hidden units...
March 2017: Neural Computation
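A minimal sketch of Langevin dynamics on an energy function, the inference procedure named in the abstract: each step moves downhill on the energy plus Gaussian noise. The quadratic (Gaussian) energy below is an illustrative stand-in, not the latent-variable model analyzed in the article.

```python
# Minimal Langevin dynamics on an energy function: each step takes a small
# gradient step downhill on the energy plus Gaussian noise. The quadratic
# energy below is an illustrative stand-in, not the model from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)                  # positive-definite coupling -> bounded energy
b = rng.standard_normal(n)

def grad_energy(s):
    # Energy E(s) = 0.5 * s'As - b's, so the gradient is As - b.
    return A @ s - b

s = rng.standard_normal(n)               # state vector (visible + latent units)
eta = 0.01
samples = []
for _ in range(20000):
    s = s - eta * grad_energy(s) + np.sqrt(2 * eta) * rng.standard_normal(n)
    samples.append(s.copy())

print(np.mean(samples[2000:], axis=0))   # close to the energy minimum A^{-1} b
print(np.linalg.solve(A, b))
```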
https://www.readbyqxmd.com/read/28095198/effects-of-small-world-rewiring-probability-and-noisy-synaptic-conductivity-on-slow-waves-cortical-network
#16
Ramazan Tekin, Mehmet Emin Tagluk
Physiological rhythms play a critical role in the functional development of living beings. Many biological functions are executed through an interaction of rhythms produced by the internal characteristics of scores of cells. While synchronized oscillations may be associated with normal brain functions, anomalies in these oscillations may cause, or be related to, the emergence of certain neurological or neuropsychological pathologies. This study was designed to investigate the effects of topological structure and synaptic conductivity noise on the spatial synchronization and temporal rhythmicity of the waves generated by cells in the network...
March 2017: Neural Computation
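A short sketch of how a small-world rewiring probability is typically explored with the Watts-Strogatz model in networkx: sweeping p trades local clustering against short path lengths. The cortical network, slow waves, and noisy synaptic conductances of the study are not reproduced; n, k, and the p values are arbitrary.

```python
# Sketch of sweeping a small-world rewiring probability with the
# Watts-Strogatz model: increasing p lowers clustering but shortens paths.
# The cortical network and noisy conductances of the study are not shown.
import networkx as nx

n, k = 200, 8                              # nodes, nearest neighbors in the ring
for p in (0.0, 0.01, 0.1, 1.0):            # rewiring probabilities
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=0)
    print(p,
          round(nx.average_clustering(G), 3),
          round(nx.average_shortest_path_length(G), 2))
```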
https://www.readbyqxmd.com/read/28095197/solving-nonlinearly-separable-classifications-in-a-single-layer-neural-network
#17
Nolan Conaway, Kenneth J Kurtz
Since the work of Minsky and Papert (1969), it has been understood that single-layer neural networks cannot solve nonlinearly separable classifications (e.g., XOR). We describe and test a novel divergent autoassociative architecture capable of solving nonlinearly separable classifications with a single layer of weights. The proposed network consists of class-specific linear autoassociators. The power of the model comes from treating classification problems as within-class feature prediction rather than directly optimizing a discriminant function...
March 2017: Neural Computation
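One way to realize "classification as within-class feature prediction" with only linear maps, shown on XOR: fit, per class, linear predictors of each feature from the remaining features and classify by reconstruction error. This is a sketch of the general idea, not necessarily the authors' exact divergent autoassociative architecture.

```python
# Sketch of "classification as within-class feature prediction" on XOR: for
# each class, fit linear predictors of each feature from the remaining
# features; classify a point by which class reconstructs it with less error.
# This illustrates the general idea, not necessarily the authors' exact
# divergent autoassociative architecture.
import numpy as np

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0, 1, 1, 0])                      # XOR labels

def fit_class_predictors(Xc):
    """For each feature j, least-squares weights predicting it from the others."""
    preds = []
    for j in range(Xc.shape[1]):
        others = np.delete(Xc, j, axis=1)
        design = np.hstack([others, np.ones((len(Xc), 1))])   # add bias column
        w, *_ = np.linalg.lstsq(design, Xc[:, j], rcond=None)
        preds.append(w)
    return preds

def reconstruction_error(x, preds):
    err = 0.0
    for j, w in enumerate(preds):
        others = np.delete(x, j)
        err += (np.append(others, 1.0) @ w - x[j]) ** 2
    return err

models = {c: fit_class_predictors(X[y == c]) for c in (0, 1)}
pred = [min(models, key=lambda c: reconstruction_error(x, models[c])) for x in X]
print(pred)        # [0, 1, 1, 0] -- XOR solved with only linear maps per class
```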
https://www.readbyqxmd.com/read/28095196/analysis-of-online-composite-mirror-descent-algorithm
#18
Yunwen Lei, Ding-Xuan Zhou
We study the convergence of the online composite mirror descent algorithm, which involves a mirror map to reflect the geometry of the data and a convex objective function consisting of a loss and a regularizer possibly inducing sparsity. Our error analysis provides convergence rates in terms of properties of the strongly convex differentiable mirror map and the objective function. For a class of objective functions with Hölder continuous gradients, the convergence rates of the excess (regularized) risk under polynomially decaying step sizes have the order [Formula: see text] after [Formula: see text] iterates...
March 2017: Neural Computation
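A sketch of online composite mirror descent in its simplest instance, where the mirror map is the squared Euclidean norm and the regularizer is an ℓ1 penalty; each update then reduces to a gradient step on the loss followed by soft-thresholding. The step-size constant, penalty weight, and toy sparse-regression stream are assumptions, and the general mirror-map analysis of the paper is not reproduced.

```python
# Composite (online) mirror descent in the simplest instance: the mirror map
# is 0.5*||w||^2 and the regularizer is lambda*||w||_1, so each update is a
# gradient step on the loss followed by soft-thresholding. The general
# mirror-map setting analyzed in the paper is not reproduced.
import numpy as np

rng = np.random.default_rng(0)
d, T, lam = 20, 2000, 0.05
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]                  # sparse ground truth

def soft_threshold(v, thresh):
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

w = np.zeros(d)
for t in range(1, T + 1):
    x = rng.standard_normal(d)                 # one example arrives online
    y = x @ w_true + 0.1 * rng.standard_normal()
    grad = (x @ w - y) * x                     # gradient of the squared loss
    eta = 0.05 / np.sqrt(t)                    # polynomially decaying step size
    w = soft_threshold(w - eta * grad, eta * lam)   # mirror-descent/prox step

print(np.round(w, 2))                          # close to the sparse w_true
```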
https://www.readbyqxmd.com/read/28095195/deep-learning-with-dynamic-spiking-neurons-and-fixed-feedback-weights
#19
Arash Samadi, Timothy P Lillicrap, Douglas B Tweed
Recent work in computer science has shown the power of deep learning driven by the backpropagation algorithm in networks of artificial neurons. But real neurons in the brain are different from most of these artificial ones in at least three crucial ways: they emit spikes rather than graded outputs, their inputs and outputs are related dynamically rather than by piecewise-smooth functions, and they have no known way to coordinate arrays of synapses in separate forward and feedback pathways so that they change simultaneously and identically, as they do in backpropagation...
March 2017: Neural Computation
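A sketch of the "fixed feedback weights" idea named in the title (feedback alignment): the hidden-layer error signal is computed with a fixed random matrix B rather than the transpose of the forward weights. Ordinary rate units and a toy regression task stand in here for the spiking, dynamically coupled neurons studied in the article.

```python
# Sketch of learning with fixed feedback weights (feedback alignment): the
# hidden-layer error is computed with a fixed random matrix B rather than the
# transpose of the forward weights. Rate units and a toy regression task stand
# in for the spiking, dynamic neurons studied in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out, lr = 10, 30, 2, 0.01

W1 = 0.1 * rng.standard_normal((n_hid, n_in))   # forward weights, layer 1
W2 = 0.1 * rng.standard_normal((n_out, n_hid))  # forward weights, layer 2
B = 0.1 * rng.standard_normal((n_hid, n_out))   # fixed random feedback weights

W_target = rng.standard_normal((n_out, n_in))   # toy linear teacher
for step in range(3001):
    x = rng.standard_normal(n_in)
    y = W_target @ x
    h = np.tanh(W1 @ x)                          # forward pass
    y_hat = W2 @ h
    e = y_hat - y                                # output error
    delta_h = (B @ e) * (1.0 - h ** 2)           # error routed through B, not W2.T
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(delta_h, x)
    if step % 1000 == 0:
        print(step, float(e @ e))                # squared error typically decreases
```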
https://www.readbyqxmd.com/read/28095192/spike-centered-jitter-can-mistake-temporal-structure
#20
Jonathan Platkiewicz, Eran Stark, Asohan Amarasingham
Jitter-type spike resampling methods are routinely applied in neurophysiology for detecting temporal structure in spike trains (point processes). Several variations have been proposed. The concern has been raised, based on numerical experiments involving Poisson spike processes, that such procedures can be conservative. We study the issue and find it can be resolved by reemphasizing the distinction between spike-centered (basic) jitter and interval jitter. Focusing on spiking processes with no temporal structure, interval jitter generates an exact hypothesis test, guaranteeing valid conclusions...
March 2017: Neural Computation
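A small sketch of the two resampling schemes the abstract contrasts: spike-centered (basic) jitter displaces each spike uniformly around its own time, while interval jitter re-draws each spike uniformly within a fixed partition of time into Δ-wide bins. The window length and toy spike train are arbitrary, and the hypothesis-testing machinery is not reproduced.

```python
# Sketch of the two resampling schemes contrasted in the abstract:
# spike-centered jitter perturbs each spike around its own time, while
# interval jitter re-draws each spike uniformly within its fixed Delta-wide
# bin. The hypothesis-testing machinery of the paper is not reproduced.
import numpy as np

rng = np.random.default_rng(0)
delta = 0.025                                    # jitter window, seconds

def spike_centered_jitter(spikes, delta, rng):
    # Each spike moves uniformly within +-delta/2 of its original time.
    return np.sort(spikes + rng.uniform(-delta / 2, delta / 2, size=spikes.size))

def interval_jitter(spikes, delta, rng):
    # Time is partitioned into fixed delta-wide bins; each spike is re-drawn
    # uniformly within the bin it originally fell into.
    bins = np.floor(spikes / delta)
    return np.sort((bins + rng.uniform(0, 1, size=spikes.size)) * delta)

spikes = np.sort(rng.uniform(0, 1.0, size=30))   # toy spike train on [0, 1] s
print(spike_centered_jitter(spikes, delta, rng)[:5])
print(interval_jitter(spikes, delta, rng)[:5])
```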