Neural Networks: the Official Journal of the International Neural Network Society

https://www.readbyqxmd.com/read/29328958/stdp-based-spiking-deep-convolutional-neural-networks-for-object-recognition
#1
Saeed Reza Kheradpisheh, Mohammad Ganjtabesh, Simon J Thorpe, Timothée Masquelier
Previous studies have shown that spike-timing-dependent plasticity (STDP) can be used in spiking neural networks (SNN) to extract visual features of low or intermediate complexity in an unsupervised manner. These studies, however, used relatively shallow architectures, and only one layer was trainable. Another line of research has demonstrated - using rate-based neural networks trained with back-propagation - that having many layers increases the recognition robustness, an approach known as deep learning. We thus designed a deep SNN, comprising several convolutional (trainable with STDP) and pooling layers...
December 23, 2017: Neural Networks: the Official Journal of the International Neural Network Society
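For #1 above, a minimal sketch of a generic pair-based STDP weight update in Python (our illustration only; the paper uses a simplified STDP rule inside a deep convolutional SNN, and all names and constants below are ours):

import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    # Potentiate when the presynaptic spike precedes the postsynaptic one,
    # depress otherwise; the weight is kept in [0, 1].
    dt = t_post - t_pre
    dw = a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)
    return float(np.clip(w + dw, 0.0, 1.0))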
https://www.readbyqxmd.com/read/29306802/smoothing-inertial-projection-neural-network-for-minimization-lp-q-in-sparse-signal-reconstruction
#2
You Zhao, Xing He, Tingwen Huang, Junjian Huang
In this paper, we investigate a more general sparse signal recovery minimization model and a smoothing neural network optimal method for the compressed sensing problem, where the objective function is an Lp-q minimization model that combines the nonsmooth, nonconvex, and non-Lipschitz Lp quasi-norm (1≥p>0) and the nonsmooth Lq norm (2≥q>1), and whose feasible set is a closed convex subset of Rn. Firstly, under the restricted isometry property (RIP) condition, the uniqueness of the solution of the minimization model with a given sparsity s is obtained through theoretical analysis...
December 20, 2017: Neural Networks: the Official Journal of the International Neural Network Society
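For #2 above, one common form of such an Lp-q sparse recovery model, written in LaTeX as an assumption on our part (the paper's exact formulation may differ):

\min_{x \in \Omega \subseteq \mathbb{R}^n} \; \|Ax - b\|_q^q + \lambda \|x\|_p^p, \qquad 0 < p \le 1, \quad 1 < q \le 2,

where \Omega is a closed convex set, the Lp quasi-norm term promotes sparsity, and the Lq term measures data fidelity.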
https://www.readbyqxmd.com/read/29272727/onartmap-a-fuzzy-artmap-based-architecture
#3
Alan L S Matias, Ajalmar R Rocha Neto
Fuzzy ARTMAP (FAM) copes with the stability-plasticity dilemma by means of adaptive resonance theory (ART). Despite this advantage, Fuzzy ARTMAP suffers from a category proliferation problem, which leads to a high number of categories and a decrease in performance on unseen patterns. These drawbacks are mainly caused by the overlapping region (noise) between classes. To overcome them, we propose a Fuzzy ARTMAP-based architecture robust to noise, named OnARTMAP, for both online and batch learning. Our neural networks proposed for batch learning (OnARTMAP1 and OnARTMAP2) have a two-stage learning process, while our neural network for online and incremental learning (OnARTMAPo) has just a single iterative process...
December 19, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29306800/multistability-and-multiperiodicity-in-impulsive-hybrid-quaternion-valued-neural-networks-with-mixed-delays
#4
Călin-Adrian Popa, Eva Kaslik
The existence of multiple exponentially stable equilibrium states and periodic solutions is investigated for Hopfield-type quaternion-valued neural networks (QVNNs) with impulsive effects and both time-dependent and distributed delays. Employing Brouwer's and Leray-Schauder's fixed point theorems, suitable Lyapunov functionals, and impulsive control theory, sufficient conditions are given for the existence of 16^n attractors, showing a substantial improvement in storage capacity compared to real-valued or complex-valued neural networks...
December 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29301111/a-deep-learning-framework-for-causal-shape-transformation
#5
Kin Gwn Lore, Daniel Stoecklein, Michael Davies, Baskar Ganapathysubramanian, Soumik Sarkar
Recurrent neural networks (RNN) and long short-term memory (LSTM) networks are the common go-to architectures for exploiting sequential information, where the output depends on a sequence of inputs. However, in most of the problems considered, the dependencies typically lie in the latent domain, which may not be suitable for applications involving the prediction of a step-wise transformation sequence that depends on the previous states only in the visible domain, with a known terminal state. We propose a hybrid architecture of convolutional neural networks (CNN) and stacked autoencoders (SAE) to learn a sequence of causal actions that nonlinearly transform an input visual pattern or distribution into a target visual pattern or distribution with the same support, and we demonstrate its practicality on a real-world engineering problem involving the physics of fluids...
December 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29301110/on-the-approximation-by-single-hidden-layer-feedforward-neural-networks-with-fixed-weights
#6
Namig J Guliyev, Vugar E Ismailov
Single hidden layer feedforward neural networks (SLFNs) with fixed weights possess the universal approximation property provided that the approximated functions are univariate. This property, however, places no restriction on the number of neurons in the hidden layer: the larger this number, the more likely the network is to give precise results. In this note, we constructively prove that SLFNs with the fixed weight 1 and two neurons in the hidden layer can approximate any continuous function on a compact subset of the real line...
December 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
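For #6 above, a single hidden layer network with the fixed weight 1 and two hidden neurons has the form

N(x) = c_1\,\sigma(x - \theta_1) + c_2\,\sigma(x - \theta_2), \qquad x \in [a, b] \subset \mathbb{R},

with the activation \sigma, coefficients c_i, and thresholds \theta_i chosen constructively; the interval [a, b] here is a generic stand-in for the compact subset mentioned in the abstract.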
https://www.readbyqxmd.com/read/29306801/self-learning-robust-optimal-control-for-continuous-time-nonlinear-systems-with-mismatched-disturbances
#7
Xiong Yang, Haibo He
This paper presents a novel adaptive dynamic programming (ADP)-based self-learning robust optimal control scheme for input-affine continuous-time nonlinear systems with mismatched disturbances. First, the stabilizing feedback controller for the original nonlinear system is designed by modifying the optimal control law of an auxiliary system. It is also demonstrated that this feedback controller optimizes a specified value function. Then, within the framework of ADP, a single critic network is constructed to solve the Hamilton-Jacobi-Bellman equation associated with the auxiliary-system optimal control law...
December 13, 2017: Neural Networks: the Official Journal of the International Neural Network Society
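For #7 above, the generic continuous-time optimal control setting behind such ADP schemes (standard form, not necessarily the paper's exact auxiliary-system formulation): for \dot{x} = f(x) + g(x)u with cost \int_0^{\infty} \big(Q(x) + u^{\top}Ru\big)\,dt, the Hamilton-Jacobi-Bellman equation and the optimal control read

0 = \min_u \big[\, Q(x) + u^{\top}Ru + \nabla V(x)^{\top}\big(f(x) + g(x)u\big) \big], \qquad u^{*}(x) = -\tfrac{1}{2}R^{-1}g(x)^{\top}\nabla V(x),

and the single critic network approximates the value function V.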
https://www.readbyqxmd.com/read/29288874/modeling-spike-wave-discharges-by-a-complex-network-of-neuronal-oscillators
#8
Tatiana M Medvedeva, Marina V Sysoeva, Gilles van Luijtelaar, Ilya V Sysoev
PURPOSE: The organization of the neural networks and the mechanisms that generate the spike-wave discharges (SWDs) highly stereotypical for absence epilepsy are heavily debated. Here we describe a model which can reproduce both the characteristics of SWDs and the dynamics of coupling between brain regions, relying mainly on the properties of hierarchically organized networks of a large number of neuronal oscillators. MODEL: We used a two-level mesoscale model. The first level consists of three structures: the nervus trigeminus serving as input, the thalamus, and the somatosensory cortex; the second level consists of groups of nearby neurons belonging to one of the three modeled structures...
December 13, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29306803/computational-study-of-depth-completion-consistent-with-human-bi-stable-perception-for-ambiguous-figures
#9
Eiichi Mitsukura, Shunji Satoh
We propose a computational model that is consistent with human perception of depth in "ambiguous regions," in which no binocular disparity exists. Results obtained from our model reveal a new characteristic of depth perception. Random dot stereograms (RDS) are often used as examples because they provide sufficient disparity for depth calculation. A simple question confronts us: "How can we estimate the depth of a no-texture image region, such as one on white paper?" In such ambiguous regions, the mathematical solutions related to binocular disparity are not unique, i.e., they remain indefinite...
December 12, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29288873/necessary-and-sufficient-conditions-of-proper-estimators-based-on-self-density-ratio-for-unnormalized-statistical-models
#10
Kazuyuki Hiraoka, Toshihiko Hamada, Gen Hori
The largest family of density-ratio based estimators is obtained for unnormalized statistical models under the assumption of properness. They do not require normalization of the probability density function (PDF) because they are based on the density ratio of the same PDF at different points; therefore, the multiplicative normalization constant cancels out. In contrast with most existing work, a single necessary and sufficient condition is given here, rather than merely sufficient conditions for proper criteria for estimation...
December 11, 2017: Neural Networks: the Official Journal of the International Neural Network Society
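For #10 above, the cancellation the abstract relies on: for an unnormalized model \tilde{p}_{\theta} with partition function Z(\theta),

\frac{p_{\theta}(x)}{p_{\theta}(x')} = \frac{\tilde{p}_{\theta}(x)/Z(\theta)}{\tilde{p}_{\theta}(x')/Z(\theta)} = \frac{\tilde{p}_{\theta}(x)}{\tilde{p}_{\theta}(x')},

so estimation criteria built from such self-density ratios never require Z(\theta) to be computed.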
https://www.readbyqxmd.com/read/29232616/towards-understanding-sparse-filtering-a-theoretical-perspective
#11
Fabio Massimo Zennaro, Ke Chen
In this paper we present a theoretical analysis to understand sparse filtering, a recent and effective algorithm for unsupervised learning. The aim of this research is not to show whether or how well sparse filtering works, but to understand why and when sparse filtering does work. We provide a thorough theoretical analysis of sparse filtering and its properties, and further offer an experimental validation of the main outcomes of our theoretical analysis. We show that sparse filtering works by explicitly maximizing the entropy of the learned representations through the maximization of the proxy of sparsity, and by implicitly preserving mutual information between original and learned representations through the constraint of preserving a structure of the data...
December 9, 2017: Neural Networks: the Official Journal of the International Neural Network Society
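For #11 above, a minimal NumPy sketch of the standard sparse filtering objective (Ngiam et al., 2011) that the analysis studies; the variable names and the small constant eps are ours:

import numpy as np

def sparse_filtering_objective(W, X, eps=1e-8):
    # W: (n_features, n_inputs) weights, X: (n_inputs, n_samples) data.
    F = np.sqrt((W @ X) ** 2 + eps)                   # soft absolute-value features
    F = F / np.linalg.norm(F, axis=1, keepdims=True)  # L2-normalize each feature (row)
    F = F / np.linalg.norm(F, axis=0, keepdims=True)  # L2-normalize each sample (column)
    return F.sum()                                    # L1 penalty on the normalized features

Minimizing this value over W drives the normalized features toward the sparse representations whose entropy and information-preservation properties the paper analyzes.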
https://www.readbyqxmd.com/read/29306756/a-loop-based-neural-architecture-for-structured-behavior-encoding-and-decoding
#12
Thomas Gisiger, Mounir Boukadoum
We present a new type of artificial neural network that generalizes anatomical and dynamical aspects of the mammalian brain. Its main novelty lies in its topological structure, which is built as an array of interacting elementary motifs shaped like loops. These loops come in various types and can implement functions such as gating, inhibitory or executive control, or encoding of task elements, to name a few. Each loop features two sets of neurons and a control region, linked together by non-recurrent projections...
December 8, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29291546/nonlinear-predictive-control-for-adaptive-adjustments-of-deep-brain-stimulation-parameters-in-basal-ganglia-thalamic-network
#13
Fei Su, Jiang Wang, Shuangxia Niu, Huiyan Li, Bin Deng, Chen Liu, Xile Wei
The efficacy of deep brain stimulation (DBS) for Parkinson's disease (PD) depends in part on the post-operative programming of stimulation parameters. Closed-loop stimulation is one method to realize the frequent adjustment of stimulation parameters. This paper introduced a nonlinear predictive control method for the online adjustment of DBS amplitude and frequency. The approach was tested in a computational model of the basal ganglia-thalamic network. An autoregressive Volterra model was used to identify the process model based on physiological data...
December 7, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29274499/impact-of-leakage-delay-on-bifurcation-in-high-order-fractional-bam-neural-networks
#14
Chengdai Huang, Jinde Cao
The effects of leakage delay on the dynamics of integer-order neural networks have lately received considerable attention. It has been confirmed that fractional-order neural networks more appropriately capture the dynamical properties of neural networks, but results on fractional-order neural networks with leakage delay are relatively few. This paper primarily concentrates on the issue of bifurcation for high-order fractional bidirectional associative memory (BAM) neural networks involving leakage delay. It makes a first attempt to tackle the stability and bifurcation of high-order fractional BAM neural networks with time delay in the leakage terms...
December 6, 2017: Neural Networks: the Official Journal of the International Neural Network Society
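For #14 above, the fractional derivative in such models is usually understood in the Caputo sense; for order 0 < \alpha < 1 the standard definition is

{}^{C}D^{\alpha}f(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}}\,d\tau,

with the delay entering the leakage (self-decay) term of each BAM layer.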
https://www.readbyqxmd.com/read/29272726/manifold-optimization-based-analysis-dictionary-learning-with-an-%C3%A2-1%C3%A2-2-norm-regularizer
#15
Zhenni Li, Shuxue Ding, Yujie Li, Zuyuan Yang, Shengli Xie, Wuhui Chen
Recently there has been increasing attention to analysis dictionary learning, in which it is an open problem to obtain strongly sparsity-promoting solutions efficiently while simultaneously avoiding trivial solutions of the dictionary. In this paper, to obtain strongly sparsity-promoting solutions, we employ the ℓ1∕2 norm as a regularizer. Very recent work on ℓ1∕2-norm regularization theory in compressive sensing shows that its solutions can be sparser than those obtained with the ℓ1 norm...
December 6, 2017: Neural Networks: the Official Journal of the International Neural Network Society
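For #15 above, a typical analysis dictionary learning formulation with an ℓ1∕2 regularizer, sketched under our own assumptions (the paper's manifold-constrained model may be stated differently):

\min_{\Omega, U} \; \tfrac{1}{2}\|\Omega Y - U\|_F^2 + \lambda \|U\|_{1/2}^{1/2} \quad \text{s.t.} \quad \|\omega_i\|_2 = 1 \;\; \forall i,

where Y is the training data, \Omega is the analysis dictionary whose rows \omega_i are constrained to the unit sphere (the manifold being optimized over), and U collects the sparse analysis coefficients.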
https://www.readbyqxmd.com/read/29287188/standard-representation-and-unified-stability-analysis-for-dynamic-artificial-neural-network-models
#16
Kwang-Ki K Kim, Ernesto Ríos Patrón, Richard D Braatz
An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, which include rigorous upper bounds on the approximation errors...
December 2, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29268197/fixed-time-stabilization-of-impulsive-cohen-grossberg-bam-neural-networks
#17
Hongfei Li, Chuandong Li, Tingwen Huang, Wanli Zhang
This article is concerned with the fixed-time stabilization of impulsive Cohen-Grossberg BAM neural networks via two different controllers. Using a novel constructive approach based on comparison techniques for differential inequalities, an improved fixed-time stability theorem for impulsive dynamical systems is established. In addition, based on this theorem, two different control protocols are designed to ensure the fixed-time stabilization of impulsive Cohen-Grossberg BAM neural networks; these results include and extend earlier works...
December 2, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29245057/a-new-randomized-kaczmarz-based-kernel-canonical-correlation-analysis-algorithm-with-applications-to-information-retrieval
#18
Jia Cai, Yi Tang
Canonical correlation analysis (CCA) is a powerful statistical tool for detecting the linear relationship between two sets of multivariate variables. Its kernel generalization, namely kernel CCA, has been proposed to describe nonlinear relationships between the two sets of variables. Although kernel CCA can achieve dimensionality reduction for high-dimensional feature selection problems, it also suffers from the so-called over-fitting phenomenon. In this paper, we consider a new kernel CCA algorithm based on the randomized Kaczmarz method...
December 2, 2017: Neural Networks: the Official Journal of the International Neural Network Society
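For #18 above, a plain sketch of the classical randomized Kaczmarz iteration for a linear system Ax = b, which is the building block the paper adapts to the kernel CCA setting (this is not the paper's algorithm itself):

import numpy as np

def randomized_kaczmarz(A, b, n_iter=1000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A ** 2, axis=1)
    probs = row_norms / row_norms.sum()   # pick rows with probability proportional to ||a_i||^2
    x = np.zeros(n)
    for _ in range(n_iter):
        i = rng.choice(m, p=probs)
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]   # project x onto the i-th hyperplane
    return x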
https://www.readbyqxmd.com/read/29223869/nonlinear-recurrent-neural-networks-for-finite-time-solution-of-general-time-varying-linear-matrix-equations
#19
Lin Xiao, Bolin Liao, Shuai Li, Ke Chen
In order to solve general time-varying linear matrix equations (LMEs) more efficiently, this paper proposes two nonlinear recurrent neural networks based on two nonlinear activation functions. According to Lyapunov theory, the two nonlinear recurrent neural networks are proved to converge within finite time. Besides, by solving the differential equations, the upper bounds on the finite convergence time are determined analytically. Compared with existing recurrent neural networks, the proposed two nonlinear recurrent neural networks have a better convergence property (i...
December 2, 2017: Neural Networks: the Official Journal of the International Neural Network Society
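For #19 above, the standard design recipe behind such finite-time recurrent networks, illustrated for the special case A(t)X(t) = B(t) (the paper treats more general time-varying LMEs): define the error E(t) = A(t)X(t) - B(t) and impose the evolution

\dot{E}(t) = -\gamma\,\Phi\big(E(t)\big), \qquad \gamma > 0,

where \Phi is a monotonically increasing odd activation applied elementwise; the particular nonlinear choices of \Phi are what give the analytically computable finite-time convergence bounds.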
https://www.readbyqxmd.com/read/29268196/pth-moment-exponential-stability-of-stochastic-memristor-based-bidirectional-associative-memory-bam-neural-networks-with-time-delays
#20
Fen Wang, Yuanlong Chen, Meichun Liu
Stochastic memristor-based bidirectional associative memory (BAM) neural networks with time delays play an increasingly important role in the design and implementation of neural network systems. Under the framework of Filippov solutions, the pth moment exponential stability of stochastic memristor-based BAM neural networks is investigated. Using stochastic stability theory, Itô's differential formula, and Young's inequality, the criteria are derived. Meanwhile, with a Lyapunov approach and the Cauchy-Schwarz inequality, we derive some sufficient conditions for the mean square exponential stability of the above systems...
November 24, 2017: Neural Networks: the Official Journal of the International Neural Network Society
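For #20 above, the property being established is the usual one: the trivial solution is pth moment exponentially stable if there exist constants M > 0 and \lambda > 0 such that

\mathbb{E}\,\|x(t; x_0)\|^{p} \le M\,\mathbb{E}\,\|x_0\|^{p} e^{-\lambda t}, \qquad t \ge 0,

with p = 2 recovering the mean square exponential stability also discussed in the abstract.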