Neural Networks: the Official Journal of the International Neural Network Society

https://www.readbyqxmd.com/read/29940489/on-the-importance-of-hidden-bias-and-hidden-entropy-in-representational-efficiency-of-the-gaussian-bipolar-restricted-boltzmann-machines
#1
Altynbek Isabekov, Engin Erzin
In this paper, we analyze the role of hidden bias in the representational efficiency of Gaussian-Bipolar Restricted Boltzmann Machines (GBPRBMs), which are similar to the widely used Gaussian-Bernoulli RBMs. Our experiments show that hidden bias plays an important role in shaping the probability density function of the visible units. We define hidden entropy and propose it as a measure of the representational efficiency of the model. Using this measure, we investigate the effect of hidden bias on the hidden entropy and provide a full analysis of the hidden entropy as a function of the hidden bias for small models with up to three hidden units...
June 22, 2018: Neural Networks: the Official Journal of the International Neural Network Society
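For a small GBPRBM, the hidden entropy described above can be computed exactly by enumerating the bipolar hidden configurations and integrating out the Gaussian visible units. The sketch below is a minimal Python illustration under an assumed Gaussian-Bernoulli-style energy convention; all names and the parameter layout are illustrative, not taken from the paper.

    import itertools
    import numpy as np

    def hidden_entropy(W, b, c, sigma):
        # Shannon entropy (bits) of the marginal over bipolar hidden states
        # h in {-1, +1}^H, with the Gaussian visible units integrated out
        # analytically (assumed energy convention, for illustration only).
        H = W.shape[1]
        log_p = []
        for h in itertools.product([-1.0, 1.0], repeat=H):
            h = np.array(h)
            m = W @ h  # visible-side drive produced by this hidden state
            log_p.append(c @ h + np.sum((b * m + 0.5 * m ** 2) / sigma ** 2))
        log_p = np.array(log_p)
        log_p -= log_p.max()               # for numerical stability
        p = np.exp(log_p)
        p /= p.sum()
        return -np.sum(p * np.log2(p + 1e-12))

    # toy model: 2 Gaussian visible units, 3 bipolar hidden units
    rng = np.random.default_rng(0)
    W, b, sigma = rng.normal(size=(2, 3)), np.zeros(2), np.ones(2)
    print(hidden_entropy(W, b, c=np.zeros(3), sigma=sigma))       # zero hidden bias
    print(hidden_entropy(W, b, c=np.full(3, 10.0), sigma=sigma))  # large bias: entropy collapses toward 0 bits

A large hidden bias concentrates the marginal on a single hidden configuration, which is the kind of low-entropy, representationally inefficient regime the hidden-entropy measure is meant to flag.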
https://www.readbyqxmd.com/read/29990758/new-conditions-for-global-stability-of-neutral-type-delayed-cohen-grossberg-neural-networks
#2
Neyir Ozcan
This paper carries out a theoretical investigation of the class of neutral-type delayed Cohen-Grossberg neural networks using Lyapunov stability theory. By employing a suitable Lyapunov functional candidate, we derive some new delay-independent sufficient conditions for the global asymptotic stability of the equilibrium point of neutral-type Cohen-Grossberg neural networks with time delays. The obtained stability conditions can be completely characterized by the network parameters of the neutral systems under consideration...
June 21, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29945062/nonparallel-support-vector-regression-model-and-its-smo-type-solver
#3
Long Tang, Yingjie Tian, Chunyan Yang
Although the twin support vector regression (TSVR) method has been widely studied and various variants have been successfully developed, the structural risk minimization (SRM) principle and the model's sparseness are not given sufficient consideration. In this paper, a novel nonparallel support vector regression (NPSVR) is proposed in the spirit of the nonparallel support vector machine (NPSVM), which outperforms existing TSVR methods in the following respects: (1) For each primal problem, a regularization term is added by rigidly following the SRM principle, so that the kernel trick can be applied directly to the dual problems in the nonlinear case without considering an extra kernel-generated surface; (2) An ε-insensitive loss function is adopted to retain the inherent sparseness of the standard support vector regression (SVR); (3) The dual problems have the same formulation as that of the standard SVR, so computing an inverse matrix is avoided and a sequential minimal optimization (SMO)-type solver is specifically designed to accelerate training for large-scale datasets; (4) The primal problems can approximately degenerate to those of the existing TSVRs if the corresponding parameters are appropriately chosen...
June 19, 2018: Neural Networks: the Official Journal of the International Neural Network Society
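Point (2) above relies on the standard ε-insensitive loss, which is what keeps support vector regression sparse: residuals inside the ε-tube contribute nothing, so the corresponding points are not support vectors. A minimal sketch of that loss (not the authors' NPSVR solver):

    import numpy as np

    def eps_insensitive_loss(y_true, y_pred, eps=0.1):
        # residuals inside the eps-tube cost nothing, which is what
        # makes the SVR solution sparse in its support vectors
        return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

    y = np.array([1.0, 2.0, 3.0])
    f = np.array([1.05, 2.5, 2.0])
    print(eps_insensitive_loss(y, f))   # first residual lies inside the tube -> zero loss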
https://www.readbyqxmd.com/read/29940488/lp-and-ls-norm-distance-based-robust-linear-discriminant-analysis
#4
Qiaolin Ye, Liyong Fu, Zhao Zhang, Henghao Zhao, Meem Naiem
Recently, L1-norm distance measure based Linear Discriminant Analysis (LDA) techniques have been shown to be robust against outliers. However, these methods offer no guarantee of satisfactory performance, owing to the insufficient robustness of the L1-norm measure. To mitigate this problem, and inspired by recent work on Lp-norm based learning, this paper proposes a new discriminant method, called Lp- and Ls-Norm Distance Based Robust Linear Discriminant Analysis (FLDA-Lsp). The proposed method achieves robustness by replacing the L2-norm within- and between-class distances in conventional LDA with Lp- and Ls-norm ones...
June 15, 2018: Neural Networks: the Official Journal of the International Neural Network Society
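The core idea, replacing the L2-norm scatter distances with Lp- and Ls-norm ones, can be illustrated with a toy objective evaluated for a single projection direction w. The exact FLDA-Lsp formulation and its optimization are not reproduced here; the function below is an assumed surrogate for illustration only.

    import numpy as np

    def lp_ls_ratio(X, y, w, p=1.0, s=1.0):
        # ratio of Lp-norm between-class scatter to Ls-norm within-class
        # scatter along the (normalized) projection direction w
        w = w / np.linalg.norm(w)
        mean_all = X.mean(axis=0)
        between, within = 0.0, 0.0
        for cls in np.unique(y):
            Xc = X[y == cls]
            between += len(Xc) * np.abs((Xc.mean(axis=0) - mean_all) @ w) ** p
            within += np.sum(np.abs((Xc - Xc.mean(axis=0)) @ w) ** s)
        return between / (within + 1e-12)

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(2, 0.3, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    print(lp_ls_ratio(X, y, np.array([1.0, 1.0])))   # large: w follows the class separation
    print(lp_ls_ratio(X, y, np.array([1.0, -1.0])))  # small: w ignores the class separation

Choosing p and s below 2 down-weights large deviations relative to the squared-distance case, which is where the robustness to outliers comes from.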
https://www.readbyqxmd.com/read/29940487/discovering-space-grounding-spatial-topology-and-metric-regularity-in-a-naive-agent-s-sensorimotor-experience
#5
Alban Laflaquière, J Kevin O'Regan, Bruno Gas, Alexander Terekhov
In line with sensorimotor contingency theory, we investigate the problem of the perception of space from a fundamental sensorimotor perspective. Despite its pervasive role in our perception of the world, the origin of the concept of space remains largely mysterious. In the context of artificial perception, for example, this issue is usually circumvented by having engineers pre-define the spatial structure of the problem the agent has to face. Here we show that the structure of space can be autonomously discovered by a naive agent in the form of sensorimotor regularities that correspond to so-called compensable sensory experiences: experiences that can be generated either by the agent or by its environment...
June 15, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29929102/a-sparsity-based-stochastic-pooling-mechanism-for-deep-convolutional-neural-networks
#6
Zhenhua Song, Yan Liu, Rong Song, Zhenguang Chen, Jianyong Yang, Chao Zhang, Qing Jiang
A novel sparsity-based stochastic pooling mechanism, which integrates the advantages of max-pooling, average-pooling, and stochastic pooling, is introduced. The proposed pooling is designed to balance the advantages and disadvantages of max-pooling and average-pooling by using the degree of sparsity of the activations and a control function to obtain an optimized representative feature value, ranging from the average to the maximum value of a pooling region. This optimized representative feature value is then employed to assign probability weights to the activations under a normal distribution...
June 15, 2018: Neural Networks: the Official Journal of the International Neural Network Society
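One plausible reading of the mechanism is sketched below: the fraction of near-zero activations in a pooling region steers a representative value between the region's average and its maximum, and that value then centers the probability weights used for stochastic sampling. The control function and the Gaussian weighting here are assumptions for illustration, not the authors' exact design.

    import numpy as np

    def sparsity_pooled_value(region, rng=None):
        # sparsity = fraction of near-zero activations in the pooling region;
        # it interpolates the representative value between mean and max
        a = region.ravel()
        sparsity = np.mean(a <= 1e-6)
        target = (1.0 - sparsity) * a.mean() + sparsity * a.max()
        if rng is None:
            return target
        # stochastic variant: sample an activation with weights drawn from
        # a normal distribution centred on the representative value
        weights = np.exp(-0.5 * ((a - target) / (a.std() + 1e-6)) ** 2)
        return rng.choice(a, p=weights / weights.sum())

    region = np.array([[0.0, 0.0], [0.2, 0.9]])
    print(sparsity_pooled_value(region))                            # deterministic representative value
    print(sparsity_pooled_value(region, np.random.default_rng(0)))  # sampled activation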
https://www.readbyqxmd.com/read/29945061/exploiting-layerwise-convexity-of-rectifier-networks-with-sign-constrained-weights
#7
Senjian An, Farid Boussaid, Mohammed Bennamoun, Ferdous Sohel
By introducing sign constraints on the weights, this paper proposes sign-constrained rectifier networks (SCRNs), whose training can be solved efficiently by well-known majorization-minimization (MM) algorithms. We prove that the proposed two-hidden-layer SCRNs, which exhibit negative weights in the second hidden layer and negative weights in the output layer, are capable of separating any number of disjoint pattern sets. Furthermore, the proposed two-hidden-layer SCRNs can decompose the patterns of each class into several clusters so that each cluster is convexly separable from all the patterns of the other classes...
June 13, 2018: Neural Networks: the Official Journal of the International Neural Network Society
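The defining ingredient is that every weight in the constrained layers keeps a prescribed sign. A generic way to maintain such a constraint during training is a projection step after each update, as in the sketch below; this is ordinary projected training for illustration, not the authors' majorization-minimization algorithm.

    import numpy as np

    def project_signs(W, sign_mask):
        # clamp each weight to its prescribed sign: entries with mask +1
        # stay non-negative, entries with mask -1 stay non-positive
        return np.where(sign_mask > 0, np.maximum(W, 0.0), np.minimum(W, 0.0))

    W = np.array([[0.5, -0.3], [-0.2, 0.7]])
    mask = np.array([[1, 1], [-1, -1]])    # hypothetical sign pattern for one layer
    print(project_signs(W, mask))          # [[0.5, 0.0], [-0.2, 0.0]]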
https://www.readbyqxmd.com/read/29890384/global-exponential-stability-of-octonion-valued-neural-networks-with-leakage-delay-and-mixed-delays
#8
Călin-Adrian Popa
This paper discusses octonion-valued neural networks (OVNNs) with leakage delay, time-varying delays, and distributed delays, for which the states, weights, and activation functions belong to the normed division algebra of octonions. The octonion algebra is a nonassociative and noncommutative generalization of the complex and quaternion algebras, but does not belong to the category of Clifford algebras, which are associative. In order to avoid the nonassociativity of the octonion algebra and also the noncommutativity of the quaternion algebra, the Cayley-Dickson construction is used to decompose the OVNNs into 4 complex-valued systems...
June 8, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29920430/improving-efficiency-in-convolutional-neural-networks-with-multilinear-filters
#9
Dat Thanh Tran, Alexandros Iosifidis, Moncef Gabbouj
The excellent performance of deep neural networks has enabled us to solve several automation problems, opening an era of autonomous devices. However, current deep network architectures are heavy, with millions of parameters, and require billions of floating-point operations. Several works have been developed to compress a pre-trained deep network in order to reduce its memory footprint and, possibly, its computation. Instead of compressing a pre-trained network, in this work we propose a generic neural network layer structure employing multilinear projection as the primary feature extractor...
June 7, 2018: Neural Networks: the Official Journal of the International Neural Network Society
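A common way to realize a multilinear (separable) filter is to build each convolution kernel as a rank-1 outer product of per-mode vectors, which shrinks the parameter count from C*kH*kW to C+kH+kW. The sketch below shows that construction; it illustrates the general technique, not necessarily the authors' exact layer.

    import numpy as np

    def rank1_filter(u, v, w):
        # (C, kH, kW) kernel built as an outer product of per-mode vectors
        return np.einsum('c,h,w->chw', u, v, w)

    C, kH, kW = 64, 3, 3
    u, v, w = np.random.randn(C), np.random.randn(kH), np.random.randn(kW)
    kernel = rank1_filter(u, v, w)
    print(kernel.shape)                      # (64, 3, 3)
    print(C * kH * kW, '->', C + kH + kW)    # 576 -> 70 parameters per filter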
https://www.readbyqxmd.com/read/29886328/finite-time-synchronization-of-stochastic-coupled-neural-networks-subject-to-markovian-switching-and-input-saturation
#10
P Selvaraj, R Sakthivel, O M Kwon
This paper addresses the problem of finite-time synchronization of stochastic coupled neural networks (SCNNs) subject to Markovian switching, mixed time delays, and actuator saturation. In addition, the coupling strengths of the SCNNs are characterized by mutually independent random variables. By utilizing a simple linear transformation, the problem of stochastic finite-time synchronization of SCNNs is converted into a mean-square finite-time stabilization problem for an error system. By choosing a suitable mode-dependent switched Lyapunov-Krasovskii functional, a new set of sufficient conditions is derived to guarantee the finite-time stability of the error system...
June 7, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29886004/corrigendum-to-hopfield-networks-as-a-model-of-prototype-based-category-learning-a-method-to-distinguish-trained-spurious-and-prototypical-attractors-neural-netw-91-2017-76-84
#11
Chris Gorman, Anthony Robins, Alistair Knott
No abstract text is available yet for this article.
June 6, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29933156/land-cover-classification-from-multi-temporal-multi-spectral-remotely-sensed-imagery-using-patch-based-recurrent-neural-networks
#12
LETTER
Atharva Sharma, Xiuwen Liu, Xiaojun Yang
Environmental sustainability research depends on accurate land cover information. Even with the increased number of satellite systems and sensors acquiring data with improved spectral, spatial, radiometric, and temporal characteristics, and with the new data distribution policy, most existing land cover datasets are derived from pixel-based, single-date multi-spectral remotely sensed imagery, with unacceptable accuracy. One major bottleneck for accuracy improvement is how to develop an accurate and effective image classification protocol...
June 2, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29894846/representation-learning-using-event-based-stdp
#13
Amirhossein Tavanaei, Timothée Masquelier, Anthony Maida
Although representation learning methods developed within the framework of traditional neural networks are relatively mature, developing a spiking representation model remains a challenging problem. This paper proposes an event-based method to train a feedforward spiking neural network (SNN) layer for extracting visual features. The method introduces a novel spike-timing-dependent plasticity (STDP) learning rule and a threshold adjustment rule both derived from a vector quantization-like objective function subject to a sparsity constraint...
June 1, 2018: Neural Networks: the Official Journal of the International Neural Network Society
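As a point of reference for the learning rule described above, a minimal pair-based, event-driven STDP update is sketched below: it is applied when a postsynaptic neuron spikes, potentiates synapses whose presynaptic trace is still active, and depresses the rest. The learning rates and clipping range are illustrative; the paper's vector-quantization-derived rule and threshold adaptation are not reproduced.

    import numpy as np

    def stdp_update(w, pre_trace, a_plus=0.01, a_minus=0.005):
        # event-based pair STDP applied at a postsynaptic spike:
        # potentiate synapses with a recent presynaptic spike (positive
        # trace), depress the others, then keep weights in [0, 1]
        dw = np.where(pre_trace > 0, a_plus * pre_trace, -a_minus)
        return np.clip(w + dw, 0.0, 1.0)

    w = np.array([0.5, 0.5, 0.5])
    pre_trace = np.array([0.8, 0.0, 0.3])   # exponentially decaying input-spike traces
    print(stdp_update(w, pre_trace))        # [0.508, 0.495, 0.503]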
https://www.readbyqxmd.com/read/29890383/adaptive-neural-output-feedback-control-for-nonstrict-feedback-time-delay-fractional-order-systems-with-output-constraints-and-actuator-nonlinearities
#14
Farouk Zouari, Asier Ibeas, Abdesselem Boulkroune, Jinde Cao, Mohammad Mehdi Arefi
This study addresses the issue of the adaptive output tracking control for a category of uncertain nonstrict-feedback delayed incommensurate fractional-order systems in the presence of nonaffine structures, unmeasured pseudo-states, unknown control directions, unknown actuator nonlinearities and output constraints. Firstly, the mean value theorem and the Gaussian error function are introduced to eliminate the difficulties that arise from the nonaffine structures and the unknown actuator nonlinearities, respectively...
June 1, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29894847/nonlinear-analysis-and-synthesis-of-video-images-using-deep-dynamic-bottleneck-neural-networks-for-face-recognition
#15
Saeed Montazeri Moghadam, Seyyed Ali Seyyedsalehi
Nonlinear components extracted from the deep structures of bottleneck neural networks exhibit a great ability to express the input space in a low-dimensional manifold. Sharing and combining these components boosts the capability of the neural networks to synthesize and interpolate new and imaginary data. This synthesis is possibly a simple model of imagination in the human brain, where the components are expressed in a nonlinear low-dimensional manifold. The current paper introduces a novel Dynamic Deep Bottleneck Neural Network to analyze and extract three main features of videos regarding the expression of emotions on the face...
May 31, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29909147/leaderless-synchronization-of-coupled-neural-networks-with-the-event-triggered-mechanism
#16
Siqi Lv, Wangli He, Feng Qian, Jinde Cao
This paper is concerned with leaderless synchronization of coupled delayed neural networks. A distributed event-triggered control strategy under the periodic sampling scheme is introduced to reduce control updates. By introducing a weighted average state as a virtual leader, the leaderless synchronization problem can be transformed to the stability problem of the error system, which is defined as the distance between each node and the virtual leader. A leaderless synchronization criterion under the periodic event-triggered scheme in strongly connected networks is first derived based on Finsler's lemma...
May 30, 2018: Neural Networks: the Official Journal of the International Neural Network Society
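The flavor of a periodic event-triggered scheme can be sketched with a per-node check evaluated only at sampling instants: the node rebroadcasts its state (and the control is updated) only when the error since the last broadcast outgrows a fraction of its distance to the virtual leader. The threshold form and the parameter sigma are assumptions for illustration, not the paper's triggering criterion.

    import numpy as np

    def should_trigger(x_now, x_last_broadcast, x_virtual_leader, sigma=0.2):
        # trigger an update when the measurement error since the last
        # broadcast exceeds sigma times the distance to the virtual leader
        error = np.linalg.norm(x_now - x_last_broadcast)
        threshold = sigma * np.linalg.norm(x_now - x_virtual_leader)
        return error > threshold

    x = np.array([1.0, 0.5])
    x_hat = np.array([0.9, 0.6])    # state sent at the last triggering instant
    x_bar = np.array([0.2, 0.1])    # weighted average state (virtual leader)
    print(should_trigger(x, x_hat, x_bar))   # False: error is still within the threshold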
https://www.readbyqxmd.com/read/29883852/delayed-state-feedback-control-for-stabilization-of-neural-networks-with-leakage-delay
#17
Haitao Zhu, R Rakkiyappan, Xiaodi Li
This paper mainly deals with the problem of designing a delayed state-feedback controller for neural networks with leakage delay. By constructing an appropriate Lyapunov-Krasovskii functional that includes double integral terms with two different exponential decay rates, and by utilizing the linear matrix inequality (LMI) technique with the help of slack variables, some sufficient conditions for global exponential stabilization are obtained through the design of a delayed state-feedback controller. The novelty of this paper includes: (i) although many papers have dealt with the dynamics of neural networks with leakage delay, there is little work on the design of feedback controllers...
May 28, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29843095/adaptive-critic-designs-for-optimal-control-of-uncertain-nonlinear-systems-with-unmatched-interconnections
#18
Xiong Yang, Haibo He
In this paper, we develop a novel optimal control strategy for a class of uncertain nonlinear systems with unmatched interconnections. To begin with, we present a stabilizing feedback controller for the interconnected nonlinear systems by modifying an array of optimal control laws of auxiliary subsystems. We also prove that this feedback controller ensures a specified cost function to achieve optimality. Then, under the framework of adaptive critic designs, we use critic networks to solve the Hamilton-Jacobi-Bellman equations associated with auxiliary subsystem optimal control laws...
May 26, 2018: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29870926/design-verification-and-robotic-application-of-a-novel-recurrent-neural-network-for-computing-dynamic-sylvester-equation
#19
Lin Xiao, Zhijun Zhang, Zili Zhang, Weibing Li, Shuai Li
To solve the dynamic Sylvester equation in the presence of additive noise, a novel recurrent neural network (NRNN) with finite-time convergence and excellent robustness is proposed and analyzed in this paper. Compared with the design process of the Zhang neural network (ZNN), the proposed NRNN is based on an ingenious integral design formula activated by nonlinear functions, which are able to expedite the convergence speed and suppress unknown additive noise while solving the dynamic Sylvester equation...
May 24, 2018: Neural Networks: the Official Journal of the International Neural Network Society
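For the static special case A X + X B = C, a classical recurrent-network baseline drives the residual E = A X + X B - C to zero by gradient descent on its squared norm; the sketch below shows that baseline. The paper's NRNN targets the time-varying, noise-polluted equation with an integral-enhanced nonlinear design, which is not reproduced here.

    import numpy as np

    def gradient_rnn_sylvester(A, B, C, gamma=10.0, dt=1e-3, steps=20000):
        # gradient-type recurrent network for A X + X B = C:
        # integrate X' = -gamma * grad of 0.5 * ||A X + X B - C||_F^2
        X = np.zeros((A.shape[0], B.shape[0]))
        for _ in range(steps):
            E = A @ X + X @ B - C                    # residual matrix
            X -= dt * gamma * (A.T @ E + E @ B.T)    # Euler step along the negative gradient
        return X

    A = np.array([[3.0, 1.0], [0.0, 2.0]])
    B = np.array([[2.0, 0.0], [1.0, 3.0]])
    C = np.array([[1.0, 2.0], [3.0, 4.0]])
    X = gradient_rnn_sylvester(A, B, C)
    print(np.max(np.abs(A @ X + X @ B - C)))         # residual close to zero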
https://www.readbyqxmd.com/read/29804041/convex-formulation-of-multiple-instance-learning-from-positive-and-unlabeled-bags
#20
Han Bao, Tomoya Sakai, Issei Sato, Masashi Sugiyama
Multiple instance learning (MIL) is a variation of traditional supervised learning in which the data (referred to as bags) are composed of sub-elements (referred to as instances) and only bag labels are available. MIL has a variety of applications, such as content-based image retrieval, text categorization, and medical diagnosis. Most previous work on MIL assumes that training bags are fully labeled. However, it is often difficult to obtain a sufficient number of labeled bags in practical situations, while many unlabeled bags are available...
May 24, 2018: Neural Networks: the Official Journal of the International Neural Network Society
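The bag/instance structure the abstract refers to is easy to make concrete: a bag is a set of instance feature vectors, only bag-level labels exist, and under the standard MIL assumption a bag is positive iff at least one of its instances is positive. The sketch below shows that data structure and scoring rule; the paper's convex PU-learning formulation is not reproduced.

    import numpy as np

    # a bag is a set of instances (feature vectors); only bag-level labels exist
    bags = [
        np.array([[0.1, 0.2], [0.9, 0.8]]),   # labeled positive bag
        np.array([[0.2, 0.1], [0.3, 0.2]]),   # unlabeled bag
    ]
    bag_labels = [+1, None]                   # None marks an unlabeled bag

    def bag_score(bag, instance_scorer):
        # standard MIL assumption: a bag is positive iff at least one of
        # its instances is positive, so score the bag by its best instance
        return max(instance_scorer(x) for x in bag)

    linear_scorer = lambda x: float(np.array([1.0, 1.0]) @ x - 1.0)    # toy instance scorer
    print([bag_score(b, linear_scorer) for b in bags])                 # [0.7, -0.5]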