hopfield network

https://www.readbyqxmd.com/read/28646762/hybrid-impulsive-and-switching-hopfield-neural-networks-with-state-dependent-impulses
#1
Xianxiu Zhang, Chuandong Li, Tingwen Huang
We discuss the global stability of switching Hopfield neural networks (HNN) with state-dependent impulses using the B-equivalence method. Under certain conditions, we show that the state-dependent impulsive switching systems can be reduced to fixed-time ones, and that global stability of the corresponding comparison system implies the same stability for the considered system. On this basis, a novel stability criterion for the considered HNN is established. Finally, two numerical examples are given to demonstrate the effectiveness of our results...
May 24, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28627811/energy-based-neural-networks-as-a-tool-for-harmony-based-virtual-screening
#2
Nelly I Zhokhova, Igor I Baskin
In Energy-Based Neural Networks (EBNNs), relationships between variables are captured by means of a scalar function conventionally called "energy". In this article, we introduce a procedure of "harmony search", which looks for compounds providing the lowest energies for the EBNNs trained on active compounds. It can be considered as a special kind of similarity search that takes into account regularities in the structures of active compounds. In this paper, we show that harmony search can be used for performing virtual screening...
June 19, 2017: Molecular Informatics
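For orientation, here is a minimal sketch of the kind of energy-based scoring the abstract describes: candidate structures encoded as bipolar fingerprint vectors are ranked by their Hopfield energy under weights trained on known actives. Everything below (fingerprint length, Hebbian training, the random data) is an illustrative assumption, not the authors' EBNN or harmony-search implementation.

    # Illustrative sketch only: rank candidates by Hopfield energy E(x) = -1/2 x^T W x.
    import numpy as np

    def hopfield_energy(x, W):
        # Energy of a bipolar (+/-1) state vector x under weight matrix W.
        return -0.5 * x @ W @ x

    rng = np.random.default_rng(0)
    n_bits = 64
    actives = rng.choice([-1, 1], size=(20, n_bits))      # hypothetical active-compound fingerprints
    W = (actives.T @ actives) / n_bits                    # Hebbian weights learned from the actives
    np.fill_diagonal(W, 0.0)

    library = rng.choice([-1, 1], size=(1000, n_bits))    # hypothetical screening library
    energies = np.array([hopfield_energy(c, W) for c in library])
    ranked = library[np.argsort(energies)]                # lowest energy = most consistent with the actives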
https://www.readbyqxmd.com/read/28553351/fast-recall-for-complex-valued-hopfield-neural-networks-with-projection-rules
#3
Masaki Kobayashi
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multiple states, so CHNNs can be used for the storage of multilevel data, such as gray-scale images. CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima...
2017: Computational Intelligence and Neuroscience
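As a concrete reference for the multistate representation mentioned above, the sketch below maps K gray levels onto the unit circle, the standard encoding for complex-valued Hopfield neurons; the level count K and the round-trip test are illustrative assumptions, and the paper's recall algorithm itself is not reproduced here.

    # Multistate encoding for complex-valued Hopfield neurons (illustrative):
    # gray level k in {0, ..., K-1} maps to the phase exp(2*pi*i*k/K).
    import numpy as np

    K = 16  # assumed number of gray levels (states per neuron)

    def encode(levels):
        return np.exp(2j * np.pi * levels / K)

    def decode(states):
        angles = np.mod(np.angle(states), 2 * np.pi)
        return np.rint(angles * K / (2 * np.pi)).astype(int) % K

    levels = np.array([0, 3, 7, 15])
    assert np.array_equal(decode(encode(levels)), levels)  # encoding round-trips exactly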
https://www.readbyqxmd.com/read/28494329/hopfield-networks-as-a-model-of-prototype-based-category-learning-a-method-to-distinguish-trained-spurious-and-prototypical-attractors
#4
Chris Gorman, Anthony Robins, Alistair Knott
We present an investigation of the potential use of Hopfield networks to learn neurally plausible, distributed representations of category prototypes. Hopfield networks are dynamical models of autoassociative memory which learn to recreate a set of input states from any given starting state. These networks, however, will almost always learn states which were not presented during training, so-called spurious states. Historically, spurious states have been an undesirable side effect of training a Hopfield network, and there has been much research into detecting and discarding these unwanted states...
April 25, 2017: Neural Networks: the Official Journal of the International Neural Network Society
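To make the autoassociative behaviour (and the origin of spurious states) concrete, here is a minimal Hopfield memory sketch with Hebbian storage and asynchronous recall; the network size, pattern count, and noise level are illustrative assumptions rather than the paper's experimental setup.

    # Minimal Hopfield autoassociative memory: Hebbian storage, asynchronous recall.
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 100, 5                               # assumed network size and number of stored patterns
    patterns = rng.choice([-1, 1], size=(P, N))

    W = (patterns.T @ patterns) / N             # Hebbian weight matrix
    np.fill_diagonal(W, 0.0)

    def recall(x, sweeps=20):
        x = x.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):        # asynchronous updates in random order
                x[i] = 1 if W[i] @ x >= 0 else -1
        return x

    probe = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)  # flip ~10% of the bits
    print(np.mean(recall(probe) == patterns[0]))                # fraction of bits recovered

States recovered this way are usually the stored patterns, but mixtures of them (the spurious states discussed in the abstract) can also be reached from some starting points.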
https://www.readbyqxmd.com/read/28489551/on-the-dynamics-of-hopfield-neural-networks-on-unit-quaternions
#5
Marcos Eduardo Valle, Fidelis Zanetti de Castro
In this paper, we first address the dynamics of the elegant multivalued quaternionic Hopfield neural network (MV-QHNN) proposed by Minemoto et al. Contrary to what was expected, we show that the MV-QHNN, as well as one of its variations, does not always come to rest at an equilibrium state under the usual conditions. In fact, we provide simple examples in which the network yields a periodic sequence of quaternionic state vectors. Afterward, we turn our attention to the continuous-valued quaternionic Hopfield neural network (CV-QHNN), which can be derived from the MV-QHNN by means of a limit process...
May 5, 2017: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/28458684/modeling-the-attractor-landscape-of-disease-progression-a-network-based-approach
#6
Atefeh Taherian Fard, Mark A Ragan
Genome-wide regulatory networks enable cells to function, develop, and survive. Perturbation of these networks can lead to the appearance of a disease phenotype. Inspired by Conrad Waddington's epigenetic landscape of cell development, we use a Hopfield network formalism to construct an attractor landscape model of disease progression based on protein- or gene-correlation networks of Parkinson's disease, glioma, and colorectal cancer. Attractors in this landscape correspond to normal and disease states of the cell...
2017: Frontiers in Genetics
https://www.readbyqxmd.com/read/28422696/new-conditions-for-global-asymptotic-stability-of-memristor-neural-networks
#7
Mauro Di Marco, Mauro Forti, Luca Pancioni
Recent papers in the literature introduced a class of neural networks (NNs) with memristors, named dynamic-memristor (DM) NNs, in which the analog processing takes place in the charge-flux domain instead of the typical current-voltage domain, as is the case for Hopfield NNs and standard cellular NNs. One key advantage is that, when a steady state is reached, all currents, voltages, and power of a DM-NN drop off, whereas the memristors act as nonvolatile memories that store the processing result. Previous work in the literature addressed multistability of DM-NNs, i...
April 12, 2017: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/28369106/density-based-clustering-a-landscape-view-of-multi-channel-neural-data-for-inference-and-dynamic-complexity-analysis
#8
Gabriel Baglietto, Guido Gigante, Paolo Del Giudice
Two partially interwoven hot topics in the analysis and statistical modeling of neural data are the development of efficient and informative representations of the time series derived from multiple neural recordings, and the extraction of information about the connectivity structure of the underlying neural network from the recorded neural activities. In the present paper we show that state-space clustering can provide an easy and effective option for reducing the dimensionality of multiple neural time series, that it can improve inference of synaptic couplings from neural activities, and that it can also allow the construction of a compact representation of the multi-dimensional dynamics, which easily lends itself to complexity measures...
2017: PloS One
https://www.readbyqxmd.com/read/28297857/mean-field-message-passing-equations-in-the-hopfield-model-and-its-generalizations
#9
Marc Mézard
Motivated by recent progress in using restricted Boltzmann machines as preprocessing algorithms for deep neural networks, we revisit the mean-field equations [belief-propagation and Thouless-Anderson-Palmer (TAP) equations] in the best understood of such machines, namely the Hopfield model of neural networks, and we make explicit how they can be used as iterative message-passing algorithms, providing a fast method to compute the local polarizations of neurons. In the "retrieval phase", where neurons polarize in the direction of one memorized pattern, we point out a major difference between the belief propagation and TAP equations: the set of belief propagation equations depends on the pattern which is retrieved, while one can use a unique set of TAP equations...
February 2017: Physical Review. E
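As a rough illustration of what iteratively computing local polarizations means, the sketch below runs a naive mean-field fixed-point iteration for a Hopfield model in the retrieval phase. It omits the Onsager-type correction of the TAP equations and the message-passing structure of belief propagation discussed in the paper; the load, temperature, and initialization are assumptions chosen only so the iteration converges.

    # Naive mean-field iteration for local polarizations m_i in a Hopfield model
    # (illustrative; the paper's BP/TAP equations add terms this sketch omits).
    import numpy as np

    rng = np.random.default_rng(2)
    N, P, beta = 200, 10, 2.0                    # assumed size, load, and inverse temperature
    xi = rng.choice([-1, 1], size=(P, N))        # stored patterns
    J = (xi.T @ xi) / N                          # Hebbian couplings
    np.fill_diagonal(J, 0.0)

    m = 0.5 * xi[0].astype(float)                # initialize near pattern 0 (retrieval phase)
    for _ in range(200):
        m = np.tanh(beta * (J @ m))              # mean-field update of every polarization
    print("overlap with retrieved pattern:", (xi[0] @ m) / N)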
https://www.readbyqxmd.com/read/28208341/effect-of-similarity-between-patterns-in-associative-memory
#10
Sheng-Jun Wang, Zhou Yang
We study the stability of patterns in Hopfield networks in which some of the memorized patterns are similar to one another. The similarity between patterns affects the stability of these patterns, while the stability of the other, independent patterns changes only slightly. We show that the stability of patterns is affected in different ways by similarity. For networks storing a larger number of patterns, the similarity between patterns enhances pattern stability. However, the stability of patterns can be weakened by the similarity when networks store fewer patterns, and the relation between pattern stability and similarity is nonmonotonic...
January 2017: Physical Review. E
https://www.readbyqxmd.com/read/28182561/decomposition-of-rotor-hopfield-neural-networks-using-complex-numbers
#11
Masaki Kobayashi
A complex-valued Hopfield neural network (CHNN) is a multistate model of a Hopfield neural network. It has the disadvantage of low noise tolerance. Meanwhile, a symmetric CHNN (SCHNN) is a modification of a CHNN that improves noise tolerance. Furthermore, a rotor Hopfield neural network (RHNN) is an extension of a CHNN. It has twice the storage capacity of CHNNs and SCHNNs, and much better noise tolerance than CHNNs, although it requires twice as many connection parameters. In this brief, we investigate the relations between CHNNs, SCHNNs, and RHNNs; an RHNN is uniquely decomposed into a CHNN and an SCHNN...
February 2, 2017: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/28182559/quantum-inspired-multidirectional-associative-memory-with-a-self-convergent-iterative-learning
#12
Naoki Masuyama, Chu Kiong Loo, Manjeevan Seera, Naoyuki Kubota
Quantum-inspired computing is an emerging research area, which has significantly improved the capabilities of conventional algorithms. In general, quantum-inspired Hopfield associative memory (QHAM) has demonstrated quantum information processing in neural structures. This has resulted in an exponential increase in storage capacity while explaining the extensive memory, and it has the potential to illustrate the dynamics of neurons in the human brain when viewed from a quantum mechanics perspective, although the application of QHAM is limited to autoassociation...
February 6, 2017: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/28174616/optimal-path-finding-through-mental-exploration-based-on-neural-energy-field-gradients
#13
Yihong Wang, Rubin Wang, Yating Zhu
Rodents can accomplish self-locating and path-finding tasks by forming a cognitive map in the hippocampus that represents the environment. In the classical model of the cognitive map, the system (an artificial animal) needs large amounts of physical exploration of the spatial environment to solve path-finding problems, which costs too much time and energy. Although Hopfield's mental exploration model makes up for the deficiency mentioned above, the resulting path is still not efficient enough. Moreover, his model mainly focused on the artificial neural network, and its clear physiological meaning has not been addressed...
February 2017: Cognitive Neurodynamics
https://www.readbyqxmd.com/read/28113645/quantized-synchronization-of-chaotic-neural-networks-with-scheduled-output-feedback-control
#14
Ying Wan, Jinde Cao, Guanghui Wen
In this paper, the synchronization problem of master-slave chaotic neural networks with remote sensors, quantization processes, and communication time delays is investigated. The information communication channel between the master chaotic neural network and the slave chaotic neural network consists of several remote sensors, with each sensor able to access only partial knowledge of the output information of the master neural network. At each sampling instant, each sensor updates its own measurement, and only one sensor is scheduled to transmit its latest information to the controller's side in order to update the control inputs for the slave neural network...
August 24, 2016: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/28055918/stability-of-rotor-hopfield-neural-networks-with-synchronous-mode
#15
Masaki Kobayashi
A complex-valued Hopfield neural network (CHNN) is a model of a Hopfield neural network using multistate neurons. The stability conditions of CHNNs have been widely studied. A CHNN with a synchronous mode will converge to a fixed point or a cycle of length 2. A rotor Hopfield neural network (RHNN) is also a model of a multistate Hopfield neural network. RHNNs have much higher storage capacity and noise tolerance than CHNNs. We extend the theories regarding the stability of CHNNs to RHNNs. In addition, we investigate the stability of RHNNs with the projection rule...
December 29, 2016: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/28018202/further-work-on-the-shaping-of-cortical-development-and-function-by-synchrony-and-metabolic-competition
#16
James J Wright, Paul D Bourke
This paper furthers our attempts to resolve two major controversies: whether gamma synchrony plays a role in cognition, and whether cortical columns are functionally important. We have previously argued that the configuration of cortical cells that emerges in development is that which maximizes the magnitude of synchronous oscillation and minimizes metabolic cost. Here we analyze the separate effects in development of minimization of axonal lengths, and of early Hebbian learning and selective distribution of resources to growing synapses, by showing in simulations that these effects are partially antagonistic, but their interaction during development produces accurate anatomical and functional properties for both columnar and non-columnar cortex...
2016: Frontiers in Computational Neuroscience
https://www.readbyqxmd.com/read/27893400/multivariate-cryptography-based-on-clipped-hopfield-neural-network
#17
Jia Wang, Lee-Ming Cheng, Tong Su
Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in conventional and quantum computational environments continues to be a challenging research area. In this paper, we describe multivariate public key cryptosystems based on an extended clipped Hopfield neural network (CHNN) and implement them using the MVC (CHNN-MVC) framework operated in GF(p) space. The Diffie-Hellman key exchange algorithm is extended into the matrix field, which illustrates the feasibility of its new applications in both classic and post-quantum cryptography...
November 23, 2016: IEEE Transactions on Neural Networks and Learning Systems
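For readers less familiar with the key-exchange step being generalized, here is the textbook Diffie-Hellman exchange over GF(p) in scalar form; the paper's contribution is to extend this to matrices derived from the clipped Hopfield network, which is not reproduced here. The modulus, generator, and exponents below are toy values and are not secure.

    # Scalar Diffie-Hellman over GF(p), for orientation only (toy parameters).
    p = 4294967291          # a prime modulus (assumed toy value)
    g = 5                   # public generator

    a = 123456789           # Alice's secret exponent
    b = 987654321           # Bob's secret exponent

    A = pow(g, a, p)        # Alice sends A to Bob
    B = pow(g, b, p)        # Bob sends B to Alice

    assert pow(B, a, p) == pow(A, b, p)   # both sides derive the same shared secret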
https://www.readbyqxmd.com/read/27891202/new-delay-interval-dependent-stability-criteria-for-switched-hopfield-neural-networks-of-neutral-type-with-successive-time-varying-delay-components
#18
R Manivannan, R Samidurai, Jinde Cao, Ahmed Alsaedi
This paper deals with delay-interval-dependent stability criteria for switched Hopfield neural networks of neutral type with successive time-varying delay components. A novel Lyapunov-Krasovskii (L-K) functional with triple integral terms, which involves more information on the state vectors of the neural networks and the upper bound of the successive time-varying delays, is constructed. By using Jensen's inequality and the Wirtinger double integral inequality, introducing some zero equations, and applying the reciprocal convex combination technique and Finsler's lemma, a novel delay-interval-dependent stability criterion is derived in terms of linear matrix inequalities, which can be efficiently solved via standard numerical software...
December 2016: Cognitive Neurodynamics
https://www.readbyqxmd.com/read/27870611/exponentially-long-orbits-in-hopfield-neural-networks
#19
Samuel P Muscinelli, Wulfram Gerstner, Johanni Brea
We show that Hopfield neural networks with synchronous dynamics and asymmetric weights admit stable orbits that form sequences of maximal length. For N units, these sequences have length 2^N; that is, they cover the full state space. We present a mathematical proof that maximal-length orbits exist for all N, and we provide a method to construct both the sequence and the weight matrix that allow its production. The orbit is relatively robust to dynamical noise, and perturbations of the optimal weights reveal other periodic orbits that are not maximal but typically still very long...
February 2017: Neural Computation
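The sketch below shows the synchronous (parallel) update rule the abstract refers to, together with a simple cycle-detection loop that measures orbit length. The random asymmetric weight matrix here is an assumption for illustration; it will typically yield short cycles, whereas the paper constructs weights whose orbit reaches the maximal length 2^N.

    # Synchronous Hopfield dynamics with asymmetric weights, plus cycle detection.
    import numpy as np

    rng = np.random.default_rng(3)
    N = 8
    W = rng.normal(size=(N, N))                  # asymmetric weights (illustrative, not the paper's construction)

    def step(x):
        return np.where(W @ x >= 0, 1, -1)       # synchronous (parallel) update of all units

    x = rng.choice([-1, 1], size=N)
    seen = {}
    t = 0
    while tuple(x) not in seen:                  # iterate until a state repeats
        seen[tuple(x)] = t
        x = step(x)
        t += 1
    print("orbit length:", t - seen[tuple(x)])   # period of the cycle that was reached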
https://www.readbyqxmd.com/read/27785935/defense-against-chip-cloning-attacks-based-on-fractional-hopfield-neural-networks
#20
Yi-Fei Pu, Zhang Yi, Ji-Liu Zhou
This paper presents a state-of-the-art application of fractional Hopfield neural networks (FHNNs) to defend against chip cloning attacks, and provides insight into why the proposed method is superior to physically unclonable functions (PUFs). In the past decade, PUFs have been evolving as one of the best types of hardware security. However, the development of PUFs has been somewhat limited by their implementation cost, temperature variation effects, electromagnetic interference effects, the amount of entropy in them, etc...
June 2017: International Journal of Neural Systems

Search Tips

Use Boolean operators: AND/OR

diabetic AND foot
diabetes OR diabetic

Exclude a word using the 'minus' sign

Virchow -triad

Use Parentheses

water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems

Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase

"primary prevention of cancer"

Combine operators, exclusions, and quoted phrases

(heart or cardiac or cardio*) AND arrest -"American Heart Association"