Read by QxMD

Hopfield network

Mauro Di Marco, Mauro Forti, Luca Pancioni
Recent papers in the literature introduced a class of neural networks (NNs) with memristors, named dynamic-memristor (DM) NNs, such that the analog processing takes place in the charge-flux domain, instead of the typical current-voltage domain as it happens for Hopfield NNs and standard cellular NNs. One key advantage is that, when a steady state is reached, all currents, voltages, and power of a DM-NN drop off, whereas the memristors act as nonvolatile memories that store the processing result. Previous work in the literature addressed multistability of DM-NNs, i...
April 12, 2017: IEEE Transactions on Neural Networks and Learning Systems
Gabriel Baglietto, Guido Gigante, Paolo Del Giudice
Two partially interwoven hot topics in the analysis and statistical modeling of neural data are the development of efficient and informative representations of the time series derived from multiple neural recordings, and the extraction of information about the connectivity structure of the underlying neural network from the recorded neural activities. In the present paper we show that state-space clustering can provide an easy and effective option for reducing the dimensionality of multiple neural time series, that it can improve inference of synaptic couplings from neural activities, and that it can also allow the construction of a compact representation of the multi-dimensional dynamics, one that easily lends itself to complexity measures...
2017: PloS One
Marc Mézard
Motivated by recent progress in using restricted Boltzmann machines as preprocessing algorithms for deep neural networks, we revisit the mean-field equations [belief-propagation and Thouless-Anderson-Palmer (TAP) equations] in the best understood of such machines, namely the Hopfield model of neural networks, and we make explicit how they can be used as iterative message-passing algorithms, providing a fast method to compute the local polarizations of neurons. In the "retrieval phase", where neurons polarize in the direction of one memorized pattern, we point out a major difference between the belief propagation and TAP equations: the set of belief propagation equations depends on the pattern that is retrieved, while one can use a unique set of TAP equations...
February 2017: Physical Review. E
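For readers unfamiliar with the baseline model these papers build on, here is a minimal sketch of a classical Hopfield network with Hebbian storage and asynchronous sign updates. This is the standard textbook formulation, not the message-passing machinery of the paper above; the network size, number of patterns, and seeds are arbitrary illustrative choices.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T, with zero diagonal."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=100, seed=None):
    """Asynchronous sign updates until a fixed point or the step limit."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(steps):
        changed = False
        for i in rng.permutation(len(s)):
            s_new = 1 if W[i] @ s >= 0 else -1
            if s_new != s[i]:
                s[i] = s_new
                changed = True
        if not changed:          # fixed point reached
            break
    return s

# Store two random +/-1 patterns and recall from a noisy probe.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 64))
W = hebbian_weights(patterns)
probe = patterns[0].copy()
probe[:6] *= -1                  # corrupt 6 of 64 bits
out = recall(W, probe, seed=1)
print((out == patterns[0]).mean())
```

With only two stored patterns in a 64-unit network, the probe lies well inside the basin of attraction, so the corrupted bits are repaired.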
Sheng-Jun Wang, Zhou Yang
We study the stability of patterns in Hopfield networks in which a part of memorized patterns are similar. The similarity between patterns impacts the stability of these patterns, but the stability of other independent patterns is only changed slightly. We show that the stability of patterns is affected in different ways by similarity. For networks storing a number of patterns, the similarity between patterns enhances the pattern stability. However, the stability of patterns can be weakened by the similarity when networks store fewer patterns, and the relation between the stability of patterns and similarity is nonmonotonic...
January 2017: Physical Review. E
Masaki Kobayashi
A complex-valued Hopfield neural network (CHNN) is a multistate model of a Hopfield neural network. It has the disadvantage of low noise tolerance. Meanwhile, a symmetric CHNN (SCHNN) is a modification of a CHNN that improves noise tolerance. Furthermore, a rotor Hopfield neural network (RHNN) is an extension of a CHNN. It has twice the storage capacity of CHNNs and SCHNNs, and much better noise tolerance than CHNNs, although it requires twice as many connection parameters. In this brief, we investigate the relations among CHNNs, SCHNNs, and RHNNs; an RHNN is uniquely decomposed into a CHNN and an SCHNN...
February 2, 2017: IEEE Transactions on Neural Networks and Learning Systems
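A minimal sketch of the multistate update underlying CHNNs, assuming the common formulation in which each neuron takes one of K values on the unit circle and the activation maps the complex local field to the nearest K-th root of unity. K, the Hermitian weight construction, and the network size are illustrative assumptions, not details from the paper.

```python
import numpy as np

K = 8                                        # states per neuron (K-th roots of unity)
phases = np.exp(2j * np.pi * np.arange(K) / K)

def quantize(h):
    """Map a (nonzero) complex local field to the nearest K-th root of unity."""
    return phases[np.argmin(np.abs(phases - h / abs(h)))]

def chnn_step(W, s):
    """One synchronous update of a complex-valued Hopfield network."""
    return np.array([quantize(W[i] @ s) for i in range(len(s))])

# Tiny example: 3 neurons, Hermitian weights (a common stability condition).
rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
W = A + A.conj().T                           # Hermitian, zero diagonal below
np.fill_diagonal(W, 0)
s0 = phases[rng.integers(0, K, size=3)]
s1 = chnn_step(W, s0)
print(np.allclose(np.abs(s1), 1.0))          # True: states stay on the unit circle
```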
Naoki Masuyama, Chu Kiong Loo, Manjeevan Seera, Naoyuki Kubota
Quantum-inspired computing is an emerging research area, which has significantly improved the capabilities of conventional algorithms. In general, quantum-inspired Hopfield associative memory (QHAM) has demonstrated quantum information processing in neural structures. This has resulted in an exponential increase in storage capacity while explaining the extensive memory, and it has the potential to illustrate the dynamics of neurons in the human brain when viewed from a quantum mechanics perspective, although the application of QHAM is limited as an autoassociation...
February 6, 2017: IEEE Transactions on Neural Networks and Learning Systems
Yihong Wang, Rubin Wang, Yating Zhu
Rodents can accomplish self-locating and path-finding tasks by forming a cognitive map in the hippocampus that represents the environment. In the classical model of the cognitive map, the system (artificial animal) needs large amounts of physical exploration to learn the spatial environment and solve path-finding problems, which costs too much time and energy. Although Hopfield's mental exploration model makes up for the deficiency mentioned above, the path is still not efficient enough. Moreover, his model mainly focused on the artificial neural network, and clear physiological meaning has not been addressed...
February 2017: Cognitive Neurodynamics
Ying Wan, Jinde Cao, Guanghui Wen
In this paper, the synchronization problem of master-slave chaotic neural networks with remote sensors, quantization process, and communication time delays is investigated. The information communication channel between the master chaotic neural network and slave chaotic neural network consists of several remote sensors, with each sensor able to access only partial knowledge of output information of the master neural network. At each sampling instant, each sensor updates its own measurement and only one sensor is scheduled to transmit its latest information to the controller's side in order to update the control inputs for the slave neural network...
August 24, 2016: IEEE Transactions on Neural Networks and Learning Systems
Masaki Kobayashi
A complex-valued Hopfield neural network (CHNN) is a model of a Hopfield neural network using multistate neurons. The stability conditions of CHNNs have been widely studied. A CHNN with a synchronous mode will converge to a fixed point or a cycle of length 2. A rotor Hopfield neural network (RHNN) is also a model of a multistate Hopfield neural network. RHNNs have much higher storage capacity and noise tolerance than CHNNs. We extend the theories regarding the stability of CHNNs to RHNNs. In addition, we investigate the stability of RHNNs with the projection rule...
December 29, 2016: IEEE Transactions on Neural Networks and Learning Systems
James J Wright, Paul D Bourke
This paper furthers our attempts to resolve two major controversies-whether gamma synchrony plays a role in cognition, and whether cortical columns are functionally important. We have previously argued that the configuration of cortical cells that emerges in development is that which maximizes the magnitude of synchronous oscillation and minimizes metabolic cost. Here we analyze the separate effects in development of minimization of axonal lengths, and of early Hebbian learning and selective distribution of resources to growing synapses, by showing in simulations that these effects are partially antagonistic, but their interaction during development produces accurate anatomical and functional properties for both columnar and non-columnar cortex...
2016: Frontiers in Computational Neuroscience
Jia Wang, Lee-Ming Cheng, Tong Su
Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in conventional and quantum computational environments has continued to be a challenging research area in recent years. In this paper, we describe multivariate public key cryptosystems based on an extended Clipped Hopfield Neural Network (CHNN) and implement them using the MVC (CHNN-MVC) framework operated in GF(p) space. The Diffie-Hellman key exchange algorithm is extended into the matrix field, which illustrates the feasibility of its new applications in both classic and postquantum cryptography...
November 23, 2016: IEEE Transactions on Neural Networks and Learning Systems
R Manivannan, R Samidurai, Jinde Cao, Ahmed Alsaedi
This paper deals with the problem of delay-interval-dependent stability criteria for switched Hopfield neural networks of neutral type with successive time-varying delay components. A novel Lyapunov-Krasovskii (L-K) functional with triple integral terms, which involves more information on the state vectors of the neural networks and the upper bounds of the successive time-varying delays, is constructed. By using the well-known Jensen inequality and the Wirtinger double integral inequality, introducing some zero equations, and using the reciprocal convex combination technique and Finsler's lemma, a novel delay-interval-dependent stability criterion is derived in terms of linear matrix inequalities, which can be efficiently solved via standard numerical software...
December 2016: Cognitive Neurodynamics
Samuel P Muscinelli, Wulfram Gerstner, Johanni Brea
We show that Hopfield neural networks with synchronous dynamics and asymmetric weights admit stable orbits that form sequences of maximal length. For N units, these sequences have length L = 2^N; that is, they cover the full state space. We present a mathematical proof that maximal-length orbits exist for all N, and we provide a method to construct both the sequence and the weight matrix that allow its production. The orbit is relatively robust to dynamical noise, and perturbations of the optimal weights reveal other periodic orbits that are not maximal but typically still very long...
February 2017: Neural Computation
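The synchronous sign dynamics discussed above can be probed with a short simulation that measures orbit periods. The toy weight matrix below is a simple cyclic rotation, not the paper's maximal-length construction; it merely illustrates how asymmetric weights produce periodic orbits longer than the fixed points and 2-cycles guaranteed for symmetric weights.

```python
import numpy as np
from itertools import product

def step(W, s):
    """Synchronous update: s'_i = sign(sum_j W_ij s_j), with sign(0) = +1."""
    return np.where(W @ s >= 0, 1, -1)

def cycle_length(W, s0, max_iter=1000):
    """Iterate from s0 and return the period of the orbit eventually entered."""
    seen = {}
    s = np.array(s0)
    for t in range(max_iter):
        key = tuple(s)
        if key in seen:
            return t - seen[key]
        seen[key] = t
        s = step(W, s)
    return None

# Asymmetric example on N = 3 units: W implements s'_i = s_{i+1} (a rotation),
# so non-uniform states cycle with period 3 and uniform states are fixed.
N = 3
W = np.roll(np.eye(N), 1, axis=1)
lengths = {cycle_length(W, np.array(s)) for s in product([-1, 1], repeat=N)}
print(lengths)                   # {1, 3}: fixed points and 3-cycles
```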
Yi-Fei Pu, Zhang Yi, Ji-Liu Zhou
This paper presents a state-of-the-art application of fractional Hopfield neural networks (FHNNs) to defend against chip cloning attacks, and provides insight into why the proposed method is superior to physically unclonable functions (PUFs). In the past decade, PUFs have evolved into one of the best types of hardware security. However, the development of PUFs has been somewhat limited by their implementation cost, temperature variation effects, electromagnetic interference effects, the amount of entropy they contain, etc...
June 2017: International Journal of Neural Systems
I Recio, J J Torres
We study emerging phenomena in binary neural networks where, with probability c, synaptic intensities are chosen according to a Hebbian prescription, and with probability (1-c) there is an extra random contribution to the synaptic weights. This new term, drawn at random from a bimodal Gaussian distribution, balances the synaptic population in the network so that the E/I population ratio is 80%-20%, mimicking the balance observed in the mammalian cortex. For some regions of the relevant parameters, our system exhibits standard memory attractors (at low temperature) and non-memory attractors (at high temperature)...
December 2016: Neural Networks: the Official Journal of the International Neural Network Society
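A rough sketch of how such a mixed weight matrix might be assembled, assuming the Hebbian term is the usual outer-product rule and crudely modeling the bimodal Gaussian contribution with an 80/20 sign split. All parameter values (mu, sigma, c, sizes) are hypothetical choices for illustration, not the paper's.

```python
import numpy as np

def mixed_weights(patterns, c, mu=1.0, sigma=0.1, seed=None):
    """With probability c take the Hebbian weight; with probability (1-c)
    draw from a bimodal Gaussian, N(+mu, sigma) vs N(-mu, sigma), with an
    80/20 split between positive and negative modes (assumed, for E/I)."""
    rng = np.random.default_rng(seed)
    P, N = patterns.shape
    hebb = patterns.T @ patterns / N
    use_hebb = rng.random((N, N)) < c
    signs = rng.choice([1.0, -1.0], size=(N, N), p=[0.8, 0.2])
    noise = rng.normal(mu, sigma, size=(N, N)) * signs
    W = np.where(use_hebb, hebb, noise)
    W = (W + W.T) / 2                # symmetrize (one common modeling choice)
    np.fill_diagonal(W, 0.0)
    return W

rng = np.random.default_rng(3)
pats = rng.choice([-1, 1], size=(3, 50))
W = mixed_weights(pats, c=0.8, seed=4)
print(W.shape, np.allclose(W, W.T))
```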
Dmytro Bielievtsov, Josef Ladenbauer, Klaus Obermayer
We consider a general class of stochastic networks and ask which network nodes need to be controlled, and how, to stabilize and switch between desired metastable (target) states in terms of the first and second statistical moments of the system. We first show that it is sufficient to directly interfere with a subset of nodes which can be identified using information about the graph of the network only. Then we develop a suitable method for feedback control which acts on that subset of nodes and preserves the covariance structure of the desired target state...
July 2016: Physical Review. E
Aurélien Decelle, Federico Ricci-Tersenghi
In this work we explain how to properly use mean-field methods to solve the inverse Ising problem when the phase space is clustered, that is, many states are present. The clustering of the phase space can occur for many reasons, e.g., when a system undergoes a phase transition, but also when data are collected in different regimes (e.g., quiescent and spiking regimes in neural networks). Mean-field methods for the inverse Ising problem are typically used without taking into account the possible clustered structure of the input configurations and may lead to very poor inference (e...
July 2016: Physical Review. E
Emre O Neftci, Bruno U Pedroni, Siddharth Joshi, Maruan Al-Shedivat, Gert Cauwenberghs
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised learning. Similar to the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks, but where stochasticity is induced by a random mask over the connections...
2016: Frontiers in Neuroscience
Yi-Fei Pu, Zhang Yi, Ji-Liu Zhou
This paper mainly discusses a novel conceptual framework: fractional Hopfield neural networks (FHNN). As is commonly known, fractional calculus has been incorporated into artificial neural networks, mainly because of its long-term memory and nonlocality. Some researchers have made interesting attempts at fractional neural networks and gained competitive advantages over integer-order neural networks. It therefore naturally makes one ponder how to generalize first-order Hopfield neural networks to fractional-order ones, and how to implement FHNN by means of fractional calculus...
July 14, 2016: IEEE Transactions on Neural Networks and Learning Systems
Janis Klaise, Samuel Johnson
Trophic coherence, a measure of the extent to which the nodes of a directed network are organised in levels, has recently been shown to be closely related to many structural and dynamical aspects of complex systems, including graph eigenspectra, the prevalence or absence of feedback cycles, and linear stability. Furthermore, non-trivial trophic structures have been observed in networks of neurons, species, genes, metabolites, cellular signalling, concatenated words, P2P users, and world trade. Here, we consider two simple yet apparently quite different dynamical models-one a susceptible-infected-susceptible epidemic model adapted to include complex contagion and the other an Amari-Hopfield neural network-and show that in both cases the related spreading processes are modulated in similar ways by the trophic coherence of the underlying networks...
June 2016: Chaos
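Trophic coherence, as commonly defined for directed graphs, can be computed with a few lines of linear algebra. The sketch below uses the standard trophic-level equations (basal nodes sit at level 1; every other node sits one level above the mean of its in-neighbors) and measures incoherence as the spread of level differences over edges; it is an illustration of the concept, not the authors' code.

```python
import numpy as np

def trophic_levels(A):
    """Solve s_i = 1 + (1/k_i) * sum_j A[j, i] * s_j, where A[j, i] = 1 for an
    edge j -> i and k_i is the in-degree; basal nodes (k_i = 0) get level 1."""
    N = A.shape[0]
    k_in = A.sum(axis=0)
    M = np.eye(N)
    for i in range(N):
        if k_in[i] > 0:
            M[i] -= A[:, i] / k_in[i]
    return np.linalg.solve(M, np.ones(N))

def incoherence(A):
    """q = standard deviation of (s_target - s_source) over all edges;
    q = 0 corresponds to a perfectly layered (maximally coherent) network."""
    s = trophic_levels(A)
    src, dst = np.nonzero(A)
    return np.std(s[dst] - s[src])

# A perfectly layered 4-node chain: 0 -> 1 -> 2 -> 3.
A = np.zeros((4, 4))
A[0, 1] = A[1, 2] = A[2, 3] = 1
print(trophic_levels(A))         # [1. 2. 3. 4.]
print(incoherence(A))            # 0.0
```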