Read by QxMD: search results for "hopfield network"
https://www.readbyqxmd.com/read/27893400/multivariate-cryptography-based-on-clipped-hopfield-neural-network
#1
Jia Wang, Lee-Ming Cheng, Tong Su
Designing secure and efficient multivariate public key cryptosystems (multivariate cryptography, MVC) to strengthen the security of RSA and ECC in conventional and quantum computational environments has remained a challenging research topic in recent years. In this paper, we describe multivariate public key cryptosystems based on an extended Clipped Hopfield Neural Network (CHNN) and implement them using the MVC (CHNN-MVC) framework operated in GF(p) space. The Diffie-Hellman key exchange algorithm is extended into the matrix field, which illustrates the feasibility of new applications in both classic and post-quantum cryptography...
November 23, 2016: IEEE Transactions on Neural Networks and Learning Systems
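The abstract's mention of Diffie-Hellman "extended into the matrix field" can be illustrated with a toy sketch: powers of a single public matrix over GF(p) commute, so the usual key-agreement identity carries over. The prime, base matrix, and exponents below are made-up demonstration values, not the paper's CHNN-based construction.

```python
# Illustrative sketch only: Diffie-Hellman lifted to matrices over GF(p).
# All parameters here are toy choices for demonstration.

P = 251  # small prime modulus (toy value; far too small for real security)

def mat_mul(A, B, p=P):
    """Multiply two square matrices modulo p."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) % p
             for j in range(n)] for i in range(n)]

def mat_pow(M, e, p=P):
    """Square-and-multiply matrix exponentiation modulo p."""
    n = len(M)
    R = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # identity
    while e:
        if e & 1:
            R = mat_mul(R, M, p)
        M = mat_mul(M, M, p)
        e >>= 1
    return R

# Public base matrix (in practice it must generate a large cyclic group).
G = [[2, 3], [1, 4]]

a, b = 57, 94          # Alice's and Bob's secret exponents
A_pub = mat_pow(G, a)  # Alice publishes G^a
B_pub = mat_pow(G, b)  # Bob publishes G^b

# Powers of the same matrix commute, so both sides derive the same key.
K_alice = mat_pow(B_pub, a)
K_bob = mat_pow(A_pub, b)
assert K_alice == K_bob
```

The commutativity G^a G^b = G^b G^a is what makes the exchange work; arbitrary matrix pairs would not commute, which is why the base matrix is shared.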
https://www.readbyqxmd.com/read/27891202/new-delay-interval-dependent-stability-criteria-for-switched-hopfield-neural-networks-of-neutral-type-with-successive-time-varying-delay-components
#2
R Manivannan, R Samidurai, Jinde Cao, Ahmed Alsaedi
This paper deals with delay-interval-dependent stability criteria for switched Hopfield neural networks of neutral type with successive time-varying delay components. A novel Lyapunov-Krasovskii (L-K) functional with triple integral terms is constructed, which involves more information on the state vectors of the neural networks and the upper bounds of the successive time-varying delays. By using Jensen's inequality, the Wirtinger double integral inequality, some zero equations, the reciprocal convex combination technique, and Finsler's lemma, a novel delay-interval-dependent stability criterion is derived in terms of linear matrix inequalities, which can be efficiently solved via standard numerical software...
December 2016: Cognitive Neurodynamics
https://www.readbyqxmd.com/read/27870611/exponentially-long-orbits-in-hopfield-neural-networks
#3
Samuel P Muscinelli, Wulfram Gerstner, Johanni Brea
We show that Hopfield neural networks with synchronous dynamics and asymmetric weights admit stable orbits that form sequences of maximal length. For N units, these sequences have length T = 2^N; that is, they cover the full state space. We present a mathematical proof that maximal-length orbits exist for all N, and we provide a method to construct both the sequence and the weight matrix that allow its production. The orbit is relatively robust to dynamical noise, and perturbations of the optimal weights reveal other periodic orbits that are not maximal but typically still very long...
November 21, 2016: Neural Computation
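The setting of this paper — synchronous updates s(t+1) = sign(W s(t)) with asymmetric weights, and orbits measured by when a state first repeats — can be sketched as follows. The 3-unit weight matrix is an arbitrary small example for illustration, not the paper's maximal-length construction.

```python
# Toy illustration: synchronous Hopfield dynamics with asymmetric weights,
# plus a helper that measures the period of the orbit the state falls into.

def step(W, s):
    """One synchronous update; sign(0) is mapped to +1 by convention."""
    n = len(s)
    return tuple(1 if sum(W[i][j] * s[j] for j in range(n)) >= 0 else -1
                 for i in range(n))

def orbit_length(W, s0, max_steps=1000):
    """Iterate until a state repeats; return the period of the orbit."""
    seen = {}
    s, t = tuple(s0), 0
    while s not in seen and t < max_steps:
        seen[s] = t
        s = step(W, s)
        t += 1
    return t - seen[s]  # period = current time minus time of first visit

# Asymmetric 3-unit example (note W[i][j] != W[j][i]).
W = [[0, 1, -1],
     [-1, 0, 1],
     [1, -1, 0]]
period = orbit_length(W, (1, 1, -1))
assert 1 <= period <= 2 ** 3  # an orbit can cover at most the 2^n states
```

A maximal-length orbit in the paper's sense would have period exactly 2^n; this arbitrary matrix yields a shorter cycle, which is the typical case.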
https://www.readbyqxmd.com/read/27785935/defense-against-chip-cloning-attacks-based-on-fractional-hopfield-neural-networks
#4
Yi-Fei Pu, Zhang Yi, Ji-Liu Zhou
This paper presents a state-of-the-art application of fractional Hopfield neural networks (FHNNs) to defend against chip cloning attacks, and provides insight into why the proposed method is superior to physically unclonable functions (PUFs). In the past decade, PUFs have evolved into one of the leading forms of hardware security. However, their development has been somewhat limited by implementation cost, temperature variation effects, electromagnetic interference, the amount of entropy they provide, etc...
September 9, 2016: International Journal of Neural Systems
https://www.readbyqxmd.com/read/27721205/emergence-of-low-noise-frustrated-states-in-e-i-balanced-neural-networks
#5
I Recio, J J Torres
We study emerging phenomena in binary neural networks where, with probability c, synaptic intensities are chosen according to a Hebbian prescription, and with probability (1-c) there is an extra random contribution to the synaptic weights. This new term, randomly drawn from a Gaussian bimodal distribution, balances the synaptic population in the network so that the excitatory/inhibitory (E/I) population ratio is 80%-20%, mimicking the balance observed in mammalian cortex. For some regions of the relevant parameters, our system exhibits standard memory attractors (at low temperature) and non-memory attractors (at high temperature)...
September 8, 2016: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/27575147/controlling-statistical-moments-of-stochastic-dynamical-networks
#6
Dmytro Bielievtsov, Josef Ladenbauer, Klaus Obermayer
We consider a general class of stochastic networks and ask which network nodes need to be controlled, and how, to stabilize and switch between desired metastable (target) states in terms of the first and second statistical moments of the system. We first show that it is sufficient to directly interfere with a subset of nodes which can be identified using information about the graph of the network only. Then we develop a suitable method for feedback control which acts on that subset of nodes and preserves the covariance structure of the desired target state...
July 2016: Physical Review. E
https://www.readbyqxmd.com/read/27575082/solving-the-inverse-ising-problem-by-mean-field-methods-in-a-clustered-phase-space-with-many-states
#7
Aurélien Decelle, Federico Ricci-Tersenghi
In this work we explain how to properly use mean-field methods to solve the inverse Ising problem when the phase space is clustered, that is, when many states are present. The clustering of the phase space can occur for many reasons, e.g., when a system undergoes a phase transition, but also when data are collected in different regimes (e.g., quiescent and spiking regimes in neural networks). Mean-field methods for the inverse Ising problem are typically used without taking into account the possible clustered structure of the input configurations, which may lead to very poor inference (e...
July 2016: Physical Review. E
https://www.readbyqxmd.com/read/27445650/stochastic-synapses-enable-efficient-brain-inspired-learning-machines
#8
Emre O Neftci, Bruno U Pedroni, Siddharth Joshi, Maruan Al-Shedivat, Gert Cauwenberghs
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in the cortex. Here, we introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised learning. Similar to the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks, but one where stochasticity is induced by a random mask over the connections...
2016: Frontiers in Neuroscience
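The "random mask over the connections" idea can be sketched with a Hopfield-style update in which each synapse is independently blanked out before the local field is computed. The network size, stored pattern, and keep-probability below are toy assumptions for illustration, not the S2M model's actual parameters.

```python
import random

# Sketch: stochasticity from a random connection mask ("blank-out" noise)
# applied to a Hopfield-style update. All parameters are toy choices.

random.seed(0)  # fixed seed so the run is reproducible

def masked_update(W, s, p_keep=0.8):
    """One sweep where each synapse is independently dropped with
    probability 1 - p_keep before computing each unit's field."""
    n = len(s)
    out = list(s)
    for i in range(n):
        field = sum(W[i][j] * s[j]
                    for j in range(n)
                    if random.random() < p_keep)  # random connection mask
        out[i] = 1 if field >= 0 else -1
    return out

# Hebbian weights storing a single pattern, so the update should tend to
# restore it even though individual synapses are unreliable.
pattern = [1, -1, 1, -1, 1, -1]
n = len(pattern)
W = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
     for i in range(n)]

noisy = [-1] + pattern[1:]          # corrupt the first unit
recalled = masked_update(W, noisy)
assert recalled[0] == 1             # with this seed, the flipped unit is restored
```

Averaging many such masked sweeps is what turns the noise into a sampling mechanism in the S2M picture; a single sweep, as here, only shows the corrupted bit being pulled back toward the stored pattern.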
https://www.readbyqxmd.com/read/27429451/fractional-hopfield-neural-networks-fractional-dynamic-associative-recurrent-neural-networks
#9
Yi-Fei Pu, Zhang Yi, Ji-Liu Zhou
This paper mainly discusses a novel conceptual framework: fractional Hopfield neural networks (FHNN). As is commonly known, fractional calculus has been incorporated into artificial neural networks mainly because of its long-term memory and nonlocality. Some researchers have made interesting attempts at fractional neural networks and gained competitive advantages over integer-order neural networks. It is therefore natural to ask how to generalize first-order Hopfield neural networks to fractional-order ones, and how to implement FHNN by means of fractional calculus...
July 14, 2016: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/27368799/from-neurons-to-epidemics-how-trophic-coherence-affects-spreading-processes
#10
Janis Klaise, Samuel Johnson
Trophic coherence, a measure of the extent to which the nodes of a directed network are organised in levels, has recently been shown to be closely related to many structural and dynamical aspects of complex systems, including graph eigenspectra, the prevalence or absence of feedback cycles, and linear stability. Furthermore, non-trivial trophic structures have been observed in networks of neurons, species, genes, metabolites, cellular signalling, concatenated words, P2P users, and world trade. Here, we consider two simple yet apparently quite different dynamical models (one a susceptible-infected-susceptible epidemic model adapted to include complex contagion, the other an Amari-Hopfield neural network) and show that in both cases the related spreading processes are modulated in similar ways by the trophic coherence of the underlying networks...
June 2016: Chaos
https://www.readbyqxmd.com/read/27300910/memory-recall-and-spike-frequency-adaptation
#11
James P Roach, Leonard M Sander, Michal R Zochowski
The brain can reproduce memories from partial data; this ability is critical for memory recall. The process of memory recall has been studied using autoassociative networks such as the Hopfield model. This kind of model reliably converges to stored patterns that contain the memory. However, it is unclear how the behavior is controlled by the brain so that after convergence to one configuration, it can proceed with recognition of another one. In the Hopfield model, this happens only through unrealistic changes of an effective global temperature that destabilizes all stored configurations...
May 2016: Physical Review. E
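The autoassociative recall described in this abstract — a Hopfield network reliably converging from partial data to a stored pattern — has a standard textbook form: Hebbian storage followed by asynchronous threshold updates. A minimal sketch of that baseline (not of the paper's adaptation-based control mechanism):

```python
# Classic Hopfield recall: Hebbian storage, asynchronous updates.

def train(patterns):
    """Hebbian weights W[i][j] = sum over patterns of p_i * p_j (zero diagonal)."""
    n = len(patterns[0])
    return [[0 if i == j else sum(p[i] * p[j] for p in patterns)
             for j in range(n)] for i in range(n)]

def recall(W, state, sweeps=5):
    """Deterministic asynchronous updates, one unit at a time."""
    s = list(state)
    n = len(s)
    for _ in range(sweeps):
        for i in range(n):
            field = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if field >= 0 else -1
    return s

stored = [[1, 1, -1, -1, 1, -1], [-1, 1, 1, -1, -1, 1]]
W = train(stored)

cue = [1, 1, -1, -1, -1, -1]        # stored[0] with one unit flipped
assert recall(W, cue) == stored[0]  # the partial cue completes the memory
```

This is exactly the behavior the abstract calls reliable convergence; the paper's point is that escaping one such attractor to recall another requires a mechanism beyond this vanilla dynamics.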
https://www.readbyqxmd.com/read/27114266/building-a-minimum-frustration-framework-for-brain-functions-over-long-time-scales
#12
REVIEW
Arturo Tozzi, Tor Flå, James F Peters
The minimum frustration principle (MFP) is a computational approach stating that, over the long time scales of evolution, proteins' free energy decreases more than expected by thermodynamical constraints as their amino acids assume conformations progressively closer to the lowest energetic state. This Review shows that this general principle, borrowed from protein folding dynamics, can also be fruitfully applied to nervous function. Highlighting the foremost role of energetic requirements, macromolecular dynamics, and above all intertwined time scales in brain activity, the MFP elucidates a wide range of mental processes from sensations to memory retrieval...
August 2016: Journal of Neuroscience Research
https://www.readbyqxmd.com/read/26907568/basin-stability-in-delayed-dynamics
#13
Siyang Leng, Wei Lin, Jürgen Kurths
Basin stability (BS) is a universal concept in complex systems studies that focuses on the volume of the basin of attraction rather than on the traditional linearization-based approach. It has many applications in real-world systems, especially dynamical systems exhibiting multistability, which is even more ubiquitous in delayed dynamics such as firing neurons, climatological processes, and power grids. Because the space of initial values is infinite-dimensional, how to properly define the basin's volume for delayed dynamics remains a fundamental problem...
2016: Scientific Reports
https://www.readbyqxmd.com/read/26849875/symmetric-complex-valued-hopfield-neural-networks
#14
Masaki Kobayashi
Complex-valued neural networks, which are extensions of ordinary neural networks, have been studied as interesting models by many researchers. In particular, complex-valued Hopfield neural networks (CHNNs) have been used to process multilevel data, such as gray-scale images. CHNNs with Hermitian connection weights always converge under asynchronous update. However, the noise tolerance of CHNNs deteriorates severely as the resolution increases, and noise tolerance is one of the most critical problems for CHNNs. It is known that rotational invariance reduces noise tolerance...
January 28, 2016: IEEE Transactions on Neural Networks and Learning Systems
https://www.readbyqxmd.com/read/26732664/modeling-and-experimental-demonstration-of-a-hopfield-network-analog-to-digital-converter-with-hybrid-cmos-memristor-circuits
#15
Xinjie Guo, Farnood Merrikh-Bayat, Ligang Gao, Brian D Hoskins, Fabien Alibart, Bernabe Linares-Barranco, Luke Theogarajan, Christof Teuscher, Dmitri B Strukov
The purpose of this work was to demonstrate the feasibility of building recurrent artificial neural networks with hybrid complementary metal oxide semiconductor (CMOS)/memristor circuits. To do so, we modeled a Hopfield network implementing an analog-to-digital converter (ADC) with up to 8 bits of precision. Major shortcomings affecting the ADC's precision, such as the non-ideal behavior of CMOS circuitry and the specific limitations of memristors, were investigated and an effective solution was proposed, capitalizing on the in-field programmability of memristors...
2015: Frontiers in Neuroscience
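The Hopfield ADC mentioned above settles into the bit pattern minimizing an energy of the form E(b) = (x - Σᵢ 2ⁱ bᵢ)². As a sketch of what that energy minimum encodes (not of the analog network dynamics or the memristor hardware), the code below searches the energy landscape exhaustively:

```python
from itertools import product

# Brute-force view of the Hopfield ADC energy function: the global
# minimum is the binary code closest to the analog input.

def adc_energy(bits, x):
    """Quadratic mismatch between the analog input and the encoded value."""
    value = sum(b * 2 ** i for i, b in enumerate(bits))
    return (x - value) ** 2

def ideal_adc(x, n_bits=4):
    """Bit vector (LSB first) at the global energy minimum."""
    return min(product((0, 1), repeat=n_bits),
               key=lambda b: adc_energy(b, x))

bits = ideal_adc(6.2)
assert sum(b * 2 ** i for i, b in enumerate(bits)) == 6  # nearest 4-bit code
```

The network solves the same minimization with analog dynamics in constant time; the paper's contribution is keeping that minimum accurate despite non-ideal CMOS and memristor behavior.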
https://www.readbyqxmd.com/read/26732491/neural-network-model-of-memory-retrieval
#16
Stefano Recanatesi, Mikhail Katkov, Sandro Romani, Misha Tsodyks
Human memory can store large amounts of information. Nevertheless, recall is often a challenging task. In a classical free recall paradigm, where participants are asked to repeat a briefly presented list of words, people make mistakes for lists as short as 5 words. We present a model for memory retrieval based on a Hopfield neural network where transitions between items are determined by similarities in their long-term memory representations. Mean-field analysis of the model reveals stable states of the network corresponding to (1) single memory representations and (2) intersections between memory representations...
2015: Frontiers in Computational Neuroscience
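The transition rule the abstract describes — recall moving from the current item to the stored item with the most similar representation — can be sketched in a few lines. The binary feature vectors below are made-up toy representations, not the model's learned ones.

```python
# Sketch: similarity-driven transitions between recalled items.

def overlap(a, b):
    """Number of features two item representations share."""
    return sum(x & y for x, y in zip(a, b))

def next_item(items, current):
    """Index of the most similar item other than the current one."""
    candidates = [k for k in range(len(items)) if k != current]
    return max(candidates, key=lambda k: overlap(items[current], items[k]))

items = [
    [1, 1, 0, 0, 1],  # item 0
    [1, 1, 0, 1, 0],  # item 1: shares two features with item 0
    [0, 0, 1, 0, 1],  # item 2: shares one feature with item 0
]
assert next_item(items, 0) == 1  # recall jumps to the closest neighbour
```

Following this rule repeatedly produces a recall trajectory through the list, and items whose representations overlap little become the ones most easily skipped — the kind of retrieval error the paradigm measures.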
https://www.readbyqxmd.com/read/26729148/a-modified-hopfield-neural-network-algorithm-mhnna-using-alos-image-for-water-quality-mapping
#17
Ahmed Asal Kzar, Mohd Zubir Mat Jafri, Kussay N Mutter, Saumi Syahreza
Reducing water pollution is a major challenge in coastal waters. The health of coastal ecosystems can be affected by high concentrations of suspended sediment. In this work, a Modified Hopfield Neural Network Algorithm (MHNNA) was used with remote sensing imagery to classify total suspended solids (TSS) concentrations in the coastal waters of Langkawi Island, Malaysia. The adopted remote sensing image is an Advanced Land Observation Satellite (ALOS) image acquired on 18 January 2010. Our modification allows the Hopfield neural network to convert and classify color satellite images...
January 2016: International Journal of Environmental Research and Public Health
https://www.readbyqxmd.com/read/26709852/multistability-in-large-scale-models-of-brain-activity
#18
Mathieu Golos, Viktor Jirsa, Emmanuel Daucé
Noise driven exploration of a brain network's dynamic repertoire has been hypothesized to be causally involved in cognitive function, aging and neurodegeneration. The dynamic repertoire crucially depends on the network's capacity to store patterns, as well as their stability. Here we systematically explore the capacity of networks derived from human connectomes to store attractor states, as well as various network mechanisms to control the brain's dynamic repertoire. Using a deterministic graded response Hopfield model with connectome-based interactions, we reconstruct the system's attractor space through a uniform sampling of the initial conditions...
December 2015: PLoS Computational Biology
https://www.readbyqxmd.com/read/26599711/efficient-associative-computation-with-discrete-synapses
#19
Andreas Knoblauch
Neural associative networks are a promising computational paradigm for both modeling neural circuits of the brain and implementing associative memory and Hebbian cell assemblies in parallel VLSI or nanoscale hardware. Previous work has extensively investigated synaptic learning in linear models of the Hopfield type and simple nonlinear models of the Steinbuch/Willshaw type. Optimized Hopfield networks of size n can store a large number of about n^2/k memories of size k (or associations between them) but require real-valued synapses, which are expensive to implement and can store at most C = 0...
January 2016: Neural Computation
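The Steinbuch/Willshaw model contrasted with the Hopfield type in this abstract uses single-bit synapses: a clipped Hebbian rule sets a synapse to 1 if any stored pair activates both its endpoints, and retrieval thresholds the input counts. A minimal sketch with toy pattern sizes and the simplest threshold rule (threshold = cue activity):

```python
# Sketch of a Willshaw binary associative memory.

def store(pairs, n_in, n_out):
    """Clipped Hebbian learning: a synapse becomes 1 if any stored pair
    activates both its input and its output unit."""
    W = [[0] * n_in for _ in range(n_out)]
    for x, y in pairs:
        for i in range(n_out):
            if y[i]:
                for j in range(n_in):
                    if x[j]:
                        W[i][j] = 1
    return W

def retrieve(W, x):
    """Fire output units whose input count reaches the cue's activity."""
    theta = sum(x)  # threshold = number of active cue units
    return [1 if sum(W[i][j] * x[j] for j in range(len(x))) >= theta else 0
            for i in range(len(W))]

x1, y1 = [1, 1, 0, 0], [0, 1, 0, 1, 0]
x2, y2 = [0, 0, 1, 1], [1, 0, 0, 0, 1]
W = store([(x1, y1), (x2, y2)], 4, 5)
assert retrieve(W, x1) == y1 and retrieve(W, x2) == y2
```

Because every synapse is one bit, the whole matrix costs n_in × n_out bits regardless of how many pairs are stored — the hardware appeal the abstract points to, at the price of crosstalk once too many patterns overlap.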
https://www.readbyqxmd.com/read/26595931/a-generalized-hopfield-network-for-nonsmooth-constrained-convex-optimization-lie-derivative-approach
#20
Chaojie Li, Xinghuo Yu, Tingwen Huang, Guo Chen, Xing He
This paper proposes a generalized Hopfield network for solving general constrained convex optimization problems. First, the existence and the uniqueness of solutions to the generalized Hopfield network in the Filippov sense are proved. Then, the Lie derivative is introduced to analyze the stability of the network using a differential inclusion. The optimality of the solution to the nonsmooth constrained optimization problems is shown to be guaranteed by the enhanced Fritz John conditions. The convergence rate of the generalized Hopfield network can be estimated by the second-order derivative of the energy function...
February 2016: IEEE Transactions on Neural Networks and Learning Systems
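The principle behind such optimization networks is that the energy function acts as a Lyapunov function, so following its negative gradient drives the state to the optimum. As an unconstrained sketch of that idea (the paper's nonsmooth, Filippov-sense machinery is well beyond this toy), the code below Euler-integrates dx/dt = -∇E(x) for a quadratic energy E(x) = ½xᵀQx + bᵀx:

```python
# Sketch: Hopfield-style energy descent for a convex quadratic.

def grad(Q, b, x):
    """Gradient of E(x) = 0.5*x'Qx + b'x, namely Qx + b (Q symmetric)."""
    n = len(x)
    return [sum(Q[i][j] * x[j] for j in range(n)) + b[i] for i in range(n)]

def descend(Q, b, x0, dt=0.1, steps=500):
    """Forward-Euler integration of the gradient flow dx/dt = -grad E."""
    x = list(x0)
    for _ in range(steps):
        g = grad(Q, b, x)
        x = [xi - dt * gi for xi, gi in zip(x, g)]
    return x

# Minimize E for Q = [[2,0],[0,4]], b = [-2,-8]; the optimum solves
# Qx = -b, i.e. x* = (1, 2).
x = descend([[2, 0], [0, 4]], [-2, -8], [0.0, 0.0])
assert abs(x[0] - 1) < 1e-3 and abs(x[1] - 2) < 1e-3
```

The step size dt must be small relative to the largest eigenvalue of Q for the Euler flow to remain stable; the paper's convergence-rate estimate via the energy's second derivative is the continuous-time analogue of that condition.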
89,515 results for this search.
Search Tips

- Use Boolean operators AND/OR: diabetic AND foot; diabetes OR diabetic
- Exclude a word using the minus sign: Virchow -triad
- Use parentheses to group terms: water AND (cup OR glass)
- Add an asterisk (*) at the end of a word to include word stems: Neuro* will match Neurology, Neuroscientist, Neurological, and so on
- Use quotes to search for an exact phrase: "primary prevention of cancer"
- Combine operators: (heart OR cardiac OR cardio*) AND arrest -"American Heart Association"