Read by QxMD

Hopfield network

I Recio, J J Torres
We study emerging phenomena in binary neural networks where, with probability c, synaptic intensities are chosen according to a Hebbian prescription, and with probability (1-c) there is an extra random contribution to the synaptic weights. This new term, drawn from a bimodal Gaussian distribution, balances the synaptic population of the network so that there is an 80%-20% E/I population ratio, mimicking the balance observed in mammalian cortex. For some regions of the relevant parameters, our system exhibits standard memory attractors (at low temperature) and non-memory attractors (at high temperature)...
September 8, 2016: Neural Networks: the Official Journal of the International Neural Network Society
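A minimal sketch of the weight construction described in this abstract; network size, pattern count, mixture means, and the 80/20 split are all chosen here purely for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, c = 100, 3, 0.8                      # neurons, stored patterns, Hebbian-only fraction

xi = rng.choice([-1, 1], size=(P, N))      # random binary memory patterns
J_hebb = xi.T @ xi / N                     # standard Hebbian prescription
np.fill_diagonal(J_hebb, 0)

# Extra bimodal Gaussian term: its two modes are weighted so that roughly
# 80% of the extra weights are excitatory and 20% inhibitory (the E/I
# balance mentioned in the abstract; means and widths are made up here).
is_excitatory = rng.random((N, N)) < 0.8
extra = np.where(is_excitatory,
                 rng.normal(+1.0, 0.1, (N, N)),
                 rng.normal(-1.0, 0.1, (N, N)))

keep_hebbian = rng.random((N, N)) < c      # with prob c: purely Hebbian synapse
J = np.where(keep_hebbian, J_hebb, J_hebb + extra / N)
np.fill_diagonal(J, 0)
```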
Dmytro Bielievtsov, Josef Ladenbauer, Klaus Obermayer
We consider a general class of stochastic networks and ask which network nodes need to be controlled, and how, to stabilize and switch between desired metastable (target) states in terms of the first and second statistical moments of the system. We first show that it is sufficient to directly interfere with a subset of nodes which can be identified using information about the graph of the network only. Then we develop a suitable method for feedback control which acts on that subset of nodes and preserves the covariance structure of the desired target state...
July 2016: Physical Review. E
Aurélien Decelle, Federico Ricci-Tersenghi
In this work we explain how to properly use mean-field methods to solve the inverse Ising problem when the phase space is clustered, that is, many states are present. The clustering of the phase space can occur for many reasons, e.g., when a system undergoes a phase transition, but also when data are collected in different regimes (e.g., quiescent and spiking regimes in neural networks). Mean-field methods for the inverse Ising problem are typically used without taking into account the eventual clustered structure of the input configurations and may lead to very poor inference (e...
July 2016: Physical Review. E
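For reference, the simplest of the mean-field methods at issue, naive mean-field inversion, recovers couplings from the inverse correlation matrix; a sketch on synthetic spin data (real data from a clustered phase space would mix several states and break this estimator's assumptions, which is exactly the paper's point):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 5000, 8                               # samples, spins (illustrative sizes)

samples = rng.choice([-1, 1], size=(M, N))   # synthetic spin configurations

m = samples.mean(axis=0)                     # magnetizations <s_i>
C = samples.T @ samples / M - np.outer(m, m) # connected correlations

# Naive mean-field inverse Ising: J_ij ~ -(C^{-1})_ij for i != j.
J_nmf = -np.linalg.inv(C)
np.fill_diagonal(J_nmf, 0.0)
```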
Emre O Neftci, Bruno U Pedroni, Siddharth Joshi, Maruan Al-Shedivat, Gert Cauwenberghs
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means to Monte Carlo sampling and unsupervised learning. Similar to the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks, but where stochasticity is induced by a random mask over the connections...
2016: Frontiers in Neuroscience
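The core mechanism, stochasticity induced by a random mask over the connections, can be imitated in a toy Hopfield-style network; sizes, drop probability, and update count below are arbitrary choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)
N, p_drop = 50, 0.5                      # network size, synapse failure probability

xi = rng.choice([-1, 1], size=N)         # one stored pattern
W = np.outer(xi, xi) / N
np.fill_diagonal(W, 0)

s = xi.copy()
s[: N // 4] *= -1                        # corrupt a quarter of the bits

for _ in range(20):                      # recurrent updates with unreliable synapses
    mask = rng.random((N, N)) >= p_drop  # each synapse transmits with prob 1 - p_drop
    s = np.sign((W * mask) @ s)
    s[s == 0] = 1

overlap = (s @ xi) / N                   # overlap with the stored pattern
```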
Yi-Fei Pu, Zhang Yi, Ji-Liu Zhou
This paper mainly discusses a novel conceptual framework: fractional Hopfield neural networks (FHNN). As is commonly known, fractional calculus has been incorporated into artificial neural networks mainly because of its long-term memory and nonlocality. Some researchers have made interesting attempts at fractional neural networks and gained competitive advantages over integer-order neural networks. It therefore naturally makes one ponder how to generalize first-order Hopfield neural networks to fractional-order ones, and how to implement FHNN by means of fractional calculus...
July 14, 2016: IEEE Transactions on Neural Networks and Learning Systems
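One common way to discretize a fractional derivative, and a plausible ingredient of any FHNN implementation (the paper's actual construction may differ), is the Grünwald-Letnikov formula. Note how it weighs the entire history of the signal, which is the long-term memory property mentioned above; step size and test functions here are arbitrary:

```python
import math

def gl_coeffs(alpha, n):
    """Grünwald-Letnikov coefficients (-1)^k * C(alpha, k), via the standard recursion."""
    c = [1.0]
    for k in range(1, n + 1):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def gl_derivative(f, t, alpha, h=0.001):
    """Fractional derivative of order alpha at time t (lower terminal 0).
    The sum runs over the whole history of f -- the nonlocality of fractional calculus."""
    n = round(t / h)
    c = gl_coeffs(alpha, n)
    return sum(c[k] * f(t - k * h) for k in range(n + 1)) / h ** alpha

d1 = gl_derivative(lambda t: t * t, 1.0, alpha=1.0)   # reduces to d/dt (t^2) = 2t
d_half = gl_derivative(lambda t: t, 1.0, alpha=0.5)   # known value: 2 / sqrt(pi)
ref = 2.0 / math.sqrt(math.pi)
```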
Janis Klaise, Samuel Johnson
Trophic coherence, a measure of the extent to which the nodes of a directed network are organised in levels, has recently been shown to be closely related to many structural and dynamical aspects of complex systems, including graph eigenspectra, the prevalence or absence of feedback cycles, and linear stability. Furthermore, non-trivial trophic structures have been observed in networks of neurons, species, genes, metabolites, cellular signalling, concatenated words, P2P users, and world trade. Here, we consider two simple yet apparently quite different dynamical models-one a susceptible-infected-susceptible epidemic model adapted to include complex contagion and the other an Amari-Hopfield neural network-and show that in both cases the related spreading processes are modulated in similar ways by the trophic coherence of the underlying networks...
June 2016: Chaos
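Trophic coherence itself is straightforward to compute: solve a linear system for the trophic levels, then take the spread of level differences across edges. A sketch on a small hand-built graph (the example network is invented for illustration):

```python
import numpy as np

# Directed edges i -> j of a small, perfectly layered example network.
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1.0

k_in = A.sum(axis=0)

# Trophic levels: s_j = 1 + mean of s_i over in-neighbours; basal nodes get s = 1.
M = np.eye(n)
for j in range(n):
    if k_in[j] > 0:
        M[j] -= A[:, j] / k_in[j]
s = np.linalg.solve(M, np.ones(n))

# Trophic incoherence q: std of trophic distances over edges
# (q = 0 means a maximally coherent, perfectly layered network).
dists = np.array([s[j] - s[i] for i, j in edges])
q = dists.std()
```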
James P Roach, Leonard M Sander, Michal R Zochowski
The brain can reproduce memories from partial data; this ability is critical for memory recall. The process of memory recall has been studied using autoassociative networks such as the Hopfield model. This kind of model reliably converges to stored patterns that contain the memory. However, it is unclear how the behavior is controlled by the brain so that after convergence to one configuration, it can proceed with recognition of another one. In the Hopfield model, this happens only through unrealistic changes of an effective global temperature that destabilizes all stored configurations...
May 2016: Physical Review. E
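The "effective global temperature" mentioned here enters through stochastic (Glauber) dynamics; a minimal recall experiment at low temperature, with all sizes and parameters illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T_low = 100, 0.1

xi = np.asarray(rng.choice([-1, 1], size=(2, N)))   # two stored memories
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0)

def glauber_recall(cue, T, steps=2000):
    """Asynchronous Glauber dynamics at temperature T."""
    s = cue.copy()
    for _ in range(steps):
        i = rng.integers(N)
        h = W[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
        s[i] = 1 if rng.random() < p_up else -1
    return s

cue = xi[0].copy()
cue[: N // 5] *= -1                                 # degrade 20% of the cue
recalled = glauber_recall(cue, T_low)
overlap = abs(recalled @ xi[0]) / N                 # ~1 means successful recall
```

At high temperature the same dynamics destabilize every stored configuration, which is the unrealistic global control the paper criticizes.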
Arturo Tozzi, Tor Flå, James F Peters
The minimum frustration principle (MFP) is a computational approach stating that, over the long time scales of evolution, a protein's free energy decreases more than expected from thermodynamic constraints as its amino acids assume conformations progressively closer to the lowest energetic state. This review shows that this general principle, borrowed from protein folding dynamics, can also be fruitfully applied to nervous function. Highlighting the foremost role of energetic requirements, macromolecular dynamics, and above all intertwined time scales in brain activity, the MFP elucidates a wide range of mental processes from sensations to memory retrieval...
August 2016: Journal of Neuroscience Research
Siyang Leng, Wei Lin, Jürgen Kurths
Basin stability (BS) is a universal concept in complex systems studies which focuses on the volume of the basin of attraction rather than the traditional linearization-based approach. It has many applications in real-world systems, especially dynamical systems exhibiting multistability, which is even more ubiquitous in delayed dynamics such as firing neurons, climatological processes, and power grids. Due to the infinite-dimensional nature of the space of initial values, how to properly define the basin's volume for delayed dynamics remains a fundamental problem...
2016: Scientific Reports
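Stripped of the delay complications that are the paper's actual subject, basin stability is just a sampling estimate; for the bistable toy system dx/dt = x - x^3 the two basins are symmetric, so the estimate should come out near 0.5:

```python
import numpy as np

rng = np.random.default_rng(4)

# Euler-integrate dx/dt = x - x**3 for a batch of random initial conditions.
x = rng.uniform(-2, 2, size=1000)
for _ in range(2000):
    x += 0.01 * (x - x**3)

# Basin stability of the attractor x = +1: the fraction of sampled
# initial conditions that ended up in its basin.
bs = np.mean(x > 0)
```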
Masaki Kobayashi
Complex-valued neural networks, which are extensions of ordinary neural networks, have been studied as interesting models by many researchers. Especially, complex-valued Hopfield neural networks (CHNNs) have been used to process multilevel data, such as gray-scale images. CHNNs with Hermitian connection weights always converge using asynchronous update. The noise tolerance of CHNNs deteriorates extremely as the resolution increases. Noise tolerance is one of the most controversial problems for CHNNs. It is known that rotational invariance reduces noise tolerance...
January 28, 2016: IEEE Transactions on Neural Networks and Learning Systems
Xinjie Guo, Farnood Merrikh-Bayat, Ligang Gao, Brian D Hoskins, Fabien Alibart, Bernabe Linares-Barranco, Luke Theogarajan, Christof Teuscher, Dmitri B Strukov
The purpose of this work was to demonstrate the feasibility of building recurrent artificial neural networks with hybrid complementary metal oxide semiconductor (CMOS)/memristor circuits. To do so, we modeled a Hopfield network implementing an analog-to-digital converter (ADC) with up to 8 bits of precision. Major shortcomings affecting the ADC's precision, such as the non-ideal behavior of CMOS circuitry and the specific limitations of memristors, were investigated and an effective solution was proposed, capitalizing on the in-field programmability of memristors...
2015: Frontiers in Neuroscience
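The classic Hopfield ADC idea is to encode conversion error as a network energy. Here the binary minimum of that quadratic error term is found by brute force rather than by network dynamics, purely to illustrate the landscape; bit width and input value are arbitrary:

```python
import itertools

def adc_energy(bits, x):
    """Quadratic 'conversion error' term of a Hopfield-style ADC energy function."""
    value = sum(b * 2**i for i, b in enumerate(bits))   # bits are LSB first
    return 0.5 * (x - value) ** 2

def hopfield_adc(x, n_bits=4):
    """Brute-force the energy minimum over binary states; the network's
    analog dynamics would descend this same landscape."""
    return min(itertools.product([0, 1], repeat=n_bits),
               key=lambda bits: adc_energy(bits, x))

bits = hopfield_adc(6.3)   # nearest representable code to the analog input
```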
Stefano Recanatesi, Mikhail Katkov, Sandro Romani, Misha Tsodyks
Human memory can store a large amount of information. Nevertheless, recall is often a challenging task. In a classical free recall paradigm, where participants are asked to repeat a briefly presented list of words, people make mistakes for lists as short as 5 words. We present a model for memory retrieval based on a Hopfield neural network where transitions between items are determined by similarities in their long-term memory representations. Mean-field analysis of the model reveals stable states of the network corresponding (1) to single memory representations and (2) to intersections between memory representations...
2015: Frontiers in Computational Neuroscience
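The transition rule, jump to the item whose long-term representation is most similar to the current one, can be sketched as a deterministic walk over random sparse item codes; the sizes and the stopping rule below are simplifications, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(5)
L, N = 16, 200                                  # list length, representation size

patterns = rng.choice([0, 1], size=(L, N), p=[0.9, 0.1])   # sparse item codes
sim = patterns @ patterns.T                     # pairwise similarities (overlaps)
np.fill_diagonal(sim, -1)                       # never transition to the current item

def free_recall(start=0, max_steps=100):
    """Deterministic walk: jump to the most similar item, avoiding the item
    just recalled; stop when a (prev, current) transition repeats (a cycle)."""
    prev, cur = -1, start
    recalled, seen = [start], set()
    for _ in range(max_steps):
        scores = sim[cur].copy()
        if prev >= 0:
            scores[prev] = -1
        prev, cur = cur, int(np.argmax(scores))
        if (prev, cur) in seen:
            break
        seen.add((prev, cur))
        recalled.append(cur)
    return recalled

recalled = free_recall()
n_distinct = len(set(recalled))    # the walk typically cycles before covering the list
```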
Ahmed Asal Kzar, Mohd Zubir Mat Jafri, Kussay N Mutter, Saumi Syahreza
Water pollution is a serious problem in coastal waters, and the health of coastal ecosystems can be affected by high concentrations of suspended sediment. In this work, a Modified Hopfield Neural Network Algorithm (MHNNA) was used with remote sensing imagery to classify total suspended solids (TSS) concentrations in the coastal waters of Langkawi Island, Malaysia. The adopted remote sensing image is an Advanced Land Observing Satellite (ALOS) image acquired on 18 January 2010. Our modification allows the Hopfield neural network to convert and classify color satellite images...
January 2016: International Journal of Environmental Research and Public Health
Mathieu Golos, Viktor Jirsa, Emmanuel Daucé
Noise driven exploration of a brain network's dynamic repertoire has been hypothesized to be causally involved in cognitive function, aging and neurodegeneration. The dynamic repertoire crucially depends on the network's capacity to store patterns, as well as their stability. Here we systematically explore the capacity of networks derived from human connectomes to store attractor states, as well as various network mechanisms to control the brain's dynamic repertoire. Using a deterministic graded response Hopfield model with connectome-based interactions, we reconstruct the system's attractor space through a uniform sampling of the initial conditions...
December 2015: PLoS Computational Biology
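A graded-response Hopfield attractor scan in miniature: integrate dx/dt = -x + W tanh(g x) from uniformly sampled initial conditions and collect the distinct attractors reached. Here two planted patterns stand in for connectome-based interactions, and gain and sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(6)
N = 20

xi = rng.choice([-1, 1], size=(2, N))           # two planted patterns
W = (xi.T @ xi) / N                             # symmetric interactions
np.fill_diagonal(W, 0)

def settle(x0, g=4.0, dt=0.05, steps=4000):
    """Integrate the graded-response dynamics dx/dt = -x + W @ tanh(g*x)."""
    x = x0
    for _ in range(steps):
        x += dt * (-x + W @ np.tanh(g * x))
    return x

# Uniform sampling of initial conditions; attractors are identified
# (crudely) by the sign pattern of the settled state.
attractors = set()
for _ in range(30):
    x = settle(rng.uniform(-1, 1, size=N))
    attractors.add(tuple(np.sign(x).astype(int)))
```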
Andreas Knoblauch
Neural associative networks are a promising computational paradigm for both modeling neural circuits of the brain and implementing associative memory and Hebbian cell assemblies in parallel VLSI or nanoscale hardware. Previous work has extensively investigated synaptic learning in linear models of the Hopfield type and simple nonlinear models of the Steinbuch/Willshaw type. Optimized Hopfield networks of size n can store a large number of about n²/k memories of size k (or associations between them) but require real-valued synapses, which are expensive to implement and can store at most C = 0...
January 2016: Neural Computation
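The Steinbuch/Willshaw model contrasted here uses binary "clipped" Hebbian synapses and one-step threshold retrieval; a minimal sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(7)
n, k, P = 64, 4, 10                      # neurons, active units per pattern, patterns

# Sparse binary patterns with exactly k active units each.
patterns = np.zeros((P, n), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(n, k, replace=False)] = 1

# Willshaw/Steinbuch learning: binary clipped Hebbian synapses.
W = np.clip(patterns.T @ patterns, 0, 1)

def retrieve(cue):
    """One-step retrieval: fire the units whose input matches the cue's activity."""
    h = W @ cue
    return (h >= cue.sum()).astype(int)

cue = patterns[0].copy()
cue[np.flatnonzero(cue)[0]] = 0          # delete one active unit from the cue
out = retrieve(cue)                      # should restore pattern 0
```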
Chaojie Li, Xinghuo Yu, Tingwen Huang, Guo Chen, Xing He
This paper proposes a generalized Hopfield network for solving general constrained convex optimization problems. First, the existence and the uniqueness of solutions to the generalized Hopfield network in the Filippov sense are proved. Then, the Lie derivative is introduced to analyze the stability of the network using a differential inclusion. The optimality of the solution to the nonsmooth constrained optimization problems is shown to be guaranteed by the enhanced Fritz John conditions. The convergence rate of the generalized Hopfield network can be estimated by the second-order derivative of the energy function...
February 2016: IEEE Transactions on Neural Networks and Learning Systems
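The general idea, a network whose dynamics descend an energy function while respecting constraints, reduces in the simplest convex case to projected gradient descent; the toy box-constrained problem below illustrates the principle only, not the paper's Filippov/differential-inclusion construction:

```python
import numpy as np

# Convex energy E(x) = 0.5 * ||x - b||^2, minimised over the box [0, 1]^n.
n = 5
b = np.array([0.2, -0.3, 1.4, 0.7, 0.05])

x = np.full(n, 0.5)
lr = 0.1
for _ in range(500):
    grad = x - b                           # gradient of the energy
    x = np.clip(x - lr * grad, 0.0, 1.0)   # projection enforces the constraints

# The constrained minimiser is simply b clipped to the box.
```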
David F Nichols
Computational simulations allow for a low-cost, reliable means to demonstrate complex and often times inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations to be intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield Memory network were used in an undergraduate neuroscience laboratory component of an introductory level course...
2015: Journal of Undergraduate Neuroscience Education: JUNE: a Publication of FUN, Faculty for Undergraduate Neuroscience
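Of the three lab models, the integrate-and-fire neuron is the shortest to code; a minimal leaky integrate-and-fire simulation of the kind such a lab might use, with all constants illustrative:

```python
# Leaky integrate-and-fire neuron under constant input (Euler integration).
dt, T = 0.1, 200.0                   # time step and total duration (ms)
tau = 10.0                           # membrane time constant (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -70.0   # potentials (mV)
I = 20.0                             # constant input drive (mV equivalent)

v, spikes = v_rest, []
for step in range(round(T / dt)):
    v += dt / tau * (v_rest - v + I)     # leaky integration toward v_rest + I
    if v >= v_thresh:                    # threshold crossing: spike and reset
        spikes.append(step * dt)
        v = v_reset
```

With these constants the neuron fires tonically, since the steady-state voltage v_rest + I sits above threshold.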
Ren Zheng, Xinlei Yi, Wenlian Lu, Tianping Chen
In this paper, we investigate the stability of a class of analytic neural networks with synaptic feedback via event-triggered rules. This model is general and includes the Hopfield neural network as a special case. These event-triggered rules can efficiently reduce the load of computation and information transmission at the synapses of the neurons. The synaptic feedback of each neuron keeps a constant value based on the outputs of the other neurons at its latest triggering time but changes at its next triggering time, which is determined by a certain criterion...
February 2016: IEEE Transactions on Neural Networks and Learning Systems
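The event-triggered principle, latch each neuron's transmitted output and refresh it only when the true output drifts past a threshold, can be sketched as follows; the dynamics, threshold, and trigger criterion are simplified stand-ins for the paper's:

```python
import numpy as np

rng = np.random.default_rng(9)
N, steps, dt, eps = 10, 2000, 0.01, 0.05   # eps: triggering threshold (illustrative)

W = rng.normal(0, 0.3 / np.sqrt(N), (N, N))
x = rng.uniform(-1, 1, N)
x_held = np.tanh(x)                         # outputs latched at the last trigger
n_events = 0

for _ in range(steps):
    # Continuous dynamics driven by the *held* synaptic feedback.
    x += dt * (-x + W @ x_held)
    # Event-triggered rule: refresh the feedback only when the true output
    # drifts too far from the latched value.
    if np.linalg.norm(np.tanh(x) - x_held) > eps:
        x_held = np.tanh(x)
        n_events += 1
```

The point of the scheme is that n_events is far smaller than steps, so synapses transmit only occasionally.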
Qi Wang, Rong Chen, Joseph JaJa, Yu Jin, L Elliot Hong, Edward H Herskovits
Defining brain structures of interest is an important preliminary step in brain-connectivity analysis. Researchers interested in connectivity patterns among brain structures typically employ manually delineated volumes of interest, or regions in a readily available atlas, to limit the scope of connectivity analysis to relevant regions. However, most structural brain atlases, and manually delineated volumes of interest, do not take voxel-wise connectivity patterns into consideration, and therefore may not be ideal for anatomic connectivity analysis...
January 2016: Neuroinformatics
John J Hopfield
In higher animals, complex and robust behaviors are produced by the microscopic details of large structured ensembles of neurons. I describe how the emergent computational dynamics of a biologically based neural network generates a robust natural solution to the problem of categorizing time-varying stimulus patterns such as spoken words or animal stereotypical behaviors. The recognition of these patterns is made difficult by their substantial variation in cadence and duration. The neural circuit behaviors used are similar to those associated with brain neural integrators...
October 2015: Neural Computation