
Hopfield network

Christopher J Hillar, Ngoc M Tran
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size...
January 16, 2018: Journal of Mathematical Neuroscience
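
To make the classical setup concrete, here is a minimal sketch (ours, not the authors' construction) of Hebbian storage and noisy recall with ±1 McCulloch-Pitts neurons; all sizes and names are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 100, 5                                # neurons, stored patterns
    patterns = rng.choice([-1, 1], size=(P, N))

    # Hebbian outer-product rule: symmetric weights, zero diagonal
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)

    def recall(x, sweeps=20):
        """Asynchronous McCulloch-Pitts updates."""
        x = x.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                x[i] = 1 if W[i] @ x >= 0 else -1
        return x

    # Flip 10% of one pattern's bits, then recover it
    noisy = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)
    print(np.mean(recall(noisy) == patterns[0]))  # overlap close to 1.0

With only a few stored patterns, the corrupted cue is driven back to the stored attractor; the open problem in the abstract concerns how many such noise-tolerant attractors a network of N neurons can support.
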
Călin-Adrian Popa, Eva Kaslik
The existence of multiple exponentially stable equilibrium states and periodic solutions is investigated for Hopfield-type quaternion-valued neural networks (QVNNs) with impulsive effects and both time-dependent and distributed delays. Employing Brouwer's and Leray-Schauder's fixed-point theorems, suitable Lyapunov functionals, and impulsive control theory, sufficient conditions are given for the existence of 16^n attractors, a substantial improvement in storage capacity over real-valued or complex-valued neural networks...
December 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
Oliver L C Rourke, Daniel A Butts
The ability of sensory networks to transiently store information on the scale of seconds can confer many advantages in processing time-varying stimuli. Storage of information on such intermediate time scales, between typical neurophysiological time scales and those of long-term memory, is typically attributed to persistent neural activity. An alternative mechanism that might allow such information storage is temporary modification of the neural connectivity, decaying on the same second-long time scale as the underlying memories...
2017: PloS One
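
As a toy illustration of that mechanism (a sketch under our own assumptions, not the authors' model), one can imprint a pattern as a transient Hebbian term on top of fixed background couplings and let it decay exponentially:

    import numpy as np

    rng = np.random.default_rng(1)
    N, tau = 100, 2.0                          # neurons, trace decay constant (s)
    xi = rng.choice([-1, 1], size=N)           # pattern to be held transiently
    J_bg = 0.05 * rng.standard_normal((N, N))  # fixed background couplings
    J_bg = (J_bg + J_bg.T) / 2                 # kept symmetric for simplicity

    def recall_overlap(t):
        # Hebbian trace imprinted at t = 0, decaying with time constant tau
        W = J_bg + np.exp(-t / tau) * np.outer(xi, xi) / N
        np.fill_diagonal(W, 0.0)
        x = xi * np.where(rng.random(N) < 0.2, -1, 1)  # degraded cue
        for _ in range(10):
            x = np.where(W @ x >= 0, 1, -1)            # synchronous updates
        return float(np.mean(x == xi))

    for t in [0.0, 2.0, 8.0]:
        print(t, recall_overlap(t))            # recall fades as the trace decays

Recall succeeds while the connectivity trace is strong and fails once it has decayed, even though no persistent activity carries the memory.
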
Melanie Weber, Pedro D Maia, J Nathan Kutz
Neurodegenerative diseases and traumatic brain injuries (TBI) are among the main causes of cognitive dysfunction in humans. At a neuronal network level, they both extensively exhibit focal axonal swellings (FAS), which, in turn, compromise the information encoded in spike trains and lead to potentially severe functional deficits. There are currently no satisfactory quantitative predictors of decline in memory-encoding neuronal networks based on the impact and statistics of FAS. Some of the challenges of this translational approach include our inability to access small-scale injuries with non-invasive methods, the overall complexity of neuronal pathologies, and our limited knowledge of how networks process biological signals...
2017: Frontiers in Neuroscience
Anthony Szedlak, Spencer Sims, Nicholas Smith, Giovanni Paternostro, Carlo Piermarocchi
Modern time series gene expression and other omics data sets have enabled unprecedented resolution of the dynamics of cellular processes such as the cell cycle and response to pharmaceutical compounds. In anticipation of the proliferation of time series data sets in the near future, we use the Hopfield model, a recurrent neural network based on spin glasses, to model the dynamics of the cell cycle in HeLa (human cervical cancer) and S. cerevisiae cells. We study some of the rich dynamical properties of these cyclic Hopfield systems, including the ability of populations of simulated cells to recreate experimental expression data and the effects of noise on the dynamics...
November 17, 2017: PLoS Computational Biology
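
One standard way to obtain such cyclic attractor dynamics in a Hopfield-type network (a hedged sketch; the paper's cell-cycle model is richer) is an asymmetric Hebbian rule in which each pattern drives its successor:

    import numpy as np

    rng = np.random.default_rng(2)
    N, P = 50, 4
    xi = rng.choice([-1, 1], size=(P, N))          # cycle of expression states

    # Asymmetric rule: pattern mu drives pattern mu+1 (indices mod P)
    W = sum(np.outer(xi[(m + 1) % P], xi[m]) for m in range(P)) / N

    x = xi[0].copy()
    for t in range(8):
        x = np.where(W @ x >= 0, 1, -1)            # synchronous update
        print(t, [int(x @ xi[m]) for m in range(P)])  # overlaps rotate through the cycle

Under synchronous updates the network state steps through the stored cycle, a minimal analogue of a periodic gene-expression program.
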
Do-Hyun Kim, Jinha Park, Byungnam Kahng
The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond that threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been further developed toward realistic neural networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which are inconsistent with the recent experimental discovery that the number of connections per neuron appears to be heterogeneous, following a heavy-tailed distribution...
2017: PloS One
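
The O(N) capacity and the abrupt loss beyond it are easy to see numerically in the fully connected Hebbian case (a sketch of ours; the paper's heterogeneous-network setting differs):

    import numpy as np

    rng = np.random.default_rng(3)
    N = 200

    def retrieval_overlap(alpha, sweeps=10):
        P = max(1, int(alpha * N))                 # load alpha = P/N
        xi = rng.choice([-1, 1], size=(P, N))
        W = xi.T @ xi / N
        np.fill_diagonal(W, 0.0)
        x = xi[0].copy()                           # start exactly on a stored pattern
        for _ in range(sweeps):
            x = np.where(W @ x >= 0, 1, -1)
        return float(x @ xi[0]) / N

    for alpha in [0.05, 0.10, 0.14, 0.20]:
        print(alpha, retrieval_overlap(alpha))     # overlap degrades past the critical load

Below the critical load (alpha_c ≈ 0.138 in the mean-field theory) a stored pattern remains a fixed point up to small errors; above it, retrieval collapses.
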
Jing Guo, Jie Zheng
Motivation: The interpretation of transcriptional dynamics in single-cell data, especially pseudotime estimation, could help in understanding the transition of gene expression profiles. The recovery of pseudotime increases the temporal resolution of single-cell transcriptional data but is challenging due to the high variability in gene expression between individual cells. Here, we introduce HopLand, a pseudotime recovery method that uses a continuous Hopfield network to map cells to a Waddington epigenetic landscape...
July 15, 2017: Bioinformatics
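
For reference, the continuous Hopfield network underlying such landscape methods (in Hopfield's 1984 formulation; HopLand's exact parametrization may differ) evolves by

\[
\frac{du_i}{dt} = -\frac{u_i}{\tau} + \sum_j w_{ij}\, g(u_j) + I_i, \qquad v_i = g(u_i) = \tanh(\beta u_i),
\]

and, for symmetric weights, descends the energy

\[
E = -\frac{1}{2}\sum_{i,j} w_{ij}\, v_i v_j - \sum_i I_i v_i + \frac{1}{\tau}\sum_i \int_0^{v_i} g^{-1}(v)\, dv,
\]

whose minima play the role of the valleys of the epigenetic landscape.
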
Bocheng Bao, Hui Qian, Quan Xu, Mo Chen, Jiang Wang, Yajuan Yu
A new hyperbolic-type memristor emulator is presented, and its frequency-dependent pinched hysteresis loops are analyzed by numerical simulations and confirmed by hardware experiments. Based on the emulator, a novel hyperbolic-type memristor-based 3-neuron Hopfield neural network (HNN) is proposed, obtained by substituting one coupling-connection weight with a memristive synaptic weight. It is shown numerically that the memristive HNN exhibits a dynamical transition from chaotic, to periodic, and further to stable-point behavior as the memristor's inner parameter varies, implying a stabilizing effect of the hyperbolic-type memristor on the chaotic HNN...
2017: Frontiers in Computational Neuroscience
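
A rough sketch of the construction (with an assumed tanh-type memductance and placeholder coefficients of our own; the paper's actual model and parameter values are not reproduced here): one weight of a 3-neuron HNN is replaced by a flux-controlled memristive synapse.

    import numpy as np

    # Illustrative 3-neuron HNN, dx/dt = -x + W tanh(x), in which weight W[0,1]
    # is replaced by an ASSUMED hyperbolic memductance Wm(phi) = a + b*tanh(phi).
    # All coefficients below are placeholders, not the paper's.
    a, b = 1.0, 0.5
    W = np.array([[ 2.0, -1.2,  0.0],
                  [ 1.8,  1.7,  1.2],
                  [-4.0,  0.0,  1.0]])

    def rhs(state, k=1.0):
        x, phi = state[:3], state[3]
        Wm = W.copy()
        Wm[0, 1] = a + b * np.tanh(phi)        # memristive synaptic weight
        dx = -x + Wm @ np.tanh(x)
        dphi = k * np.tanh(x[1])               # flux driven by presynaptic output
        return np.concatenate([dx, [dphi]])

    state = np.array([0.1, 0.0, 0.0, 0.0])
    dt = 0.01
    for _ in range(5000):                      # simple Euler integration
        state = state + dt * rhs(state)
    print(state[:3])

In the paper, sweeping the memristor's inner parameter moves the network between chaotic, periodic, and stable-point regimes; the placeholder coefficients above will not reproduce those regimes exactly.
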
Carlos Aguilar, Pascal Chossat, Martin Krupa, Frédéric Lavigne
Prediction is the ability of the brain to quickly activate a target concept in response to a related stimulus (prime). Experiments point to the existence of an overlap between the populations of neurons coding for different stimuli, and other experiments show that prime-target relations arise in the process of long-term memory formation. The classical modelling paradigm is that long-term memories correspond to stable steady states of a Hopfield network with Hebbian connectivity. Experiments show that short-term synaptic depression plays an important role in the processing of memories...
2017: PloS One
Suhas Kumar, John Paul Strachan, R Stanley Williams
At present, machine learning systems use simplified neuron models that lack the rich nonlinear phenomena observed in biological systems, which display spatio-temporal cooperative dynamics. There is evidence that neurons operate in a regime called the edge of chaos that may be central to complexity, learning efficiency, adaptability and analogue (non-Boolean) computation in brains. Neural networks have exhibited enhanced computational complexity when operated at the edge of chaos, and networks of chaotic elements have been proposed for solving combinatorial or global optimization problems...
August 17, 2017: Nature
Nuru Adgaba, Ahmed Alghamdi, Rachid Sammoud, Awraris Shenkute, Yilma Tadesse, Mahammad J Ansari, Deepak Sharma, Colleen Hepburn
In arid zones, the shortage of bee forage is critical and usually compels beekeepers to move their colonies in search of better forage. Identifying and mapping the spatiotemporal distribution of bee forage over a given area is important for better management of bee colonies. In this study, honey bee plants in the target areas were inventoried through ground inventory work supported by GIS applications. The study was conducted on 85 large plots of 50 × 50 m each. At each plot, data on species name, height, base diameter, crown height, and crown diameter were recorded for each plant, with their respective geographical positions...
July 2017: Saudi Journal of Biological Sciences
Xianxiu Zhang, Chuandong Li, Tingwen Huang
We discuss the global stability of switching Hopfield neural networks (HNN) with state-dependent impulses using the B-equivalence method. Under certain conditions, we show that the state-dependent impulsive switching systems can be reduced to fixed-time ones, and that global stability of the corresponding comparison system implies the same stability of the considered system. On this basis, a novel stability criterion for the considered HNN is established. Finally, two numerical examples are given to demonstrate the effectiveness of our results...
September 2017: Neural Networks: the Official Journal of the International Neural Network Society
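
For orientation, a generic form of the systems studied (our paraphrase; the paper's precise hypotheses on the switching signal and impulse surfaces are more detailed): between impulses the state obeys a switched Hopfield flow, and impulses fire when the trajectory hits state-dependent surfaces,

\[
\dot{x}(t) = -C_{\sigma(t)}\, x(t) + A_{\sigma(t)}\, f(x(t)) + I_{\sigma(t)}, \qquad t \ne \theta_k(x(t)),
\]
\[
\Delta x \big|_{t = \theta_k(x)} = x(t^+) - x(t^-) = J_k\big(x(t^-)\big).
\]

The B-equivalence method constructs a comparison system in which the state-dependent impulse times \(\theta_k(x)\) are replaced by fixed times, so standard fixed-time impulsive stability results can be applied.
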
Nelly I Zhokhova, Igor I Baskin
In Energy-Based Neural Networks (EBNNs), relationships between variables are captured by means of a scalar function conventionally called "energy". In this article, we introduce a "harmony search" procedure, which looks for compounds providing the lowest energies for EBNNs trained on active compounds. It can be considered a special kind of similarity search that takes into account regularities in the structures of active compounds. In this paper, we show that harmony search can be used for performing virtual screening...
June 19, 2017: Molecular Informatics
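
As an illustration of the idea (a sketch under our own simplifications, with binary fingerprints and a Hopfield-style energy standing in for the paper's EBNNs and screening protocol):

    import numpy as np

    rng = np.random.default_rng(4)
    N = 64                                        # fingerprint length (illustrative)
    actives = rng.choice([-1, 1], size=(20, N))   # placeholder "active" fingerprints

    W = actives.T @ actives / N                   # Hebbian stand-in for a trained EBNN
    np.fill_diagonal(W, 0.0)

    def energy(x):
        return -0.5 * x @ W @ x

    def screen(library):
        """Rank candidate fingerprints by energy: lower = more 'active-like'."""
        return sorted(library, key=energy)

    library = [rng.choice([-1, 1], size=N) for _ in range(100)] + [actives[0]]
    best = screen(library)[0]
    print(energy(best), energy(library[0]))       # a stored active scores lowest

Compounds whose fingerprints conform to the regularities of the training actives receive low energies and rank at the top of the screen.
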
Masaki Kobayashi
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, and CHNNs are suitable for the storage of multilevel data, such as gray-scale images. CHNNs are, however, often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and escaping local minima. In the present work, we propose a new recall algorithm that eliminates local minima...
2017: Computational Intelligence and Neuroscience
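
For orientation, the standard multistate activation used in CHNNs (a generic sketch; Kobayashi's recall algorithm itself is not reproduced here) quantizes a neuron's weighted input to the nearest of K points on the unit circle:

    import numpy as np

    K = 8                                      # number of states (e.g., gray levels)

    def csign(z, K=K):
        """Quantize a complex value to the nearest K-th root of unity."""
        phase = np.angle(z) % (2 * np.pi)
        k = np.floor(phase / (2 * np.pi / K) + 0.5) % K
        return np.exp(1j * 2 * np.pi * k / K)

    # One asynchronous CHNN update: x_i <- csign(sum_j w_ij x_j)
    x = np.exp(1j * 2 * np.pi * np.array([0, 2, 5, 7]) / K)  # 4-neuron multistate vector
    w = np.outer(x, np.conj(x)) / len(x)                     # Hebbian-style weights
    print(np.allclose(csign(w[0] @ x), x[0]))                # stored state is a fixed point

A K-level gray-scale pixel maps onto one of the K roots of unity, which is what makes CHNNs natural stores for multilevel data.
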
Chris Gorman, Anthony Robins, Alistair Knott
We present an investigation of the potential use of Hopfield networks to learn neurally plausible, distributed representations of category prototypes. Hopfield networks are dynamical models of autoassociative memory that learn to recreate a set of input states from any given starting state. These networks, however, almost always learn states that were not presented during training, so-called spurious states. Historically, spurious states have been an undesirable side effect of training a Hopfield network, and there has been much research into detecting and discarding these unwanted states...
April 25, 2017: Neural Networks: the Official Journal of the International Neural Network Society
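
A quick demonstration of the phenomenon (our sketch, not the authors' setup): the symmetric three-pattern mixture sign(ξ1 + ξ2 + ξ3) is typically itself a fixed point, and it resembles a prototype of its constituents.

    import numpy as np

    rng = np.random.default_rng(5)
    N = 500
    xi = rng.choice([-1, 1], size=(3, N))       # three training patterns
    W = xi.T @ xi / N
    np.fill_diagonal(W, 0.0)

    mix = np.where(xi.sum(axis=0) >= 0, 1, -1)  # spurious "prototype" state
    stable = np.array_equal(np.where(W @ mix >= 0, 1, -1), mix)
    print(stable, [round(float(mix @ p) / N, 2) for p in xi])
    # Typically True, with roughly 0.5 overlap to each stored pattern

Such mixture states sit "between" the stored exemplars, which is precisely what makes them candidates for prototype representations.
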
Marcos Eduardo Valle, Fidelis Zanetti de Castro
In this paper, we first address the dynamics of the elegant multivalued quaternionic Hopfield neural network (MV-QHNN) proposed by Minemoto et al. Contrary to what was expected, we show that the MV-QHNN, as well as one of its variations, does not always come to rest at an equilibrium state under the usual conditions. In fact, we provide simple examples in which the network yields a periodic sequence of quaternionic state vectors. Afterward, we turn our attention to the continuous-valued quaternionic Hopfield neural network (CV-QHNN), which can be derived from the MV-QHNN by means of a limit process...
May 5, 2017: IEEE Transactions on Neural Networks and Learning Systems
Atefeh Taherian Fard, Mark A Ragan
Genome-wide regulatory networks enable cells to function, develop, and survive. Perturbation of these networks can lead to the appearance of a disease phenotype. Inspired by Conrad Waddington's epigenetic landscape of cell development, we use a Hopfield network formalism to construct an attractor landscape model of disease progression based on protein- or gene-correlation networks of Parkinson's disease, glioma, and colorectal cancer. Attractors in this landscape correspond to normal and disease states of the cell...
2017: Frontiers in Genetics
Mauro Di Marco, Mauro Forti, Luca Pancioni
Recent papers in the literature introduced a class of neural networks (NNs) with memristors, named dynamic-memristor (DM) NNs, in which the analog processing takes place in the charge-flux domain, instead of the typical current-voltage domain of Hopfield NNs and standard cellular NNs. One key advantage is that, when a steady state is reached, all currents, voltages, and power of a DM-NN drop off, whereas the memristors act as nonvolatile memories that store the processing result. Previous work in the literature addressed multistability of DM-NNs, i...
April 12, 2017: IEEE Transactions on Neural Networks and Learning Systems
Gabriel Baglietto, Guido Gigante, Paolo Del Giudice
Two partially interwoven hot topics in the analysis and statistical modeling of neural data are the development of efficient and informative representations of the time series derived from multiple neural recordings, and the extraction of information about the connectivity structure of the underlying neural network from the recorded neural activities. In the present paper we show that state-space clustering can provide an easy and effective option for reducing the dimensionality of multiple neural time series, that it can improve inference of synaptic couplings from neural activities, and that it can also allow the construction of a compact representation of the multi-dimensional dynamics that easily lends itself to complexity measures...
2017: PloS One
Marc Mézard
Motivated by recent progress in using restricted Boltzmann machines as preprocessing algorithms for deep neural networks, we revisit the mean-field equations [belief-propagation and Thouless-Anderson-Palmer (TAP) equations] in the best understood of such machines, namely the Hopfield model of neural networks, and we make explicit how they can be used as iterative message-passing algorithms, providing a fast method to compute the local polarizations of neurons. In the "retrieval phase", where neurons polarize in the direction of one memorized pattern, we point out a major difference between the belief-propagation and TAP equations: the set of belief-propagation equations depends on the pattern being retrieved, while one can use a single set of TAP equations...
February 2017: Physical Review. E
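
For reference, the TAP equations in the form standard for spin-glass-type networks (our summary of the textbook form; the paper derives the message-passing variants in detail): each local polarization \(m_i = \langle s_i \rangle\) satisfies

\[
m_i = \tanh\beta\Big(\sum_{j \ne i} J_{ij}\, m_j \;-\; \beta\, m_i \sum_{j \ne i} J_{ij}^2 \big(1 - m_j^2\big)\Big),
\]

where the second (Onsager reaction) term corrects the naive mean field. For Hebbian couplings \(J_{ij} = \frac{1}{N}\sum_\mu \xi_i^\mu \xi_j^\mu\) at load \(\alpha = P/N\), the reaction term resums to \(-\alpha\beta(1-q)\, m_i / (1 - \beta(1-q))\) with \(q = \frac{1}{N}\sum_i m_i^2\). Iterating these updates to a fixed point yields the fast computation of local polarizations.
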