Read by QxMD

Hopfield network

Jing Guo, Jie Zheng
Motivation: The interpretation of transcriptional dynamics in single-cell data, especially pseudotime estimation, could help us understand the transition of gene expression profiles. The recovery of pseudotime increases the temporal resolution of single-cell transcriptional data, but is challenging due to the high variability in gene expression between individual cells. Here, we introduce HopLand, a pseudotime recovery method that uses a continuous Hopfield network to map cells to a Waddington's epigenetic landscape...
July 15, 2017: Bioinformatics
Bocheng Bao, Hui Qian, Quan Xu, Mo Chen, Jiang Wang, Yajuan Yu
A new hyperbolic-type memristor emulator is presented and its frequency-dependent pinched hysteresis loops are analyzed by numerical simulations and confirmed by hardware experiments. Based on the emulator, a novel hyperbolic-type-memristor-based 3-neuron Hopfield neural network (HNN) is proposed, in which one coupling-connection weight is substituted with a memristive synaptic weight. It is numerically shown that the memristive HNN undergoes a dynamical transition from chaotic, to periodic, and further to stable-point behavior as the memristor's inner parameter varies, implying a stabilizing effect of the hyperbolic-type memristor on the chaotic HNN...
2017: Frontiers in Computational Neuroscience
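The qualitative behavior described above can be explored with a generic continuous Hopfield model. The sketch below is not the paper's memristive circuit: the coupling matrix and the dynamics dx/dt = -x + W tanh(x) are illustrative assumptions, with one of the weights playing the role the memristive synapse would play.

```python
import numpy as np

# Generic continuous 3-neuron Hopfield dynamics (not the paper's exact
# memristive model): dx/dt = -x + W @ tanh(x), integrated with Euler steps.
W = np.array([[ 0.0,  1.2, -0.8],
              [ 0.6,  0.0,  1.0],
              [-1.1,  0.9,  0.0]])  # illustrative coupling weights (assumed)

x = np.array([0.1, -0.2, 0.3])  # small initial activations
dt = 0.01
for _ in range(20000):
    x = x + dt * (-x + W @ np.tanh(x))

# Because tanh is bounded, the trajectory stays bounded regardless of where
# it settles (fixed point, cycle, or chaos, depending on the weights).
print(x)
```

Sweeping one entry of W (the stand-in for the memristive synaptic weight) and re-running the integration is the simplest way to observe the kind of transition between dynamical regimes the abstract describes.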
Carlos Aguilar, Pascal Chossat, Martin Krupa, Frédéric Lavigne
Prediction is the ability of the brain to quickly activate a target concept in response to a related stimulus (prime). Experiments point to the existence of an overlap between the populations of the neurons coding for different stimuli, and other experiments show that prime-target relations arise in the process of long-term memory formation. The classical modelling paradigm is that long-term memories correspond to stable steady states of a Hopfield network with Hebbian connectivity. Experiments show that short-term synaptic depression plays an important role in the processing of memories...
2017: PloS One
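The classical paradigm mentioned above — memories as stable steady states of a Hopfield network with Hebbian connectivity — can be sketched in a few lines. The sizes and random patterns below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5  # neurons and stored patterns (toy sizes)

# Random +/-1 patterns standing in for long-term memories.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian connectivity: W_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, no self-coupling.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# A stored pattern should be a (near-)fixed point of the sign-update dynamics
# as long as the load P/N stays well below the classical capacity ~0.14.
s = patterns[0].copy()
s_next = np.sign(W @ s)
print((s_next == s).mean())  # fraction of stable bits; expected close to 1
```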
Suhas Kumar, John Paul Strachan, R Stanley Williams
At present, machine learning systems use simplified neuron models that lack the rich nonlinear phenomena observed in biological systems, which display spatio-temporal cooperative dynamics. There is evidence that neurons operate in a regime called the edge of chaos that may be central to complexity, learning efficiency, adaptability and analogue (non-Boolean) computation in brains. Neural networks have exhibited enhanced computational complexity when operated at the edge of chaos, and networks of chaotic elements have been proposed for solving combinatorial or global optimization problems...
August 17, 2017: Nature
Nuru Adgaba, Ahmed Alghamdi, Rachid Sammoud, Awraris Shenkute, Yilma Tadesse, Mahammad J Ansari, Deepak Sharma, Colleen Hepburn
In arid zones, the shortage of bee forage is critical and usually compels beekeepers to move their colonies in search of better forages. Identifying and mapping the spatiotemporal distribution of bee forages over a given area is important for better management of bee colonies. In this study, honey bee plants in the target areas were inventoried through ground inventory work supported by GIS applications. The study was conducted on 85 large plots of 50 × 50 m each. At each plot, data on species name, height, base diameter, crown height, and crown diameter were recorded for each plant with their respective geographical positions...
July 2017: Saudi Journal of Biological Sciences
Xianxiu Zhang, Chuandong Li, Tingwen Huang
We discuss the global stability of switching Hopfield neural networks (HNN) with state-dependent impulses using the B-equivalence method. Under certain conditions, we show that the state-dependent impulsive switching systems can be reduced to fixed-time ones, and that the global stability of the corresponding comparison system implies the same stability of the considered system. On this basis, a novel stability criterion for the considered HNN is established. Finally, two numerical examples are given to demonstrate the effectiveness of our results...
September 2017: Neural Networks: the Official Journal of the International Neural Network Society
Nelly I Zhokhova, Igor I Baskin
In Energy-Based Neural Networks (EBNNs), relationships between variables are captured by means of a scalar function conventionally called "energy". In this article, we introduce a procedure of "harmony search", which looks for compounds providing the lowest energies for the EBNNs trained on active compounds. It can be considered as a special kind of similarity search that takes into account regularities in the structures of active compounds. In this paper, we show that harmony search can be used for performing virtual screening...
June 19, 2017: Molecular Informatics
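The idea of an energy that is low for trained (active) states can be illustrated with the simplest EBNN, a Hopfield network. The "active compounds" below are random placeholder vectors rather than chemical structures; the energy function E(s) = -1/2 sᵀWs is the standard Hopfield one, and any resemblance to the paper's actual harmony-search procedure is loose.

```python
import numpy as np

rng = np.random.default_rng(5)
N, P = 100, 3
actives = rng.choice([-1, 1], size=(P, N))   # placeholder "active compounds"

# Train an EBNN on the active states with the Hebbian rule.
W = actives.T @ actives / N
np.fill_diagonal(W, 0.0)

def energy(s):
    """Hopfield energy E(s) = -1/2 s^T W s; trained states sit in low-energy wells."""
    return -0.5 * s @ W @ s

# Searching for the lowest-energy states would then favor states that share
# the regularities of the training set over unrelated ones.
e_active = energy(actives[0])
e_random = energy(rng.choice([-1, 1], size=N))
print(e_active < e_random)
```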
Masaki Kobayashi
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, and CHNNs are available for the storage of multilevel data, such as gray-scale images. CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima...
2017: Computational Intelligence and Neuroscience
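A minimal sketch of a complex-valued Hopfield network of the kind discussed above, assuming the common K-state phase encoding and a csign-style quantizing activation; the paper's specific recall algorithm is not reproduced, and all sizes are toy values.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N, P = 4, 80, 2   # phase states, neurons, stored patterns (toy sizes)
phases = np.exp(2j * np.pi * np.arange(K) / K)

def csign(z):
    """Quantize each complex value to the nearest of the K unit phases."""
    k = np.rint(np.angle(z) * K / (2 * np.pi)).astype(int) % K
    return phases[k]

# Multistate patterns, e.g. gray levels encoded as phases on the unit circle.
patterns = phases[rng.integers(0, K, size=(P, N))]

# Hebbian-style complex weights with conjugation; no self-coupling.
W = patterns.T @ patterns.conj() / N
np.fill_diagonal(W, 0.0)

# At low load a stored multistate pattern is a fixed point of the recall step.
s = patterns[0].copy()
print((csign(W @ s) == s).mean())
```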
Chris Gorman, Anthony Robins, Alistair Knott
We present an investigation of the potential use of Hopfield networks to learn neurally plausible, distributed representations of category prototypes. Hopfield networks are dynamical models of autoassociative memory which learn to recreate a set of input states from any given starting state. These networks, however, will almost always learn states which were not presented during training, so-called spurious states. Historically, spurious states have been an undesirable side-effect of training a Hopfield network and there has been much research into detecting and discarding these unwanted states...
April 25, 2017: Neural Networks: the Official Journal of the International Neural Network Society
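The autoassociative recall dynamics described above can be sketched directly: starting from a corrupted prototype, repeated sign updates fall back into the stored state. Spurious states (e.g. mixtures of several stored patterns) are additional minima of the same dynamics, reached from starting points far from any prototype. Sizes and the corruption level below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 3
patterns = rng.choice([-1, 1], size=(P, N))

W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Start from pattern 0 with 15% of its bits flipped and iterate synchronous
# sign updates until the state stops changing.
state = patterns[0].copy()
flip = rng.choice(N, size=30, replace=False)
state[flip] *= -1

for _ in range(50):
    new = np.sign(W @ state)
    new[new == 0] = 1          # break exact ties deterministically
    if np.array_equal(new, state):
        break
    state = new

print((state == patterns[0]).mean())  # overlap with the stored prototype
```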
Marcos Eduardo Valle, Fidelis Zanetti de Castro
In this paper, we first address the dynamics of the elegant multivalued quaternionic Hopfield neural network (MV-QHNN) proposed by Minemoto et al. Contrary to what was expected, we show that the MV-QHNN, as well as one of its variations, does not always come to rest at an equilibrium state under the usual conditions. In fact, we provide simple examples in which the network yields a periodic sequence of quaternionic state vectors. Afterward, we turn our attention to the continuous-valued quaternionic Hopfield neural network (CV-QHNN), which can be derived from the MV-QHNN by means of a limit process...
May 5, 2017: IEEE Transactions on Neural Networks and Learning Systems
Atefeh Taherian Fard, Mark A Ragan
Genome-wide regulatory networks enable cells to function, develop, and survive. Perturbation of these networks can lead to appearance of a disease phenotype. Inspired by Conrad Waddington's epigenetic landscape of cell development, we use a Hopfield network formalism to construct an attractor landscape model of disease progression based on protein- or gene-correlation networks of Parkinson's disease, glioma, and colorectal cancer. Attractors in this landscape correspond to normal and disease states of the cell...
2017: Frontiers in Genetics
Mauro Di Marco, Mauro Forti, Luca Pancioni
Recent papers in the literature introduced a class of neural networks (NNs) with memristors, named dynamic-memristor (DM) NNs, such that the analog processing takes place in the charge-flux domain, instead of the typical current-voltage domain as it happens for Hopfield NNs and standard cellular NNs. One key advantage is that, when a steady state is reached, all currents, voltages, and power of a DM-NN drop off, whereas the memristors act as nonvolatile memories that store the processing result. Previous work in the literature addressed multistability of DM-NNs, i...
April 12, 2017: IEEE Transactions on Neural Networks and Learning Systems
Gabriel Baglietto, Guido Gigante, Paolo Del Giudice
Two partially interwoven hot topics in the analysis and statistical modeling of neural data are the development of efficient and informative representations of the time series derived from multiple neural recordings, and the extraction of information about the connectivity structure of the underlying neural network from the recorded neural activities. In the present paper we show that state-space clustering can provide an easy and effective option for reducing the dimensionality of multiple neural time series, that it can improve inference of synaptic couplings from neural activities, and that it can also allow the construction of a compact representation of the multi-dimensional dynamics that easily lends itself to complexity measures...
2017: PloS One
Marc Mézard
Motivated by recent progress in using restricted Boltzmann machines as preprocessing algorithms for deep neural networks, we revisit the mean-field equations [belief-propagation and Thouless-Anderson-Palmer (TAP) equations] in the best understood of such machines, namely the Hopfield model of neural networks, and we make explicit how they can be used as iterative message-passing algorithms, providing a fast method to compute the local polarizations of neurons. In the "retrieval phase", where neurons polarize in the direction of one memorized pattern, we point out a major difference between the belief-propagation and TAP equations: the set of belief-propagation equations depends on the pattern which is retrieved, while one can use a unique set of TAP equations...
February 2017: Physical Review. E
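A hedged sketch of the message-passing idea: the loop below iterates the naive mean-field equation m_i = tanh(β Σ_j J_ij m_j) for a Hopfield model in the retrieval phase. The full TAP equations discussed in the paper add an Onsager reaction term to this local field; it is omitted here for brevity, so this is the lowest-order approximation only.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 200, 3
beta = 2.0   # inverse temperature, inside the retrieval phase at this low load
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)

J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

# Naive mean-field iteration for the local polarizations m_i.
# (TAP would subtract an Onsager reaction term from the field; omitted.)
m = 0.3 * patterns[0]          # small initial bias toward one memorized pattern
for _ in range(200):
    m = np.tanh(beta * (J @ m))

overlap = patterns[0] @ m / N
print(overlap)  # polarization along the retrieved pattern
```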
Sheng-Jun Wang, Zhou Yang
We study the stability of patterns in Hopfield networks in which a part of memorized patterns are similar. The similarity between patterns impacts the stability of these patterns, but the stability of other independent patterns is only changed slightly. We show that the stability of patterns is affected in different ways by similarity. For networks storing a number of patterns, the similarity between patterns enhances the pattern stability. However, the stability of patterns can be weakened by the similarity when networks store fewer patterns, and the relation between the stability of patterns and similarity is nonmonotonic...
January 2017: Physical Review. E
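A toy version of the setup studied above: store two similar patterns plus an independent one and inspect the aligned local field ξ_i h_i, i.e. the stability margin of each bit (positive means the bit is stable under a sign update). The overlap value and network size are illustrative assumptions; this does not reproduce the paper's analysis of how the effect depends on the number of stored patterns.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 500
xi1 = rng.choice([-1, 1], size=N)
xi2 = xi1.copy()
xi2[rng.choice(N, size=50, replace=False)] *= -1   # ~0.8 overlap with xi1
xi3 = rng.choice([-1, 1], size=N)                  # independent pattern

patterns = np.stack([xi1, xi2, xi3])
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Aligned local field xi_i * h_i per bit: on bits where the similar patterns
# disagree, the cross-talk from the partner pattern shrinks the margin.
for name, p in (("similar-1", xi1), ("similar-2", xi2), ("independent", xi3)):
    margin = (p * (W @ p)).min()
    print(name, round(margin, 3))
```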
Masaki Kobayashi
A complex-valued Hopfield neural network (CHNN) is a multistate model of a Hopfield neural network. It has the disadvantage of low noise tolerance. Meanwhile, a symmetric CHNN (SCHNN) is a modification of a CHNN that improves noise tolerance. Furthermore, a rotor Hopfield neural network (RHNN) is an extension of a CHNN. It has twice the storage capacity of CHNNs and SCHNNs, and much better noise tolerance than CHNNs, although it requires twice as many connection parameters. In this brief, we investigate the relations among CHNNs, SCHNNs, and RHNNs; an RHNN is uniquely decomposed into a CHNN and an SCHNN...
February 2, 2017: IEEE Transactions on Neural Networks and Learning Systems
Naoki Masuyama, Chu Kiong Loo, Manjeevan Seera, Naoyuki Kubota
Quantum-inspired computing is an emerging research area, which has significantly improved the capabilities of conventional algorithms. In general, quantum-inspired Hopfield associative memory (QHAM) has demonstrated quantum information processing in neural structures. This has resulted in an exponential increase in storage capacity while explaining the extensive memory, and it has the potential to illustrate the dynamics of neurons in the human brain when viewed from a quantum-mechanics perspective, although the application of QHAM is limited to autoassociation...
February 6, 2017: IEEE Transactions on Neural Networks and Learning Systems
Yihong Wang, Rubin Wang, Yating Zhu
Rodents can accomplish self-locating and path-finding tasks by forming a cognitive map in the hippocampus representing the environment. In the classical model of the cognitive map, the system (artificial animal) needs large amounts of physical exploration of the spatial environment to solve path-finding problems, which costs too much time and energy. Although Hopfield's mental-exploration model makes up for this deficiency, the resulting path is still not efficient enough. Moreover, his model mainly focused on the artificial neural network, and its clear physiological meaning has not been addressed...
February 2017: Cognitive Neurodynamics
Ying Wan, Jinde Cao, Guanghui Wen
In this paper, the synchronization problem of master-slave chaotic neural networks with remote sensors, quantization process, and communication time delays is investigated. The information communication channel between the master chaotic neural network and slave chaotic neural network consists of several remote sensors, with each sensor able to access only partial knowledge of output information of the master neural network. At each sampling instant, each sensor updates its own measurement and only one sensor is scheduled to transmit its latest information to the controller's side in order to update the control inputs for the slave neural network...
August 24, 2016: IEEE Transactions on Neural Networks and Learning Systems
Masaki Kobayashi
A complex-valued Hopfield neural network (CHNN) is a model of a Hopfield neural network using multistate neurons. The stability conditions of CHNNs have been widely studied. A CHNN with a synchronous mode will converge to a fixed point or a cycle of length 2. A rotor Hopfield neural network (RHNN) is also a model of a multistate Hopfield neural network. RHNNs have much higher storage capacity and noise tolerance than CHNNs. We extend the theories regarding the stability of CHNNs to RHNNs. In addition, we investigate the stability of RHNNs with the projection rule...
December 29, 2016: IEEE Transactions on Neural Networks and Learning Systems