Read by QxMD

Attractor network

Zhuocheng Xiao, Jiwei Zhang, Andrew T Sornborger, Louis Tao
Line attractors in neuronal networks have been suggested to be the basis of many brain functions, such as working memory, oculomotor control, head movement, locomotion, and sensory processing. In this paper, we make the connection between line attractors and pulse gating in feed-forward neuronal networks. In this context, because of their neutral stability along a one-dimensional manifold, line attractors are associated with a time-translational invariance that allows graded information to be propagated from one neuronal population to the next...
November 2017: Physical Review. E
Peter Ashwin, Jennifer Creaser, Krasimira Tsaneva-Atanasova
It is well known that the addition of noise to a multistable dynamical system can induce random transitions from one stable state to another. For low noise, the times between transitions have an exponential tail and Kramers' formula gives an expression for the mean escape time in the asymptotic limit. If a number of multistable systems are coupled into a network structure, a transition at one site may change the transition properties at other sites. We study the case of escape from a "quiescent" attractor to an "active" attractor in which transitions back can be ignored...
November 2017: Physical Review. E
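The noise-induced escape described above can be reproduced in a few lines. A minimal Euler-Maruyama sketch (not the authors' code, and for a single site rather than a network): escape from the left well of the double-well potential V(x) = x^4/4 - x^2/2, whose escape times have the exponential tail that Kramers' formula describes.

```python
import numpy as np

def escape_time(noise=0.7, dt=1e-3, rng=None):
    """Euler-Maruyama for dx = (x - x**3) dt + noise dW, started in the
    left well at x = -1, run until first crossing of the barrier at x = 0."""
    rng = rng or np.random.default_rng()
    sdt = noise * np.sqrt(dt)
    x, t = -1.0, 0.0
    while x < 0.0:
        x += (x - x**3) * dt + sdt * rng.standard_normal()
        t += dt
    return t

rng = np.random.default_rng(0)
times = [escape_time(rng=rng) for _ in range(50)]
# Kramers: for weak noise the mean escape time grows like
# exp(2 * dU / noise**2), with barrier height dU = 1/4 here.
print(np.mean(times), np.std(times))
```

For an exponential distribution the standard deviation is comparable to the mean, which the printed statistics roughly reflect; coupling several such units would modify each site's effective barrier, as the paper studies.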
R I M Dunbar, Padraig Mac Carron, Susanne Shultz
Primate groups vary considerably in size across species. Nonetheless, the distribution of mean species group size has a regular scaling pattern with preferred sizes approximating 2.5, 5, 15, 30 and 50 individuals (although strepsirrhines lack the latter two), with a scaling ratio of approximately 2.5 similar to that observed in human social networks. These clusters appear to form distinct social grades that are associated with rapid evolutionary change, presumably in response to intense environmental selection pressures...
January 2018: Biology Letters
Christopher J Hillar, Ngoc M Tran
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size...
January 16, 2018: Journal of Mathematical Neuroscience
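The classical auto-associative behaviour referenced above is easy to sketch. Below is a minimal Hopfield network with Hebbian outer-product storage and asynchronous threshold updates; it is the textbook model, not the exponential-capacity construction the paper is after.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 64, 3                        # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(k, n))

# Hebbian outer-product storage; zero diagonal (no self-coupling)
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=20):
    """Asynchronous +/-1 threshold updates; converges to a fixed point
    (an attractor) because the symmetric weights define an energy function."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt 10% of the first pattern's bits, then recall it
noisy = patterns[0].copy()
flip = rng.choice(n, size=n // 10, replace=False)
noisy[flip] *= -1
print((recall(noisy) == patterns[0]).mean())   # fraction of bits recovered
```

With k well below the ~0.14n capacity of Hebbian storage, the corrupted pattern falls back into the stored attractor; the open problem in the paper is making the number of such noise-tolerant memories grow exponentially in n.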
Călin-Adrian Popa, Eva Kaslik
The existence of multiple exponentially stable equilibrium states and periodic solutions is investigated for Hopfield-type quaternion-valued neural networks (QVNNs) with impulsive effects and both time-dependent and distributed delays. Employing Brouwer's and Leray-Schauder's fixed point theorems, suitable Lyapunov functionals and impulsive control theory, sufficient conditions are given for the existence of 16^n attractors, showing a substantial improvement in storage capacity, compared to real-valued or complex-valued neural networks...
December 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
Arnau Montagud, Pauline Traynard, Loredana Martignetti, Eric Bonnet, Emmanuel Barillot, Andrei Zinovyev, Laurence Calzone
Mathematical models can serve as a tool to formalize biological knowledge from diverse sources, to investigate biological questions in a formal way, to test experimental hypotheses, to predict the effect of perturbations and to identify underlying mechanisms. We present a pipeline of computational tools that performs a series of analyses to explore a logical model's properties. A logical model of initiation of the metastatic process in cancer is used as a transversal example. We start by analysing the structure of the interaction network constructed from the literature or existing databases...
December 8, 2017: Briefings in Bioinformatics
Edmund T Rolls
A quantitative computational theory of the operation of the hippocampus as an episodic memory system is described. The CA3 system operates as a single attractor or autoassociation network (1) to enable rapid one-trial associations between any spatial location (place in rodents or spatial view in primates) and an object or reward and (2) to provide for completion of the whole memory during recall from any part. The theory is extended to associations between time and object or reward to implement temporal order memory, which is also important in episodic memory...
December 7, 2017: Cell and Tissue Research
Oliver L C Rourke, Daniel A Butts
The ability of sensory networks to transiently store information on the scale of seconds can confer many advantages in processing time-varying stimuli. Storage on such intermediate time scales, between typical neurophysiological time scales and those of long-term memory, is typically attributed to persistent neural activity. An alternative mechanism that might support such storage is temporary modification of the neural connectivity, decaying on the same second-long time scale as the underlying memories...
2017: PloS One
Taiping Zeng, Bailu Si
It is a challenge to build a robust simultaneous localization and mapping (SLAM) system in dynamic, large-scale environments. Inspired by recent findings in the entorhinal-hippocampal neuronal circuits, we propose a cognitive mapping model that includes continuous attractor networks of head-direction cells and conjunctive grid cells to integrate velocity information by conjunctive encodings of space and movement. Visual inputs from the local view cells in the model provide feedback cues to correct drifting errors of the attractors caused by the noisy velocity inputs...
2017: Frontiers in Neurorobotics
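The continuous attractor networks of head-direction cells mentioned above can be sketched as a ring attractor: local excitation plus global inhibition sustains a self-stabilizing bump of activity whose position encodes heading. A minimal rate-model sketch (parameters and kernel are illustrative assumptions, not the paper's model):

```python
import numpy as np

n = 120                                   # head-direction cells on a ring
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
d = theta[:, None] - theta[None, :]
W = (4.0 * np.cos(d) - 2.0) / n           # local excitation, global inhibition

r = 0.1 * np.cos(theta - np.pi)           # small bump seeded at heading pi
for _ in range(300):
    r = r + 0.1 * (-r + np.tanh(W @ r))   # Euler step of the rate dynamics

peak = theta[np.argmax(r)]                # bump self-amplifies and persists
print(peak)
```

The cosine mode of the recurrent kernel has gain above one, so the small seed grows into a saturated bump that stays centred at pi with no further input; in a full model, velocity signals would shift the bump and visual cues would correct its drift.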
Parul Maheshwari, Réka Albert
BACKGROUND: Cellular behaviors are governed by interaction networks among biomolecules, for example gene regulatory and signal transduction networks. An often used dynamic modeling framework for these networks, Boolean modeling, can obtain their attractors (which correspond to cell types and behaviors) and their trajectories from an initial state (e.g. a resting state) to the attractors, for example in response to an external signal. The existing methods however do not elucidate the causal relationships between distant nodes in the network...
December 6, 2017: BMC Systems Biology
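Finding the attractors of a Boolean model, as described above, amounts to iterating the update rules until a state repeats. A minimal sketch with synchronous updates on a hypothetical three-node negative-feedback network (the network and node names are illustrative, not from the paper):

```python
from itertools import product

# Hypothetical 3-node Boolean network: A activates B, B activates C,
# C inhibits A (a negative feedback loop, so no steady state exists).
rules = {
    "A": lambda s: not s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: s["B"],
}

def step(state):
    """Synchronous update: every node applies its rule at once."""
    return {node: rule(state) for node, rule in rules.items()}

def attractor_from(state):
    """Iterate until a state repeats; the cycle reached is the attractor."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return seen[seen.index(state):]       # the repeating cycle

# Enumerate all 2^3 initial states and collect the distinct attractors
attractors = set()
for bits in product([False, True], repeat=3):
    cycle = attractor_from(dict(zip("ABC", bits)))
    key = min(tuple(sorted(s.items())) for s in cycle)   # canonical label
    attractors.add((len(cycle), key))

for length, rep in sorted(attractors):
    print("cycle length", length)
```

Here the negative feedback rules out fixed points, so every trajectory ends in a limit cycle; the paper's contribution is going beyond such attractor enumeration to expose causal relationships between distant nodes.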
Minsoo Choi, Jue Shi, Yanting Zhu, Ruizhen Yang, Kwang-Hyun Cho
Cancer is a complex disease involving multiple genomic alterations that disrupt the dynamic response of signaling networks. The heterogeneous nature of cancer, which results in highly variable drug response, is a major obstacle to developing effective cancer therapy. Previous studies of cancer therapeutic response mostly focus on static analysis of genome-wide alterations, so they are unable to unravel the dynamic, network-specific origin of variation. Here we present a network dynamics-based approach that integrates cancer genomics with the dynamics of biological networks to predict drug response and design drug combinations...
December 5, 2017: Nature Communications
Peter Csermely
I hypothesize that re-occurring prior experience of complex systems mobilizes a fast response, whose attractor is encoded by their strongly connected network core. In contrast, responses to novel stimuli are often slow and require the weakly connected network periphery. Upon repeated stimulus, peripheral network nodes remodel the network core that encodes the attractor of the new response. This "core-periphery learning" theory reviews and generalizes the heretofore fragmented knowledge on attractor formation by neural networks, periphery-driven innovation, and a number of recent reports on the adaptation of protein, neuronal, and social networks...
November 23, 2017: BioEssays: News and Reviews in Molecular, Cellular and Developmental Biology
Dylan R Muir
Recurrent neural network architectures can have useful computational properties, with complex temporal dynamics and input-sensitive attractor states. However, evaluation of recurrent dynamic architectures requires solving systems of differential equations, and the number of evaluations required to determine their response to a given input can vary with the input or can be indeterminate altogether in the case of oscillations or instability. In feedforward networks, by contrast, only a single pass through the network is needed to determine the response to a given input...
November 21, 2017: Neural Computation
Anthony Szedlak, Spencer Sims, Nicholas Smith, Giovanni Paternostro, Carlo Piermarocchi
Modern time series gene expression and other omics data sets have enabled unprecedented resolution of the dynamics of cellular processes such as cell cycle and response to pharmaceutical compounds. In anticipation of the proliferation of time series data sets in the near future, we use the Hopfield model, a recurrent neural network based on spin glasses, to model the dynamics of cell cycle in HeLa (human cervical cancer) and S. cerevisiae cells. We study some of the rich dynamical properties of these cyclic Hopfield systems, including the ability of populations of simulated cells to recreate experimental expression data and the effects of noise on the dynamics...
November 17, 2017: PLoS Computational Biology
Meng-Li Zheng, Nai-Kang Zhou, De-Liang Huang, Cheng-Hua Luo
PURPOSE: The purpose of this study was to explore the pathway crosstalk and key pathways in non-small cell lung cancer (NSCLC) to better understand the underlying pathological mechanism. METHODS: Gene expression data, pathway data and protein-protein interaction (PPI) data were integrated to identify pathway regulatory interactions in NSCLC and to construct the background and disease pathway crosstalk networks, respectively. In this work, the attractor method was implemented to identify the differential pathways, and the rank product (RP) algorithm was used to determine the importance of pathways...
September 2017: Journal of B.U.ON.: Official Journal of the Balkan Union of Oncology
Michael E Hasselmo, James R Hinman, Holger Dannenberg, Chantal E Stern
Episodic memory involves coding of the spatial location and time of individual events. Coding of space and time is also relevant to working memory, spatial navigation, and the disambiguation of overlapping memory representations. Neurophysiological data demonstrate that neuronal activity codes the current, past and future location of an animal as well as temporal intervals within a task. Models have addressed how neural coding of space and time for memory function could arise, with both dimensions coded by the same neurons...
October 2017: Current Opinion in Behavioral Sciences
Alexandre A P Rodrigues
In the framework of the generalized Lotka-Volterra model, solutions representing multispecies sequential competition are predictable with high probability. In this paper, we show that this occurs because the corresponding "heteroclinic channel" forms part of an attractor. We prove that, generically, in an attracting heteroclinic network involving a finite number of hyperbolic and non-resonant saddle-equilibria whose linearization has only real eigenvalues, the connections corresponding to the most positive expanding eigenvalues form part of an attractor (observable in numerical simulations)...
October 2017: Chaos
Thomas Rost, Moritz Deger, Martin P Nawrot
Balanced networks are a frequently employed basic model for neuronal networks in the mammalian neocortex. Large numbers of excitatory and inhibitory neurons are recurrently connected so that the numerous positive and negative inputs that each neuron receives cancel out on average. Neuronal firing is therefore driven by fluctuations in the input and resembles the irregular and asynchronous activity observed in cortical in vivo data. Recently, the balanced network model has been extended to accommodate clusters of strongly interconnected excitatory neurons in order to explain persistent activity in working memory-related tasks...
October 26, 2017: Biological Cybernetics
Lorenz Gönner, Julien Vitay, Fred H Hamker
Hippocampal place-cell sequences observed during awake immobility often represent previous experience, suggesting a role in memory processes. However, recent reports of goals being overrepresented in sequential activity suggest a role in short-term planning, although a detailed understanding of the origins of hippocampal sequential activity and of its functional role is still lacking. In particular, it is unknown which mechanism could support efficient planning by generating place-cell sequences biased toward known goal locations, in an adaptive and constructive fashion...
2017: Frontiers in Computational Neuroscience
Minkyu Choi, Jun Tani
This letter proposes a novel predictive coding type neural network model, the predictive multiple spatiotemporal scales recurrent neural network (P-MSTRNN). The P-MSTRNN learns to predict visually perceived human whole-body cyclic movement patterns by exploiting multiscale spatiotemporal constraints imposed on network dynamics by using differently sized receptive fields as well as different time constant values for each layer. After learning, the network can imitate target movement patterns by inferring or recognizing corresponding intentions by means of the regression of prediction error...
October 24, 2017: Neural Computation
