https://read.qxmd.com/read/38250671/stable-fixed-points-of-combinatorial-threshold-linear-networks
#21
JOURNAL ARTICLE
Carina Curto, Jesse Geneson, Katherine Morrison
Combinatorial threshold-linear networks (CTLNs) are a special class of recurrent neural networks whose dynamics are tightly controlled by an underlying directed graph. Recurrent networks have long been used as models for associative memory and pattern completion, with stable fixed points playing the role of stored memory patterns in the network. In prior work, we showed that target-free cliques of the graph correspond to stable fixed points of the dynamics, and we conjectured that these are the only stable fixed points possible [1, 2]...
March 2024: Advances in Applied Mathematics
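The CTLN setup described above admits a compact numerical sketch. The weight rule (off-diagonal entries -1+ε for graph edges, -1-δ otherwise, zero diagonal) and the threshold-linear dynamics follow the standard CTLN definition from the authors' earlier work; the parameter values and the example graph below are illustrative assumptions, not taken from this article.

```python
import numpy as np

def ctln_weights(A, eps=0.25, delta=0.5):
    """Build the CTLN weight matrix from a binary adjacency matrix A
    (A[i, j] = 1 iff j -> i). Off-diagonal entries are -1 + eps for
    edges and -1 - delta for non-edges; the diagonal is zero."""
    W = np.where(A == 1, -1.0 + eps, -1.0 - delta)
    np.fill_diagonal(W, 0.0)
    return W

def simulate_ctln(W, theta=1.0, x0=None, dt=0.01, steps=5000):
    """Euler-integrate the threshold-linear dynamics
    dx/dt = -x + [W x + theta]_+ ."""
    n = W.shape[0]
    x = np.full(n, 0.1) if x0 is None else x0.copy()
    for _ in range(steps):
        x += dt * (-x + np.maximum(W @ x + theta, 0.0))
    return x

# A 2-clique {0, 1} plus an isolated node 2: the clique is target-free,
# so the theory predicts a stable fixed point supported on it.
A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 0]])
x = simulate_ctln(ctln_weights(A))
```

For this example the trajectory settles onto a fixed point supported on the clique (the isolated node's activity decays to zero), consistent with the target-free-clique result the abstract describes.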
https://read.qxmd.com/read/38243526/information-content-in-continuous-attractor-neural-networks-is-preserved-in-the-presence-of-moderate-disordered-background-connectivity
#22
JOURNAL ARTICLE
Tobias Kühn, Rémi Monasson
Continuous attractor neural networks (CANN) form an appealing conceptual model for the storage of information in the brain. However, a drawback of CANN is that they require finely tuned interactions. Here we study the effect of quenched noise in the interactions on the coding of positional information within CANN. Using the replica method, we compute the Fisher information for a network with position-dependent input and recurrent connections composed of a short-range (in space) and a disordered component. We find that the loss in positional information is small for not-too-large disorder strength, indicating that CANN have a regime in which the advantageous effects of local connectivity on information storage outweigh the detrimental ones...
December 2023: Physical Review. E
https://read.qxmd.com/read/38241417/revealing-and-reshaping-attractor-dynamics-in-large-networks-of-cortical-neurons
#23
JOURNAL ARTICLE
Chen Beer, Omri Barak
Attractors play a key role in a wide range of processes including learning and memory. Due to recent innovations in recording methods, there is increasing evidence for the existence of attractor dynamics in the brain. Yet, our understanding of how these attractors emerge or disappear in a biological system is lacking. By following the spontaneous network bursts of cultured cortical networks, we are able to define a vocabulary of spatiotemporal patterns and show that they function as discrete attractors in the network dynamics...
January 19, 2024: PLoS Computational Biology
https://read.qxmd.com/read/38215388/number-of-attractors-in-the-critical-kauffman-model-is-exponential
#24
JOURNAL ARTICLE
T M A Fink, F C Sheldon
The Kauffman model is the archetypal model of genetic computation. It highlights the importance of criticality, at which many biological systems seem poised. In a series of advances, researchers have homed in on how the number of attractors in the critical regime grows with network size. But a definitive answer has remained elusive. We prove that, for the critical Kauffman model with connectivity one, the number of attractors grows at least, and at most, as (2/√e)^N. This is the first proof that the number of attractors in a critical Kauffman model grows exponentially...
December 29, 2023: Physical Review Letters
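For small networks, the attractor count discussed above can be checked by brute force. The sketch below builds a random connectivity-one Kauffman network restricted to the two non-constant Boolean functions of one input (copy and negate, which is what keeps K=1 critical), then iterates every one of the 2^N states to its cycle; this is a standard K=1 construction, not code from the paper.

```python
import itertools
import random

def kauffman_k1(n, seed=0):
    """Random connectivity-one Kauffman network: each node copies or
    negates a single randomly chosen input node."""
    rng = random.Random(seed)
    inputs = [rng.randrange(n) for _ in range(n)]
    negate = [rng.random() < 0.5 for _ in range(n)]
    def step(state):
        return tuple(state[inputs[i]] ^ negate[i] for i in range(n))
    return step

def count_attractors(step, n):
    """Iterate every state until it revisits one, then record the cycle
    it landed on by its lexicographically smallest state."""
    seen_cycles = set()
    for state in itertools.product((0, 1), repeat=n):
        trajectory = set()
        while state not in trajectory:
            trajectory.add(state)
            state = step(state)
        cycle = []          # walk once around the cycle containing `state`
        s = state
        while True:
            cycle.append(s)
            s = step(s)
            if s == state:
                break
        seen_cycles.add(min(cycle))
    return len(seen_cycles)

step = kauffman_k1(8)
num = count_attractors(step, 8)
```

Exhaustive enumeration is only feasible for small N, which is precisely why an analytic bound like the one proved in the paper is needed.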
https://read.qxmd.com/read/38181053/landscape-quantifies-the-intermediate-state-and-transition-dynamics-in-ecological-networks
#25
JOURNAL ARTICLE
Jinchao Lv, Jin Wang, Chunhe Li
Understanding the ecological mechanisms associated with collapse and restoration is especially critical for promoting harmonious coexistence between humans and nature. So far, it remains challenging to elucidate the mechanisms of stochastic dynamical transitions in ecological systems. Using a plant-pollinator network as an example, we quantified the energy landscape of the ecological system. The landscape displays multiple attractors characterizing the high-, low- and intermediate-abundance stable states. Interestingly, we detected the intermediate states under pollinator decline, and demonstrated the indispensable role of the intermediate state in state transitions...
January 5, 2024: PLoS Computational Biology
https://read.qxmd.com/read/38152377/boolean-model-of-the-gene-regulatory-network-of-pseudomonas-aeruginosa-ccbh4851
#26
JOURNAL ARTICLE
Márcia da Silva Chagas, Marcelo Trindade Dos Santos, Marcio Argollo de Menezes, Fabricio Alves Barbosa da Silva
INTRODUCTION: Pseudomonas aeruginosa infections are one of the leading causes of death in immunocompromised patients with cystic fibrosis, diabetes, and lung diseases such as pneumonia and bronchiectasis. Furthermore, P. aeruginosa is one of the main multidrug-resistant bacteria responsible for nosocomial infections worldwide, including the multidrug-resistant CCBH4851 strain isolated in Brazil. METHODS: One way to analyze its dynamic cellular behavior is through computational modeling of the gene regulatory network, which represents interactions between regulatory genes and their targets...
2023: Frontiers in Microbiology
https://read.qxmd.com/read/38133664/multistability-in-neural-systems-with-random-cross-connections
#27
JOURNAL ARTICLE
Jordan Breffle, Subhadra Mokashe, Siwei Qiu, Paul Miller
Neural circuits with multiple discrete attractor states could support a variety of cognitive tasks according to both empirical data and model simulations. We assess the conditions for such multistability in neural systems using a firing rate model framework, in which clusters of similarly responsive neurons are represented as single units that interact with each other through independent random connections. We explore the range of conditions in which multistability arises via recurrent input from other units while individual units, typically with some degree of self-excitation, lack sufficient self-excitation to become bistable on their own...
December 22, 2023: Biological Cybernetics
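A minimal version of the setup described above — sigmoidal rate units with self-excitation plus independent random cross-connections — can be probed by relaxing the network from many random initial conditions and collecting the distinct steady states it reaches. The gain function, self-weight, noise scale, and network size below are all illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def settle(W, r0, dt=0.1, steps=2000):
    """Euler-relax the rate dynamics dr/dt = -r + f(W r),
    with a sigmoidal gain f, to a steady state."""
    f = lambda u: 1.0 / (1.0 + np.exp(-4.0 * (u - 0.5)))
    r = r0.copy()
    for _ in range(steps):
        r += dt * (-r + f(W @ r))
    return r

rng = np.random.default_rng(0)
n = 8
w_self = 1.2   # illustrative self-excitation strength
# self-connections on the diagonal, independent random cross-connections elsewhere
W = w_self * np.eye(n) + rng.normal(0.0, 0.5, (n, n)) * (1 - np.eye(n))

# collect distinct steady states over random initial conditions
states = {tuple(np.round(settle(W, rng.random(n)), 2)) for _ in range(20)}
```

If more than one distinct steady state appears in `states`, the sampled network is multistable in the sense the abstract studies; sweeping `w_self` and the cross-connection variance maps out where that happens.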
https://read.qxmd.com/read/38128085/interpersonal-alignment-of-neural-evidence-accumulation-to-social-exchange-of-confidence
#28
JOURNAL ARTICLE
Jamal Esmaily, Sajjad Zabbah, Reza Ebrahimpour, Bahador Bahrami
Private, subjective beliefs about uncertainty have been found to have idiosyncratic computational and neural substrates; yet humans share such beliefs seamlessly and cooperate successfully. Bringing together decision making under uncertainty and interpersonal alignment in communication, in a discovery plus pre-registered replication design, we examined the neuro-computational basis of the relationship between privately held and socially shared uncertainty. Examining the confidence-speed-accuracy trade-off in uncertainty-ridden perceptual decisions under social vs isolated contexts, we found that shared (i...
December 21, 2023: ELife
https://read.qxmd.com/read/38117859/a-dynamic-attractor-network-model-of-memory-formation-reinforcement-and-forgetting
#29
JOURNAL ARTICLE
Marta Boscaglia, Chiara Gastaldi, Wulfram Gerstner, Rodrigo Quian Quiroga
Empirical evidence shows that memories that are frequently revisited are easy to recall, and that familiar items involve larger hippocampal representations than less familiar ones. In line with these observations, here we develop a modelling approach to provide a mechanistic hypothesis of how hippocampal neural assemblies evolve differently, depending on the frequency of presentation of the stimuli. For this, we added an online Hebbian learning rule, background firing activity, neural adaptation and heterosynaptic plasticity to a rate attractor network model, thus creating dynamic memory representations that can persist, increase or fade according to the frequency of presentation of the corresponding memory patterns...
December 20, 2023: PLoS Computational Biology
https://read.qxmd.com/read/38110424/online-real-time-learning-of-dynamical-systems-from-noisy-streaming-data
#30
JOURNAL ARTICLE
S Sinha, S P Nandanoori, D A Barajas-Solano
Recent advancements in sensing and communication facilitate obtaining high-frequency real-time data from various physical systems like power networks, climate systems, biological networks, etc. However, since the data are recorded by physical sensors, it is natural that the obtained data are corrupted by measurement noise. In this paper, we present a novel algorithm for online real-time learning of dynamical systems from noisy time-series data, which employs the Robust Koopman operator framework to mitigate the effect of measurement noise...
December 19, 2023: Scientific Reports
https://read.qxmd.com/read/38077097/precision-data-driven-modeling-of-cortical-dynamics-reveals-idiosyncratic-mechanisms-underlying-canonical-oscillations
#31
Matthew F Singh, Todd S Braver, Michael W Cole, ShiNung Ching
Task-free brain activity affords unique insight into the functional structure of brain network dynamics and is a strong marker of individual differences. In this work, we present an algorithmic optimization framework that makes it possible to directly invert and parameterize brain-wide dynamical-systems models involving hundreds of interacting brain areas, from single-subject time-series recordings. This technique provides a powerful neurocomputational tool for interrogating mechanisms underlying individual brain dynamics ("precision brain models") and making quantitative predictions...
December 2, 2023: bioRxiv
https://read.qxmd.com/read/38070404/network-attractors-and-nonlinear-dynamics-of-neural-computation
#32
REVIEW
Peter Ashwin, Muhammed Fadera, Claire Postlethwaite
The importance of understanding the nonlinear dynamics of neural systems, and the relation to cognitive systems more generally, has been recognised for a long time. Approaches that analyse neural systems in terms of attractors of autonomous networks can be successful in explaining system behaviours in the input-free case. Nonetheless, a computational system usually needs inputs from its environment to effectively solve problems, and this necessitates a non-autonomous framework where typically the effects of a changing environment can be studied...
December 8, 2023: Current Opinion in Neurobiology
https://read.qxmd.com/read/38060791/multi-scroll-attractor-and-its-broken-coexisting-attractors-in-cyclic-memristive-neural-network
#33
JOURNAL ARTICLE
Qiang Lai, Yidan Chen
This paper proposes a simple-structured memristive neural network, which incorporates self-connections of memristor synapses alongside both unidirectional and bidirectional connections. Different from other multi-scroll chaotic systems, this network structure has a more concise three-neuron structure. This simple memristive neural network can generate a number of multi-scroll attractors in manageable quantities and shows the characteristics of the coexisting attractors and amplitude control. In particular, when the parameters are changed, the coexisting attractors break up around the center of gravity into two centrosymmetric chaotic attractors...
August 1, 2023: Chaos
https://read.qxmd.com/read/38060789/the-attractor-structure-of-functional-connectivity-in-coupled-logistic-maps
#34
JOURNAL ARTICLE
Venetia Voutsa, Michail Papadopoulos, Vicky Papadopoulou Lesta, Marc-Thorsten Hütt
Stylized models of dynamical processes on graphs allow us to explore the relationships between network architecture and dynamics, a topic of relevance in a range of disciplines. One strategy is to translate dynamical observations into pairwise relationships of nodes, often called functional connectivity (FC), and quantitatively compare them with network architecture or structural connectivity (SC). Here, we start from the observation that for coupled logistic maps, SC/FC relationships vary strongly with coupling strength...
August 1, 2023: Chaos
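A common stylization of the SC/FC comparison above uses degree-normalized coupled logistic maps, with functional connectivity taken as the pairwise correlation of the node time series. The ring graph, coupling strength, and map parameter below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def coupled_logistic(A, r=3.8, sigma=0.3, steps=2000, transient=500, seed=1):
    """Iterate x_i(t+1) = (1 - sigma) f(x_i) + (sigma / k_i) sum_j A_ij f(x_j),
    with f(x) = r x (1 - x); returns the post-transient time series."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    k = A.sum(axis=1)          # node degrees (must be nonzero)
    x = rng.random(n)
    out = np.empty((steps - transient, n))
    for t in range(steps):
        fx = r * x * (1.0 - x)
        x = (1.0 - sigma) * fx + sigma * (A @ fx) / k
        if t >= transient:
            out[t - transient] = x
    return out

# SC: a ring of 6 nodes; FC: Pearson correlation of the node time series.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1
ts = coupled_logistic(A)
FC = np.corrcoef(ts.T)
```

Sweeping `sigma` and comparing `FC` against `A` reproduces, in miniature, the coupling-strength dependence of SC/FC relationships that the abstract describes.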
https://read.qxmd.com/read/38060773/a-multiplier-free-rulkov-neuron-under-memristive-electromagnetic-induction-dynamics-analysis-energy-calculation-and-circuit-implementation
#35
JOURNAL ARTICLE
Shaohua Zhang, Cong Wang, Hongli Zhang, Hairong Lin
Establishing a realistic and multiplier-free implemented biological neuron model is significant for recognizing and understanding natural firing behaviors, as well as advancing the integration of neuromorphic circuits. Importantly, memristors play a crucial role in constructing memristive neuron and network models by simulating synapses or electromagnetic induction. However, existing models lack the consideration of initial-boosted extreme multistability and its associated energy analysis. To this end, we propose a multiplier-free implementation of the Rulkov neuron model and utilize a periodic memristor to represent the electromagnetic induction effect, thereby achieving the biomimetic modeling of the non-autonomous memristive Rulkov (mRulkov) neuron...
August 1, 2023: Chaos
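The starting point of the construction above, the classic two-dimensional Rulkov map, can be sketched directly; the multiplier-free circuit and the memristive electromagnetic-induction term are the paper's contributions and are not reproduced here. The parameter values below are a commonly used chaotic-bursting regime, chosen as an assumption.

```python
import numpy as np

def rulkov(alpha=4.1, mu=0.001, sigma=-0.3, steps=10000):
    """Iterate the two-dimensional Rulkov map
        x <- alpha / (1 + x^2) + y
        y <- y - mu * (x - sigma)
    where x is the fast (membrane-like) variable and y the slow one."""
    x, y = -1.0, -3.0
    xs = np.empty(steps)
    for n in range(steps):
        x, y = alpha / (1.0 + x * x) + y, y - mu * (x - sigma)
        xs[n] = x
    return xs

xs = rulkov()
```

Plotting `xs` shows the map's characteristic spiking-bursting waveform; replacing terms with memristive couplings, as the paper does, modifies this baseline behavior.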
https://read.qxmd.com/read/38060177/the-search-for-system-s-parameters-statistical-and-dynamical-description-from-complex-network-analysis
#36
JOURNAL ARTICLE
Alessandro Giuliani
The integration of physical and biological science styles is key to facing the deluge of molecular-level information that is becoming a real threat to knowledge advancement. In this work, I indicate a possible integration path based on the network formalization of molecular knowledge from two different perspectives (here named flux and dynamical). Some theoretical and applied cases are presented, focusing on the different physical models implicit in the two network analysis approaches.
2024: Methods in Molecular Biology
https://read.qxmd.com/read/38056422/metabolic-energetics-underlying-attractors-in-neural-models
#37
JOURNAL ARTICLE
Richard B Buxton, Eric C Wong
Neural population modeling, including the role of neural attractors, is a promising tool for understanding many aspects of brain function. We propose a modeling framework to connect the abstract variables used in modeling to recent cellular level estimates of the bioenergetic costs of different aspects of neural activity, measured in ATP consumed per second per neuron. Based on recent work, an empirical reference for brain ATP use for the awake resting brain was estimated as ~2×10^9 ATP/s per neuron across several mammalian species...
December 6, 2023: Journal of Neurophysiology
https://read.qxmd.com/read/38047510/drive-specific-selection-in-multistable-mechanical-networks
#38
JOURNAL ARTICLE
Hridesh Kedia, Deng Pan, Jean-Jacques Slotine, Jeremy L England
Systems with many stable configurations abound in nature, both in living and inanimate matter, encoding a rich variety of behaviors. In equilibrium, a multistable system is more likely to be found in configurations with lower energy, but the presence of an external drive can alter the relative stability of different configurations in unexpected ways. Living systems are examples par excellence of metastable nonequilibrium attractors whose structure and stability are highly dependent on the specific form and pattern of the energy flow sustaining them...
December 7, 2023: Journal of Chemical Physics
https://read.qxmd.com/read/38024449/short-term-postsynaptic-plasticity-facilitates-predictive-tracking-in-continuous-attractors
#39
JOURNAL ARTICLE
Huilin Zhao, Sungchil Yang, Chi Chung Alan Fung
INTRODUCTION: The N-methyl-D-aspartate receptor (NMDAR) plays a critical role in synaptic transmission and is associated with various neurological and psychiatric disorders. Recently, a novel form of postsynaptic plasticity known as NMDAR-based short-term postsynaptic plasticity (STPP) has been identified. It has been suggested that long-lasting glutamate binding to NMDAR allows for the retention of input information in brain slices up to 500 ms, leading to response facilitation. However, the impact of STPP on the dynamics of neuronal populations remains unexplored...
2023: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/38014290/flow-field-inference-from-neural-data-using-deep-recurrent-networks
#40
Timothy Doyeon Kim, Thomas Zhihao Luo, Tankut Can, Kamesh Krishnamurthy, Jonathan W Pillow, Carlos D Brody
Computations involved in processes such as decision-making, working memory, and motor control are thought to emerge from the dynamics governing the collective activity of neurons in large populations. But the estimation of these dynamics remains a significant challenge. Here we introduce Flow-field Inference from Neural Data using deep Recurrent networks (FINDR), an unsupervised deep learning method that can infer low-dimensional nonlinear stochastic dynamics underlying neural population activity. Using population spike train data from frontal brain regions of rats performing an auditory decision-making task, we demonstrate that FINDR outperforms existing methods in capturing the heterogeneous responses of individual neurons...
November 16, 2023: bioRxiv