Read by QxMD

Journal of Mathematical Neuroscience

Markus Breit, Gillian Queisser
Neuronal calcium signals propagating by simple diffusion and reaction with mobile and stationary buffers are limited to cellular microdomains. The distance intracellular calcium signals can travel may be significantly increased by means of calcium-induced calcium release from internal calcium stores, notably the endoplasmic reticulum. The organelle, which can be thought of as a cell-within-a-cell, is able to sequester large amounts of cytosolic calcium ions via SERCA pumps and selectively release them into the cytosol through ryanodine receptor channels, leading to the formation of calcium waves...
July 13, 2018: Journal of Mathematical Neuroscience
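The confinement of buffered calcium signals described in this abstract can be illustrated with a minimal reaction-diffusion sketch (linearized buffering, illustrative units and parameters, not the paper's model): uptake at rate k gives a steady-state length constant lam = sqrt(D/k), so without store release a signal decays like exp(-x/lam).

```python
import numpy as np

# Illustrative units: D = diffusion coefficient, k = linearized buffer/pump
# uptake rate. Calcium entering at x = 0 decays over the length constant
# lam = sqrt(D / k) -- without store release, the signal stays local.
D, k = 1.0, 1.0
dx, dt = 0.05, 0.001            # dt < dx**2 / (2 * D) for explicit stability
u = np.zeros(201)               # domain [0, 10], i.e. 10 length constants
for _ in range(20000):          # relax du/dt = D u_xx - k u to steady state
    u[0] = 1.0                  # clamped calcium source (open channel)
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u[1:-1] += dt * (D * lap[1:-1] - k * u[1:-1])
# steady state: u(x) ~ exp(-x / lam) with lam = 1
```

At x = lam the concentration has already dropped to about e⁻¹ of the source value, which is the microdomain confinement the abstract refers to.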
Andrea K Barreiro, Cheng Ly
The structure of spiking activity in cortical networks has important implications for how the brain ultimately codes sensory signals. However, our understanding of how network and intrinsic cellular mechanisms affect spiking is still incomplete. In particular, whether cell pairs in a neural network show a positive (or no) relationship between pairwise spike count correlation and average firing rate is generally unknown. This relationship is important because it has been observed experimentally in some sensory systems, and it can enhance information in a common population code...
June 6, 2018: Journal of Mathematical Neuroscience
Elham Bayat Mokhtari, J Josh Lawrence, Emily F Stone
Neurons in a micro-circuit connected by chemical synapses can have their connectivity affected by the prior activity of the cells. The number of synapses available for releasing neurotransmitter can be decreased by repetitive activation through depletion of readily releasable neurotransmitter (NT), or increased through facilitation, where the probability of release of NT is increased by prior activation. These competing effects can create a complicated and subtle range of time-dependent connectivity. Here we investigate the probabilistic properties of facilitation and depression (FD) for a presynaptic neuron that is receiving a Poisson spike train of input...
May 29, 2018: Journal of Mathematical Neuroscience
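The facilitation-depression (FD) dynamics described above can be sketched with a Tsodyks-Markram-style event-driven model driven by a Poisson spike train (parameter values U, tau_f, tau_d are illustrative, not the paper's):

```python
import math
import random

def fd_synapse(spike_times, U=0.2, tau_f=0.5, tau_d=0.3):
    """Effective release of an FD synapse at the given spike times (seconds).

    A Tsodyks-Markram-style sketch (illustrative parameters):
    u -- release probability; decays to U, facilitated by U*(1-u) per spike
    R -- readily releasable NT fraction; recovers to 1, depleted per spike
    Returns u*R evaluated at each spike.
    """
    u, R = U, 1.0
    last_t, releases = None, []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u = U + (u - U) * math.exp(-dt / tau_f)      # facilitation decays
            R = 1.0 + (R - 1.0) * math.exp(-dt / tau_d)  # NT pool recovers
        rel = u * R
        releases.append(rel)
        R -= rel                  # depletion of released NT
        u += U * (1.0 - u)        # facilitation jump
        last_t = t
    return releases

# Drive with a 20 Hz Poisson spike train, 1 s long
random.seed(0)
spikes, t = [], random.expovariate(20.0)
while t < 1.0:
    spikes.append(t)
    t += random.expovariate(20.0)
rel = fd_synapse(spikes)
```

The competition the abstract mentions is visible in `rel`: closely spaced spikes are boosted by facilitation but drawn down by depletion, depending on the inter-spike intervals.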
Kathryn Hedrick, Kechen Zhang
The theory of attractor neural networks has been influential in our understanding of the neural processes underlying spatial, declarative, and episodic memory. Many theoretical studies focus on the inherent properties of an attractor, such as its structure and capacity. Relatively little is known about how an attractor neural network responds to external inputs, which often carry conflicting information about a stimulus. In this paper we analyze the behavior of an attractor neural network driven by two conflicting external inputs...
May 16, 2018: Journal of Mathematical Neuroscience
Cris R Hasan, Bernd Krauskopf, Hinke M Osinga
Many physiological phenomena have the property that some variables evolve much faster than others. For example, neuron models typically involve observable differences in time scales. The Hodgkin-Huxley model is well known for explaining the ionic mechanism that generates the action potential in the squid giant axon. Rubin and Wechselberger (Biol. Cybern. 97:5-32, 2007) nondimensionalized this model and obtained a singularly perturbed system with two fast, two slow variables, and an explicit time-scale ratio ε...
April 19, 2018: Journal of Mathematical Neuroscience
Carlo R Laing
We consider finite and infinite all-to-all coupled networks of identical theta neurons. Two types of synaptic interactions are investigated: instantaneous and delayed (via first-order synaptic processing). Extensive use is made of the Watanabe/Strogatz (WS) ansatz for reducing the dimension of networks of identical sinusoidally-coupled oscillators. As well as the degeneracy associated with the constants of motion of the WS ansatz, we also find continuous families of solutions for instantaneously coupled neurons, resulting from the reversibility of the reduced model and the form of the synaptic input...
February 5, 2018: Journal of Mathematical Neuroscience
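The theta neuron used in this paper has a standard closed form; a minimal single-neuron sketch (Euler integration, illustrative parameters, not the paper's coupled-network or WS-reduced setup):

```python
import math

def theta_spikes(I, theta0=0.0, dt=1e-3, T=50.0):
    """Euler-integrate the theta neuron dθ/dt = (1 - cos θ) + (1 + cos θ) I,
    counting a spike each time θ crosses π. For I > 0 the exact flow fires
    periodically with period π/√I; for I < 0 there is a stable rest state."""
    theta, spikes = theta0, 0
    for _ in range(int(round(T / dt))):
        new = theta + dt * ((1 - math.cos(theta)) + (1 + math.cos(theta)) * I)
        if theta < math.pi <= new:
            spikes += 1
        theta = new % (2 * math.pi)
    return spikes

# theta_spikes(1.0) fires roughly 50/π ≈ 16 times; theta_spikes(-0.1) is silent.
```

Networks of such units with sinusoidal coupling are exactly what the Watanabe/Strogatz ansatz reduces.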
Jehan Alswaihli, Roland Potthast, Ingo Bojak, Douglas Saddy, Axel Hutt
Understanding the neural field activity for realistic living systems is a challenging task in contemporary neuroscience. Neural fields have been studied and developed theoretically and numerically with considerable success over the past four decades. However, to make effective use of such models, we need to identify their constituents in practical systems. This includes the determination of model parameters and in particular the reconstruction of the underlying effective connectivity in biological tissues. In this work, we provide an integral equation approach to the reconstruction of the neural connectivity in the case where the neural activity is governed by a delay neural field equation...
February 5, 2018: Journal of Mathematical Neuroscience
Aurel A Lazar, Nikul H Ukani, Yiyin Zhou
We investigate the sparse functional identification of complex cells and the decoding of spatio-temporal visual stimuli encoded by an ensemble of complex cells. The reconstruction algorithm is formulated as a rank minimization problem that significantly reduces the number of sampling measurements (spikes) required for decoding. We also establish the duality between sparse decoding and functional identification and provide algorithms for identification of low-rank dendritic stimulus processors. The duality enables us to efficiently evaluate our functional identification algorithms by reconstructing novel stimuli in the input space...
January 18, 2018: Journal of Mathematical Neuroscience
Christopher J Hillar, Ngoc M Tran
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size...
January 16, 2018: Journal of Mathematical Neuroscience
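The auto-associative computation described in this abstract can be sketched with the textbook Hopfield setup (Hebbian outer-product training, synchronous sign updates; a generic illustration, not the paper's exponential-capacity construction):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule: W = (1/n) sum_mu p_mu p_mu^T, zero diagonal."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous McCulloch-Pitts updates sign(W s) until a fixed point."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])   # orthogonal to p1
W = train_hopfield(np.stack([p1, p2]))
noisy = p1.copy()
noisy[0] = -1                                  # corrupt one bit
out = recall(W, noisy)                         # converges back to p1
```

The stored patterns are attractors of the deterministic dynamics, which is the noise tolerance the open problem in the abstract asks to scale up.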
Koen Dijkstra, Yuri A Kuznetsov, Michel J A M van Putten, Stephan A van Gils
We present a simple rate-reduced neuron model that captures a wide range of complex, biologically plausible, and physiologically relevant spiking behavior. This includes spike-frequency adaptation, postinhibitory rebound, phasic spiking and accommodation, first-spike latency, and inhibition-induced spiking. Furthermore, the model can mimic different neuronal filter properties. It can be used to extend existing neural field models, adding more biological realism and yielding a richer dynamical structure. The model is based on a slight variation of the Rulkov map...
December 11, 2017: Journal of Mathematical Neuroscience
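The model in this paper is described as a slight variation of the Rulkov map; the base chaotic Rulkov map itself is a two-variable fast-slow iteration that is easy to reproduce (parameter values below are illustrative, not the paper's):

```python
import numpy as np

def rulkov(alpha=4.1, sigma=0.0, mu=0.001, n_steps=20000):
    """Iterate the chaotic Rulkov map:
       x_{n+1} = alpha / (1 + x_n**2) + y_n           (fast, spiking variable)
       y_{n+1} = y_n - mu * (x_n + 1) + mu * sigma    (slow modulation)
    For alpha > 4 the fast subsystem produces spiking-bursting activity."""
    x, y = -1.0, -2.9
    xs = np.empty(n_steps)
    for n in range(n_steps):
        x, y = alpha / (1.0 + x * x) + y, y - mu * (x + 1.0) + mu * sigma
        xs[n] = x
    return xs

xs = rulkov()   # xs alternates between quiescent and spiking episodes
```

The separation mu << 1 between the fast map and the slow drift of y is what lets such maps mimic adaptation and the other slow behaviors listed in the abstract.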
Maria Luisa Saggio, Andreas Spiegler, Christophe Bernard, Viktor K Jirsa
Bursting is a phenomenon found in a variety of physical and biological systems. For example, in neuroscience, bursting is believed to play a key role in the way information is transferred in the nervous system. In this work, we propose a model that, appropriately tuned, can display several types of bursting behaviors. The model contains two subsystems acting at different time scales. For the fast subsystem we use the planar unfolding of a high codimension singularity. In its bifurcation diagram, we locate paths that underlie the right sequence of bifurcations necessary for bursting...
December 2017: Journal of Mathematical Neuroscience
Bjørn Fredrik Nielsen
Point neuron models with a Heaviside firing rate function can be ill-posed. That is, the initial-condition-to-solution map might become discontinuous in finite time. If a Lipschitz continuous but steep firing rate function is employed, then standard ODE theory implies that such models are well-posed and can thus, approximately, be solved with finite precision arithmetic. We investigate whether the solution of this well-posed model converges to a solution of the ill-posed limit problem as the steepness parameter of the firing rate function tends to infinity...
December 2017: Journal of Mathematical Neuroscience
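The limit discussed in this abstract can be sketched for a single point neuron (a toy scalar model, not the paper's full analysis): away from the firing threshold, solutions with a steep Lipschitz rate function track the Heaviside solutions.

```python
import math

def integrate(f, u0, T=10.0, dt=1e-3):
    """Euler-integrate the scalar rate equation du/dt = -u + f(u)."""
    u = u0
    for _ in range(int(round(T / dt))):
        u += dt * (-u + f(u))
    return u

def heaviside(u):
    return 1.0 if u > 0.5 else 0.0

def sigmoid(k):
    """Lipschitz firing rate with steepness k; approaches Heaviside as k grows."""
    return lambda u: 1.0 / (1.0 + math.exp(-k * (u - 0.5)))

# Starting on either side of the threshold 0.5, the steep-sigmoid solution
# tracks the Heaviside solution:
for u0 in (0.8, 0.3):
    u_h = integrate(heaviside, u0)
    u_s = integrate(sigmoid(200.0), u0)
```

The delicate cases, and the subject of the paper, are initial data near the threshold, where the Heaviside model can lose continuous dependence on initial conditions.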
Eva Lang, Wilhelm Stannat
Neural field equations are used to describe the spatio-temporal evolution of the activity in a network of synaptically coupled populations of neurons in the continuum limit. Their heuristic derivation involves two approximation steps. Under the assumption that each population in the network is large, the activity is described in terms of a population average. The discrete network is then approximated by a continuum. In this article we make the two approximation steps explicit. Extending a model by Bressloff and Newby, we describe the evolution of the activity in a discrete network of finite populations by a Markov chain...
December 2017: Journal of Mathematical Neuroscience
Arthur S Sherman, Joon Ha
Low frequency firing is modeled by Type 1 neurons with a SNIC, but, because of the vertical slope of the square-root-like f-I curve, low f only occurs over a narrow range of I. When an adaptive current is added, however, the f-I curve is linearized, and low f occurs robustly over a large I range. Ermentrout (Neural Comput. 10(7):1721-1729, 1998) showed that this feature of adaptation paradoxically arises from the SNIC that is responsible for the vertical slope. We show, using a simplified Hindmarsh-Rose neuron with negative feedback acting directly on the adaptation current, that whereas a SNIC contributes to linearization, in practice linearization over a large interval may require strong adaptation strength...
December 2017: Journal of Mathematical Neuroscience
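The square-root-like f-I curve near a SNIC mentioned in this abstract can be checked with the quadratic integrate-and-fire neuron, whose interspike period has a closed form (a standard illustration; the reset/threshold values stand in for ±∞ and are illustrative):

```python
import math

def qif_rate(I, v_reset=-100.0, v_th=100.0):
    """Firing rate of the quadratic integrate-and-fire neuron dv/dt = v**2 + I
    (I > 0). Closed-form period:
       T = (atan(v_th/sqrt(I)) - atan(v_reset/sqrt(I))) / sqrt(I),
    so rate ~ sqrt(I)/pi near onset."""
    s = math.sqrt(I)
    T = (math.atan(v_th / s) - math.atan(v_reset / s)) / s
    return 1.0 / T

# rate(I) ~ sqrt(I)/pi, hence slope d(rate)/dI ~ 1/(2*pi*sqrt(I)) diverges
# as I -> 0+: the "vertical slope" of the f-I curve at the SNIC.
```

This vertical slope is why, without adaptation, low firing rates occupy only a narrow band of input currents.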
Yangyang Wang, Jonathan E Rubin
Neural networks generate a variety of rhythmic activity patterns, often involving different timescales. One example arises in the respiratory network in the pre-Bötzinger complex of the mammalian brainstem, which can generate the eupneic rhythm associated with normal respiration as well as recurrent low-frequency, large-amplitude bursts associated with sighing. Two competing hypotheses have been proposed to explain sigh generation: the recruitment of a neuronal population distinct from the eupneic rhythm-generating subpopulation or the reconfiguration of activity within a single population...
December 2017: Journal of Mathematical Neuroscience
Lawrence C Udeigwe, Paul W Munro, G Bard Ermentrout
The Bienenstock-Cooper-Munro (BCM) learning rule provides a simple setup for synaptic modification that combines a Hebbian product rule with a homeostatic mechanism that keeps the weights bounded. The homeostatic part of the learning rule depends on the time average of the post-synaptic activity and provides a sliding threshold that distinguishes between increasing and decreasing weights. There are, thus, two essential time scales in the BCM rule: a homeostatic time scale, and a synaptic modification time scale...
December 2017: Journal of Mathematical Neuroscience
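The two time scales in the BCM rule can be made concrete with a generic single-neuron sketch (a standard BCM-type form, not necessarily the paper's exact formulation; eta and tau_inv set the two rates):

```python
import numpy as np

def bcm_step(w, x, theta, eta=0.01, tau_inv=0.1):
    """One Euler step of a BCM-type rule (generic sketch). y = w.x; the
    Hebbian term y*(y - theta)*x potentiates when y > theta and depresses
    when y < theta; the sliding threshold theta relaxes toward y**2 on the
    slower, homeostatic time scale set by tau_inv."""
    y = float(w @ x)
    w = w + eta * y * (y - theta) * x
    theta = theta + tau_inv * (y * y - theta)
    return w, theta

# With the Hebbian part frozen (eta = 0), the homeostatic part alone drives
# theta toward the (here constant) value of y**2:
w, x, theta = np.array([1.0, 1.0]), np.array([0.5, 0.5]), 0.0
for _ in range(200):
    w, theta = bcm_step(w, x, theta, eta=0.0)
```

How the full dynamics depend on the ratio of these two rates is precisely the question the paper studies.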
Jonathan Cannon, Paul Miller
Homeostatic processes that provide negative feedback to regulate neuronal firing rates are essential for normal brain function. Indeed, multiple parameters of individual neurons, including the scale of afferent synapse strengths and the densities of specific ion channels, have been observed to change on homeostatic time scales to oppose the effects of chronic changes in synaptic input. This raises the question of whether these processes are controlled by a single slow feedback variable or multiple slow variables...
December 2017: Journal of Mathematical Neuroscience
Aytül Gökçe, Daniele Avitabile, Stephen Coombes
Continuum neural field equations model the large-scale spatio-temporal dynamics of interacting neurons on a cortical surface. They have been extensively studied, both analytically and numerically, on bounded as well as unbounded domains. Neural field models do not require the specification of boundary conditions. Relatively little attention has been paid to the imposition of neural activity on the boundary, or to its role in inducing patterned states. Here we redress this imbalance by studying neural field models of Amari type (posed on one- and two-dimensional bounded domains) with Dirichlet boundary conditions...
October 26, 2017: Journal of Mathematical Neuroscience
Anirban Nandi, Heinz Schättler, Jason T Ritt, ShiNung Ching
No abstract text is available yet for this article.
October 11, 2017: Journal of Mathematical Neuroscience
Andrea K Barreiro, J Nathan Kutz, Eli Shlizerman
We examine a family of random firing-rate neural networks in which we enforce the neurobiological constraint of Dale's Law: each neuron makes either excitatory or inhibitory connections onto its post-synaptic targets. We find that this constrained system may be described as a perturbation from a system with nontrivial symmetries. We analyze the symmetric system using the tools of equivariant bifurcation theory and demonstrate that the symmetry-implied structures remain evident in the perturbed system. In comparison, spectral characteristics of the network coupling matrix are relatively uninformative about the behavior of the constrained system...
October 10, 2017: Journal of Mathematical Neuroscience