Read by QxMD

recurrent neural network

André David Kovac, Maximilian Koall, Gordon Pipa, Hazem Toutounji
Delays are ubiquitous in biological systems, ranging from genetic regulatory networks and synaptic conductances to predator/prey population interactions. Evidence is mounting not only for the presence of delays as physical constraints on signal propagation speed, but also for their functional role in providing dynamical diversity to the systems that contain them. The latter observation in biological systems inspired the recent development of a computational architecture that harnesses this dynamical diversity by delay-coupling a single nonlinear element to itself...
2016: PloS One
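A minimal sketch of such a delay-coupled architecture, a single tanh node whose delayed output is fed back, with the delay line sampled at "virtual node" positions, can be written in a few lines. All parameter values and the simplified coupling below are illustrative assumptions, not the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Time-delay reservoir sketch: one nonlinear node, N virtual nodes along
# the delay line, and a fixed random input mask (all values assumed).
N = 50                                    # virtual nodes per delay period
mask = rng.choice([-1.0, 1.0], size=N)    # fixed random input mask
alpha, beta = 0.8, 0.5                    # feedback and input gains

def run_reservoir(u):
    """Drive the delay-coupled node with input sequence u; return states."""
    T = len(u)
    x = np.zeros((T, N))                  # virtual-node states per input step
    prev = np.zeros(N)
    for t in range(T):
        for i in range(N):
            # each virtual node sees its own delayed state plus masked input
            x[t, i] = np.tanh(alpha * prev[i] + beta * mask[i] * u[t])
        prev = x[t]
    return x

states = run_reservoir(rng.standard_normal(200))
print(states.shape)  # (200, 50)
```

A linear readout trained on `states` would then perform the actual computation, as in standard reservoir computing.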
Qikang Wei, Tao Chen, Ruifeng Xu, Yulan He, Lin Gui
The recognition of disease and chemical named entities in scientific articles is an important subtask of information extraction in the biomedical domain. Due to the diversity and complexity of disease names, recognizing disease named entities is considerably harder than recognizing chemical names. Although some remarkable chemical named entity recognition systems are available online, such as ChemSpot and tmChem, publicly available recognition systems for disease named entities are rare. This article presents a system for disease named entity recognition (DNER) and normalization...
2016: Database: the Journal of Biological Databases and Curation
Sitian Qin, Yadong Liu, Xiaoping Xue, Fuqiang Wang
This paper presents a neurodynamic approach with a recurrent neural network for solving convex optimization problems with general constraints. It is proved that, for any initial point, the state of the proposed neural network reaches the constraint set in finite time and finally converges to an optimal solution of the convex optimization problem. In contrast to existing related neural networks, the convergence rate of the state of the proposed neural network can be calculated quantitatively via the Łojasiewicz exponent under some mild assumptions...
September 9, 2016: Neural Networks: the Official Journal of the International Neural Network Society
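The flavor of such neurodynamic solvers can be illustrated with a discretized projection dynamic: the state follows the negative gradient of the objective and is projected back onto the constraint set at every step. This is a generic sketch under a simple box constraint, not the paper's specific network:

```python
import numpy as np

# Euler-discretized projection dynamics for min f(x) s.t. x in a box
# (an illustrative sketch, not the paper's exact model).
def project_box(x, lo, hi):
    return np.clip(x, lo, hi)

def solve(grad_f, x0, lo, hi, eta=0.1, steps=500):
    x = x0.astype(float)
    for _ in range(steps):
        x = project_box(x - eta * grad_f(x), lo, hi)  # gradient step + projection
    return x

# Example: minimize ||x - c||^2 over [0, 1]^3; the optimum is c clipped to the box.
c = np.array([1.5, -0.3, 0.4])
x_star = solve(lambda x: 2 * (x - c), np.zeros(3), 0.0, 1.0)
print(np.round(x_star, 3))  # ≈ [1.0, 0.0, 0.4]
```

The continuous-time networks in the literature replace this Euler loop with an ODE whose equilibria are the constrained optima.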
Yin Sheng, Yi Shen, Mingfu Zhu
This paper deals with the global exponential stability of delayed recurrent neural networks (DRNNs). By constructing an augmented Lyapunov-Krasovskii functional and adopting the reciprocally convex combination approach and the Wirtinger-based integral inequality, delay-dependent global exponential stability criteria are derived in terms of linear matrix inequalities. Meanwhile, a general and effective method for global exponential stability analysis of DRNNs is given through a lemma, where the exponential convergence rate can be estimated...
September 29, 2016: IEEE Transactions on Neural Networks and Learning Systems
Namita Multani, Frank Rudzicz, Wing Yiu Stephanie Wong, Aravind Kumar Namasivayam, Pascal van Lieshout
Purpose: Random item generation (RIG) involves central executive functioning. Measuring aspects of random sequences can therefore provide a simple method to complement other tools for cognitive assessment. We examine the extent to which RIG relates to specific measures of cognitive function, and whether those measures can be estimated using RIG only. Method: Twelve healthy older adults (age: M = 70.3 years, SD = 4.9; 8 women and 4 men) and 20 healthy young adults (age: M = 24 years, SD = 4...
September 28, 2016: Journal of Speech, Language, and Hearing Research: JSLHR
Axel Hutt, Andreas Mierau, Jérémie Lefebvre
Oscillatory brain activity is believed to play a central role in neural coding. Accumulating evidence shows that features of these oscillations are highly dynamic: power, frequency, and phase fluctuate alongside changes in behavior and task demands. The role and mechanisms supporting this variability are, however, poorly understood. We here analyze a network of recurrently connected spiking neurons with time delay displaying stable synchronous dynamics. Using mean-field and stability analyses, we investigate the influence of dynamic inputs on the frequency of firing rate oscillations...
2016: PloS One
Subhrajit Roy, Arindam Basu
In this letter, we propose a novel neuro-inspired, low-resolution, online unsupervised learning rule to train the reservoir, or liquid, of liquid state machines. The liquid is a large, sparsely interconnected recurrent network of spiking neurons. The proposed learning rule is inspired by structural plasticity and trains the liquid by forming and eliminating synaptic connections. Hence, the learning involves rewiring of the reservoir connections, similar to the structural plasticity observed in biological neural networks...
September 14, 2016: Neural Computation
Yuwei Cui, Subutai Ahmad, Jeff Hawkins
The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory has recently been proposed as a theoretical framework for sequence learning in the cortex. In this letter, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show the model is able to continuously learn a large number of variable-order temporal sequences using an unsupervised Hebbian-like learning rule...
September 14, 2016: Neural Computation
Masoud Amiri, Mahmood Amiri, Soheila Nazari, Karim Faez
Hyper-synchronous neural oscillations are characteristic of several neurological diseases, such as epilepsy. On the other hand, glial cells, and particularly astrocytes, can influence neural synchronization. Therefore, based on recent research, a new bio-inspired stimulator is proposed, which is essentially a dynamical model derived from the biophysics of astrocytes. The performance of the new stimulator is investigated on a large-scale cortical network. Both excitatory and inhibitory synapses are considered in the simulated spiking neural network...
September 13, 2016: Journal of Theoretical Biology
Igor Farkaš, Radomír Bosák, Peter Gergeľ
Reservoir computing became very popular due to its potential for the efficient design of recurrent neural networks, exploiting the computational properties of the reservoir structure. Various approaches, ranging from appropriate reservoir initialization to its optimization by training, have been proposed. In this paper, we extend our previous work and focus on short-term memory capacity, introduced by Jaeger in the context of echo state networks. Memory capacity has previously been shown to peak at criticality, when the network switches from a stable to an unstable dynamic regime...
November 2016: Neural Networks: the Official Journal of the International Neural Network Society
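Jaeger's short-term memory capacity can be estimated numerically: train one linear readout per delay k to reconstruct the input from k steps ago, and sum the squared correlation coefficients over delays. A self-contained sketch, in which the reservoir size, spectral radius, and input scaling are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Echo state network with assumed parameters.
N, T, max_delay = 100, 2000, 40
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius 0.9
w_in = rng.uniform(-0.1, 0.1, N)

# Drive the reservoir with i.i.d. uniform input.
u = rng.uniform(-1, 1, T)
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

# Memory capacity: sum of squared correlations between each delayed input
# and its least-squares linear reconstruction from the reservoir state.
washout = 100
mc = 0.0
for k in range(1, max_delay + 1):
    X, y = x[washout:, :], np.roll(u, k)[washout:]  # y[t] = u[t - k]
    w = np.linalg.lstsq(X, y, rcond=None)[0]        # linear readout for delay k
    r = np.corrcoef(X @ w, y)[0, 1]
    mc += r ** 2
print(f"memory capacity ≈ {mc:.1f}  (theoretical upper bound is N = {N})")
```

Sweeping the spectral radius in this sketch toward 1 is the standard way to observe the peak at criticality that the abstract refers to.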
Philippe Vincent-Lamarre, Guillaume Lajoie, Jean-Philippe Thivierge
A large body of experimental and theoretical work on neural coding suggests that the information stored in brain circuits is represented by time-varying patterns of neural activity. Reservoir computing, where the activity of a recurrently connected pool of neurons is read by one or more units that provide an output response, successfully exploits this type of neural activity. However, the question of system robustness to small structural perturbations, such as failing neurons and synapses, has been largely overlooked...
September 2, 2016: Journal of Computational Neuroscience
Pierre Baldi, Peter Sadowski
In a physical neural system, where storage and processing are intimately intertwined, the rules for adjusting the synaptic weights can only depend on variables that are available locally, such as the activity of the pre- and post-synaptic neurons, resulting in local learning rules. A systematic framework for studying the space of local learning rules is obtained by first specifying the nature of the local variables, and then the functional form that ties them together into each learning rule. Such a framework also enables the systematic discovery of new learning rules and the exploration of relationships between learning rules and group symmetries...
November 2016: Neural Networks: the Official Journal of the International Neural Network Society
Po-Lung Tien
In this paper, we propose a novel discrete-time recurrent neural network aimed at solving a new class of multi-constrained K-winner-take-all (K-WTA) problems. By employing specially designed asymmetric neuron weights, the proposed model is capable of operating in a fully parallel manner, thereby allowing true digital implementation. This paper also provides theorems that delineate the theoretical upper bound of the convergence latency, which is merely O(K). Importantly, simulations show that the average convergence time is close to O(1) in most general cases...
August 26, 2016: IEEE Transactions on Neural Networks and Learning Systems
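The K-WTA computation itself can be illustrated with a simple iterative scheme in which a global inhibition level is tuned (here by bisection) until exactly K units stay active; this is a generic sketch, not the paper's asymmetric-weight network:

```python
import numpy as np

# K-winner-take-all via a global inhibition level z, tuned by bisection
# (illustrative only; the paper's parallel network differs).
def k_wta(u, K, iters=60):
    lo, hi = u.min() - 1.0, u.max() + 1.0
    for _ in range(iters):
        z = 0.5 * (lo + hi)
        if (u > z).sum() > K:
            lo = z                      # too many winners: raise inhibition
        else:
            hi = z                      # K or fewer: lower inhibition
    return (u > hi).astype(int)         # hi always admits at most K winners

y = k_wta(np.array([0.3, 0.9, 0.1, 0.7, 0.5]), K=2)
print(y)  # [0 1 0 1 0]
```

For distinct inputs, the bisection pins the threshold between the K-th and (K+1)-th largest values, so exactly K units win.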
Khalid Raza, Mansaf Alam
One of the exciting problems in systems biology research is to decipher how the genome controls the development of complex biological systems. Gene regulatory networks (GRNs) help in the identification of regulatory interactions between genes and offer fruitful information about the functional role of individual genes in a cellular system. Discovering GRNs leads to a wide range of applications, including the identification of disease-related pathways that provide tentative novel drug targets, the prediction of disease response, and assistance in diagnosing various diseases, including cancer...
August 16, 2016: Computational Biology and Chemistry
Stefan L Frank, Hartmut Fitz
Prior language input is not lost but integrated with the current input. This principle is demonstrated by "reservoir computing": Untrained recurrent neural networks project input sequences onto a random point in high-dimensional state space. Earlier inputs can be retrieved from this projection, albeit less reliably so as more input is received. The bottleneck is therefore not "Now-or-Never" but "Sooner-is-Better."
January 2016: Behavioral and Brain Sciences
Sitian Qin, Xinyi Le, Jun Wang
This paper presents a neurodynamic optimization approach to bilevel quadratic programming (BQP). Based on the Karush-Kuhn-Tucker (KKT) theorem, the BQP problem is reduced to a one-level mathematical program subject to complementarity constraints (MPCC). It is proved that the global solution of the MPCC is the minimal one among the optimal solutions to multiple convex optimization subproblems. A recurrent neural network is developed for solving these convex optimization subproblems. From any initial state, the state of the proposed neural network converges to an equilibrium point of the neural network, which is precisely the optimal solution of the convex optimization subproblem...
August 19, 2016: IEEE Transactions on Neural Networks and Learning Systems
Chung-Chuan Lo, Xiao-Jing Wang
Automatic responses enable us to react quickly and effortlessly, but they often need to be inhibited so that an alternative, voluntary action can take place. To investigate the brain mechanism of controlled behavior, we studied a biologically based network model of spiking neurons for inhibitory control. In contrast to a simple race between pro- versus anti-response, our model incorporates a sensorimotor remapping module and an action-selection module endowed with a "Stop" process through tonic inhibition...
August 2016: PLoS Computational Biology
Caigen Zhou, Xiaoqin Zeng, Chaomin Luo, Huaguang Zhang
In this paper, local bipolar auto-associative memories are presented based on discrete recurrent neural networks with a class of gain-type activation functions. The weight parameters of the neural networks are obtained from a set of inequalities, without a learning procedure. Global exponential stability criteria are established to ensure the accuracy of the restored patterns in the presence of time delays and external inputs. The proposed methodology is capable of effectively avoiding spurious memory patterns and achieving (2m)ⁿ memory capacity...
August 2, 2016: IEEE Transactions on Neural Networks and Learning Systems
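For contrast, a classical Hopfield-style bipolar auto-associative memory, with Hebbian outer-product weights rather than the inequality-derived weights used in the paper, can be sketched as follows:

```python
import numpy as np

# Classical Hopfield-style bipolar auto-associative memory (illustrative).
def store(patterns):
    n = patterns.shape[1]
    W = patterns.T @ patterns / n         # Hebbian outer-product rule
    np.fill_diagonal(W, 0.0)              # no self-connections
    return W

def recall(W, probe, steps=10):
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)   # synchronous sign update
    return s

P = np.array([[1, -1, 1, -1, 1, -1],
              [1, 1, 1, -1, -1, -1]])
W = store(P)
noisy = P[0].copy()
noisy[0] = -noisy[0]                      # corrupt one bit of the first pattern
recovered = recall(W, noisy)
print(recovered)                          # recovers the stored pattern P[0]
```

The stored patterns are fixed points of the update, so the corrupted probe is pulled back to the nearest stored pattern.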
Edward Choi, Andy Schuetz, Walter F Stewart, Jimeng Sun
OBJECTIVE: We explored whether using deep learning to model temporal relations among events in electronic health records (EHRs) would improve model performance in predicting the initial diagnosis of heart failure (HF) compared to conventional methods that ignore temporality. MATERIALS AND METHODS: Data were from a health system's EHR on 3884 incident HF cases and 28 903 controls, identified as primary care patients, between May 16, 2000, and May 23, 2013. Recurrent neural network (RNN) models using gated recurrent units (GRUs) were adapted to detect relations among time-stamped events (e.g., disease diagnosis, medication orders, procedure orders, etc...
August 13, 2016: Journal of the American Medical Informatics Association: JAMIA
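The GRU update that such models are built on can be written compactly in NumPy; the dimensions and random weights below are illustrative, not those of the study's model, and biases are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Minimal GRU cell over a sequence of event embeddings (shapes assumed).
class GRUCell:
    def __init__(self, d_in, d_h):
        s = 0.1
        self.Wz = rng.normal(0, s, (d_h, d_in + d_h))  # update gate
        self.Wr = rng.normal(0, s, (d_h, d_in + d_h))  # reset gate
        self.Wh = rng.normal(0, s, (d_h, d_in + d_h))  # candidate state

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                      # how much to update
        r = sigmoid(self.Wr @ xh)                      # how much history to reuse
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1 - z) * h + z * h_tilde

cell = GRUCell(d_in=8, d_h=16)
h = np.zeros(16)
for x in rng.standard_normal((30, 8)):   # 30 time-stamped events, 8-dim each
    h = cell.step(x, h)
print(h.shape)  # (16,)
```

The final hidden state summarizes the event history and would feed a classifier head in a prediction model of this kind.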
Jeremy M Barry, Gregory L Holmes
The epileptic encephalopathies are devastating conditions characterized by frequent seizures, severely abnormal electroencephalograms (EEGs), and cognitive slowing or regression. The cognitive impairment in the epileptic encephalopathies may be more concerning to the patient and parents than the epilepsy itself. There is increasing recognition that the cognitive comorbidity can be both chronic, primarily due to the underlying etiology of the epilepsy, and dynamic or evolving because of recurrent seizures, interictal spikes, and antiepileptic drugs...
November 2016: Journal of Child Neurology