Read by QxMD

recurrent neural network

Thomas Miconi
Neural activity during cognitive tasks exhibits complex dynamics that flexibly encode task-relevant variables. Chaotic recurrent networks, which spontaneously generate rich dynamics, have been proposed as a model of cortical computation during cognitive tasks. However, existing methods for training these networks are either biologically implausible, and/or require a continuous, real-time error signal to guide learning. Here we show that a biologically plausible learning rule can train such recurrent networks, guided solely by delayed, phasic rewards at the end of each trial...
February 23, 2017: eLife
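The core idea above — a Hebbian-like update gated by a delayed, phasic reward rather than a continuous error signal — can be sketched in a few lines. This is a minimal node-perturbation illustration on a single weight, with illustrative constants; it is not the network or rule from the paper.

```python
import random
random.seed(0)

# Sketch of reward-modulated learning with a delayed, end-of-trial reward.
# All names and constants here are illustrative, not taken from the paper.
def run_trial(w, x=1.0, target=0.5, sigma=0.1):
    noise = random.gauss(0.0, sigma)   # exploratory perturbation
    y = w * x + noise                  # perturbed output
    reward = -(y - target) ** 2        # phasic reward, delivered at trial end
    return reward, noise

w, r_baseline, lr = 0.0, 0.0, 0.5
for trial in range(2000):
    r, noise = run_trial(w)
    # Hebbian-like update gated by reward relative to a running baseline:
    # no moment-to-moment error signal is needed, only the trial-end reward.
    w += lr * (r - r_baseline) * noise
    r_baseline += 0.05 * (r - r_baseline)

print(round(w, 3))  # drifts toward the target mapping of 0.5
```

The running baseline plays the role of a reward prediction, so the weight change is driven by the reward *difference*, a common ingredient of biologically plausible reinforcement rules.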
Louis Kim, Jacob Harer, Akshay Rangamani, James Moran, Philip D Parks, Alik Widge, Emad Eskandar, Darin Dougherty, Sang Peter Chin
We present a Recurrent Neural Network using LSTM (Long Short-Term Memory) that is capable of modeling and predicting Local Field Potentials. We train and test the network on real data recorded from epilepsy patients. We construct networks that predict multi-channel LFPs 1, 10, and 100 milliseconds forward in time. Our results show that prediction using LSTM outperforms regression when predicting 10 and 100 milliseconds forward in time.
August 2016: Conference Proceedings: Annual International Conference of the IEEE Engineering in Medicine and Biology Society
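For readers unfamiliar with the LSTM machinery the abstract relies on, a single gated step can be written out directly. The sketch below runs one LSTM cell over a dummy multi-channel signal; the sizes and random weights are illustrative stand-ins, not the trained networks from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate order in the stacked weights: input, forget, output, candidate."""
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g          # cell state carries long-term context
    h = o * np.tanh(c)         # hidden state feeds the next-step prediction
    return h, c

n_in, n_hid = 4, 8             # e.g. 4 LFP channels, 8 hidden units (toy sizes)
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for t in range(100):           # sweep over a dummy multi-channel signal
    x = np.sin(0.1 * t + np.arange(n_in))
    h, c = lstm_step(x, h, c, W, U, b)

print(np.all(np.abs(h) < 1.0))  # h = o * tanh(c), so |h| < 1 always
```

In a forecasting setup like the one described, a linear readout of `h` would produce the predicted LFP values 1, 10, or 100 ms ahead, trained against the recorded signal.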
(no author information available yet)
Reports an error in "Understanding the neural basis of cognitive bias modification as a clinical treatment for depression" by Akihiro Eguchi, Daniel Walters, Nele Peerenboom, Hannah Dury, Elaine Fox and Simon Stringer (Journal of Consulting and Clinical Psychology, Advanced Online Publication, Dec 19, 2016, np). In the article, there was an error in the Discussion section's first paragraph for Implications and Future Work. The in-text reference citation for Penton-Voak et al. (2013) was incorrectly listed as "Blumenfeld, Preminger, Sagi, and Tsodyks (2006)"...
March 2017: Journal of Consulting and Clinical Psychology
Kristy J Lawton, Wick M Perry, Ayako Yamaguchi, Erik Zornik
Central pattern generators (CPGs) are neural circuits that drive rhythmic motor output without sensory feedback. Vertebrate CPGs are generally believed to operate in a top-down manner in which premotor interneurons activate motor neurons that in turn drive muscles. In contrast, the frog (Xenopus laevis) vocal CPG contains a functionally unexplored neuronal projection from the motor nucleus to the premotor nucleus, indicating a recurrent pathway that may contribute to rhythm generation. In this study we characterized the function of this bottom-up connection...
February 20, 2017: Journal of Neuroscience: the Official Journal of the Society for Neuroscience
Xinyi Le, Jun Wang
This paper presents a two-time-scale neurodynamic approach to constrained minimax optimization using two coupled neural networks. One of the recurrent neural networks is used for minimizing the objective function and another is used for maximization. It is shown that the coupled neurodynamic systems operating in two different time scales work well for minimax optimization. The effectiveness and characteristics of the proposed approach are illustrated using several examples. Furthermore, the proposed approach is applied for H∞ model predictive control...
March 2017: IEEE Transactions on Neural Networks and Learning Systems
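The two-time-scale idea — one network descending the objective while a second, faster network ascends it — can be illustrated on a toy saddle-point problem. The objective below (min over x, max over y of x² + xy − y²) is a stand-in of my own choosing, not an example from the paper.

```python
# Sketch of two-time-scale gradient descent-ascent for the saddle-point
# problem min_x max_y f(x, y) = x^2 + x*y - y^2 (a toy convex-concave
# objective; the paper's coupled neurodynamic systems are continuous-time).
def minimax_gda(x=1.0, y=1.0, lr_min=0.01, lr_max=0.1, steps=2000):
    for _ in range(steps):
        gx = 2 * x + y        # df/dx: the minimizing network descends (slow scale)
        gy = x - 2 * y        # df/dy: the maximizing network ascends (fast scale)
        x -= lr_min * gx
        y += lr_max * gy
    return x, y

x, y = minimax_gda()
print(abs(x) < 1e-3 and abs(y) < 1e-3)  # converges to the saddle at (0, 0)
```

Running the maximizer on a faster time scale lets it track the inner max for each slowly changing x, which is the essence of the two-network coupling described in the abstract.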
Laurence T Hunt, Benjamin Y Hayden
Many accounts of reward-based choice argue for distinct component processes that are serial and functionally localized. In this Opinion article, we argue for an alternative viewpoint, in which choices emerge from repeated computations that are distributed across many brain regions. We emphasize how several features of neuroanatomy may support the implementation of choice, including mutual inhibition in recurrent neural networks and the hierarchical organization of timescales for information processing across the cortex...
February 17, 2017: Nature Reviews. Neuroscience
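Mutual inhibition as a choice mechanism is easy to demonstrate with two recurrently coupled rate units: the unit receiving slightly stronger evidence suppresses its competitor and wins. The dynamics and constants below are a generic winner-take-all toy, not the circuit model from the article.

```python
# Toy sketch of choice via mutual inhibition: two units receive slightly
# different evidence; recurrent inhibition amplifies the difference until
# one unit wins the competition. (Illustrative dynamics only.)
def choose(i1, i2, w_inh=2.0, tau=0.1, steps=500, dt=0.01):
    r1 = r2 = 0.0
    relu = lambda v: max(v, 0.0)
    for _ in range(steps):
        r1 += dt / tau * (-r1 + relu(i1 - w_inh * r2))
        r2 += dt / tau * (-r2 + relu(i2 - w_inh * r1))
    return 1 if r1 > r2 else 2

print(choose(1.0, 0.9))  # unit 1 wins
print(choose(0.9, 1.0))  # unit 2 wins
```

Because the inhibitory gain exceeds the leak, any small evidence difference grows until the loser is silenced — a categorical choice emerging from continuous recurrent dynamics.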
Yao-Zhong Zhang, Rui Yamaguchi, Seiya Imoto, Satoru Miyano
BACKGROUND: The recent success of deep learning techniques in machine learning and artificial intelligence has stimulated a great deal of interest among bioinformaticians, who now wish to bring the power of deep learning to bear on a host of bioinformatics problems. Deep learning is ideally suited for biological problems that require automatic or hierarchical feature representation for biological data when prior knowledge is limited. In this work, we address the sequence-specific bias correction problem for RNA-seq data using Recurrent Neural Networks (RNNs) to model nucleotide sequences without pre-determining sequence structures...
January 25, 2017: BMC Genomics
Sina Radke, Felix Hoffstaedter, Leonie Löffler, Lydia Kogler, Frank Schneider, Jens Blechert, Birgit Derntl
Reappraisal is a particularly effective strategy for influencing emotional experiences, specifically for reducing the impact of negative stimuli. Although depression has repeatedly been linked to dysfunctional behavioral and neural emotion regulation, prefrontal and amygdala engagement seems to vary with clinical characteristics and the specific regulation strategy used. Whereas previous neuroimaging research has focused on down-regulating reactions to emotionally evocative scenes, the current study compared up- and down-regulation in response to angry facial expressions in patients with depression and healthy individuals...
February 14, 2017: Brain Imaging and Behavior
Licheng Wang, Zidong Wang, Guoliang Wei, Fuad E Alsaadi
This paper deals with the event-based finite-time state estimation problem for a class of discrete-time stochastic neural networks with mixed discrete and distributed time delays. In order to mitigate the burden of data communication, a general component-based event-triggered transmission mechanism is proposed to determine whether the measurement output should be released to the estimator at a certain time point according to a specific triggering condition. A new concept of finite-time boundedness in the mean square is put forward to quantify the estimation performance by introducing a settling-like time function...
February 6, 2017: IEEE Transactions on Neural Networks and Learning Systems
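The communication-saving idea is that a measurement is transmitted only when a triggering condition fires. A minimal sketch of one such rule — transmit when the measurement deviates enough from the last transmitted value — is below; the threshold and data are illustrative, not the paper's component-based condition.

```python
# Sketch of an event-triggered transmission rule: a sensor releases a new
# measurement to the estimator only when it deviates sufficiently from the
# last transmitted value (threshold delta is illustrative).
def event_triggered(measurements, delta=0.5):
    sent = []
    last = None
    for k, y in enumerate(measurements):
        if last is None or abs(y - last) > delta:
            sent.append((k, y))   # triggering condition satisfied: transmit
            last = y
    return sent

ys = [0.0, 0.1, 0.2, 1.0, 1.1, 1.2, 0.3]
print(event_triggered(ys))  # → [(0, 0.0), (3, 1.0), (6, 0.3)]
```

Only three of the seven samples are released, which is precisely how such mechanisms trade estimation freshness for reduced channel load.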
Erik Marchi, Fabio Vesperini, Stefano Squartini, Björn Schuller
In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as an activation signal to detect novel events...
2017: Computational Intelligence and Neuroscience
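The detection step itself — flag a frame when its reconstruction error exceeds a threshold calibrated on errors seen during training — is independent of the autoencoder and can be sketched directly. The threshold rule and numbers below are illustrative, not from the paper.

```python
import statistics

# Sketch of the detection rule only: the autoencoder is omitted, and we
# assume per-frame reconstruction errors have already been computed.
def novelty_flags(train_errors, test_errors, n_sigmas=3.0):
    mu = statistics.mean(train_errors)
    sd = statistics.stdev(train_errors)
    thresh = mu + n_sigmas * sd     # calibrate on "normal" training errors
    return [e > thresh for e in test_errors]

train = [0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.08, 0.11]
test = [0.11, 0.10, 0.95, 0.12]    # third frame is an unexpected (novel) event
print(novelty_flags(train, test))  # → [False, False, True, False]
```

Because the autoencoder is trained only on normal acoustic material, novel events reconstruct poorly and stand out as error spikes like the third test frame here.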
S Raghu, N Sriraam, G Pradeep Kumar
The electroencephalogram (EEG) is considered a fundamental tool for the assessment of neural activity in the brain. In the cognitive neuroscience domain, EEG-based assessment is found to be superior due to its non-invasive ability to detect deep brain structures while exhibiting superior spatial resolution. Especially for studying the neurodynamic behavior of epileptic seizures, EEG recordings reflect the neuronal activity of the brain and thus provide the clinical diagnostic information required by the neurologist...
February 2017: Cognitive Neurodynamics
S Samarasinghe, H Ling
In this paper, we show how to extend our previously proposed novel continuous time Recurrent Neural Networks (RNN) approach that retains the advantage of continuous dynamics offered by Ordinary Differential Equations (ODE) while enabling parameter estimation through adaptation, to larger signalling networks using a modular approach. Specifically, the signalling network is decomposed into several sub-models based on important temporal events in the network. Each sub-model is represented by the proposed RNN and trained using data generated from the corresponding ODE model...
February 4, 2017: Bio Systems
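The building block being composed here is a continuous-time RNN (CTRNN), whose state evolves according to an ODE rather than a discrete map. A minimal forward-Euler integration of a two-unit CTRNN is sketched below; the weights and sizes are toy values, not one of the paper's signalling sub-models.

```python
import math

# Minimal continuous-time RNN (CTRNN) integrated with forward Euler:
# tau * ds_i/dt = -s_i + sum_j W[i][j] * tanh(s_j) + input_i
def ctrnn_step(state, weights, inputs, tau=1.0, dt=0.01):
    n = len(state)
    out = [math.tanh(s) for s in state]      # firing-rate nonlinearity
    new = []
    for i in range(n):
        net = sum(weights[i][j] * out[j] for j in range(n)) + inputs[i]
        new.append(state[i] + dt / tau * (-state[i] + net))
    return new

state = [0.0, 0.0]
W = [[0.0, -1.0], [1.0, 0.0]]     # a simple rotational coupling
for t in range(1000):
    state = ctrnn_step(state, W, inputs=[0.5, 0.0])
print(all(abs(s) < 2.0 for s in state))  # leak keeps the dynamics bounded
```

Because each unit is an explicit ODE, a trained CTRNN keeps the continuous dynamics of the underlying ODE model while its weights remain adaptable — the property the modular approach exploits.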
Guillaume Lajoie, Nedialko I Krouchev, John F Kalaska, Adrienne L Fairhall, Eberhard E Fetz
Experiments show that spike-triggered stimulation performed with Bidirectional Brain-Computer-Interfaces (BBCI) can artificially strengthen connections between separate neural sites in motor cortex (MC). When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between sites eventually strengthen. It was also found that effective spike-stimulus delays are consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules, suggesting that STDP is key to drive these changes...
February 2017: PLoS Computational Biology
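The STDP rule invoked above maps the pre/post spike-timing difference to a weight change: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise. The sketch uses typical textbook constants, not the experimentally derived rule from the paper.

```python
import math

# Sketch of a standard additive STDP window (constants are illustrative).
def stdp(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """dt_ms = t_post - t_pre in milliseconds."""
    if dt_ms > 0:   # pre before post -> strengthen (causal pairing)
        return a_plus * math.exp(-dt_ms / tau_plus)
    else:           # post before pre -> weaken (anti-causal pairing)
        return -a_minus * math.exp(dt_ms / tau_minus)

print(stdp(10.0) > 0)    # causal pairing potentiates
print(stdp(-10.0) < 0)   # anti-causal pairing depresses
```

Under such a window, a BBCI that stimulates the target site at a short fixed delay after each recorded spike repeatedly lands in the potentiation lobe, which is why well-chosen spike-stimulus delays can strengthen the connection.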
Leonhard Lücken, David P Rosin, Vasco M Worlitzer, Serhiy Yanchuk
We consider recurrent pulse-coupled networks of excitable elements with delayed connections, inspired by biological neural networks. If the delays are tuned appropriately, the network can either stay in the steady resting state or, alternatively, exhibit a desired spiking pattern. It is shown that such a network can be used as a pattern-recognition system. More specifically, the application of the correct pattern as an external input to the network leads to a self-sustained reverberation of the encoded pattern...
January 2017: Chaos
Lian Duan, Lihong Huang, Xianwen Fang
In this paper, we study the finite-time synchronization problem for recurrent neural networks with discontinuous activations and time-varying delays. Based on the finite-time convergence theory and by using the nonsmooth analysis technique, some finite-time synchronization criteria for the considered neural network model are established, which are new and complement some existing ones. The feasibility and effectiveness of the proposed synchronization method are supported by two examples with numerical simulations...
January 2017: Chaos
Rajesh Kumar, Smriti Srivastava, J R P Gupta
In this paper, adaptive control of nonlinear dynamical systems using a diagonal recurrent neural network (DRNN) is proposed. The structure of the DRNN is a modification of the fully connected recurrent neural network (FCRNN). The presence of self-recurrent neurons in the hidden layer of the DRNN gives it the ability to capture the dynamic behaviour of the nonlinear plant under consideration (to be controlled). To ensure stability, update rules are developed using the Lyapunov stability criterion. These rules are then used for adjusting the various parameters of the DRNN...
January 27, 2017: ISA Transactions
Nikolay Chenkov, Henning Sprekeler, Richard Kempter
Complex patterns of neural activity appear during up-states in the neocortex and sharp waves in the hippocampus, including sequences that resemble those during prior behavioral experience. The mechanisms underlying this replay are not well understood. How can small synaptic footprints engraved by experience control large-scale network activity during memory retrieval and consolidation? We hypothesize that sparse and weak synaptic connectivity between Hebbian assemblies is boosted by pre-existing recurrent connectivity within them...
January 2017: PLoS Computational Biology
Andrew Stephen Blaeser, Barry W Connors, Arto V Nurmikko
Cortical systems maintain and process information through the sustained activation of recurrent local networks of neurons. Layer 5 is known to have a major role in generating the recurrent activation associated with these functions, but relatively little is known about its intrinsic dynamics at the mesoscopic level of large numbers of neighboring neurons. Using calcium imaging, we measured the spontaneous activity of networks of deep-layer medial prefrontal cortical neurons in an acute slice model. Inferring the simultaneous activity of tens of neighboring neurons, we found that while the majority showed only sporadic activity, a subset of neurons engaged in sustained delta-frequency rhythmic activity...
January 25, 2017: Journal of Neurophysiology
Veronika Koren, Sophie Denève
Spontaneous activity is commonly observed in a variety of cortical states. Experimental evidence suggests that neural assemblies undergo slow oscillations with Up and Down states even when the network is isolated from the rest of the brain. Here we show that these spontaneous events can be generated by the recurrent connections within the network and understood as signatures of neural circuits that are correcting their internal representation. A noiseless spiking neural network can represent its input signals most accurately when excitatory and inhibitory currents are as strong and as tightly balanced as possible...
January 2017: PLoS Computational Biology
Liuwei Zhou, Quanyan Zhu, Zhijie Wang, Wuneng Zhou, Hongye Su
This paper discusses the problem of adaptive exponential synchronization in mean square for a new neural network model with the following features: 1) the noise is characterized by the Lévy process and the parameters of the model change in line with the Markovian process; 2) the master system is also disturbed by the same Lévy noise; and 3) there are multiple slave systems, and the state matrix of each slave system is an affine function of the state matrices of all slave systems. Based on the Lyapunov functional theory, the generalized Itô's formula, the M-matrix method, and the adaptive control technique, some criteria are established to ensure the adaptive exponential synchronization in the mean square of the master system and each slave system...
September 27, 2016: IEEE Transactions on Neural Networks and Learning Systems
