Read by QxMD
Search results for: recurrent neural network
https://www.readbyqxmd.com/read/29340803/robust-exponential-memory-in-hopfield-networks
#1
Christopher J Hillar, Ngoc M Tran
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size...
January 16, 2018: Journal of Mathematical Neuroscience
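The attractor behavior this abstract refers to (store patterns with a Hebbian rule, then let sign-threshold dynamics clean up a noisy cue) can be sketched in a few lines of plain Python. This is a generic Hopfield network, not the exponential-capacity construction of the paper; all names and sizes are illustrative:

```python
def train_hopfield(patterns):
    # Hebbian learning: W[i][j] = average of x_i * x_j over patterns, zero diagonal
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, state, sweeps=10):
    # Deterministic sign-threshold updates; stored patterns are fixed points
    n = len(state)
    s = list(state)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, -1, 1, -1, 1, -1, 1, -1]
W = train_hopfield([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]                    # corrupt one bit of the cue
print(recall(W, noisy) == pattern)      # the noisy cue settles back onto the stored memory
```

The "noise-tolerant memory" in the abstract is exactly this: the corrupted cue lies in the basin of attraction of the stored pattern, so the dynamics restore it.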
https://www.readbyqxmd.com/read/29335031/respiration-pattern-variability-and-related-default-mode-network-connectivity-are-altered-in-remitted-depression
#2
Vera Eva Zamoscik, Stephanie Nicole Lyn Schmidt, Martin Fungisai Gerchen, Christos Samsouris, Christina Timm, Christine Kuehner, Peter Kirsch
BACKGROUND: Studies with healthy participants and patients with respiratory diseases suggest a relation between respiration and mood. The aim of the present analyses was to investigate whether emotionally challenged remitted depressed participants show higher respiration pattern variability (RPV) and whether this is related to mood, clinical outcome and increased default mode network connectivity. METHODS: To challenge participants, sad mood was induced with keywords of personal negative life events in individuals with remitted depression [recurrent major depressive disorder (rMDD), n = 30] and matched healthy controls (HCs, n = 30) during functional magnetic resonance imaging...
January 16, 2018: Psychological Medicine
https://www.readbyqxmd.com/read/29330489/decoding-hind-limb-kinematics-from-neuronal-activity-of-the-dorsal-horn-neurons-using-multiple-level-learning-algorithm
#3
Hamed Yeganegi, Yaser Fathi, Abbas Erfanian
Decoding continuous hind limb joint angles from sensory recordings of the neural system provides feedback for closed-loop control of hind limb movement using functional electrical stimulation. So far, many attempts have been made to extract sensory information from dorsal root ganglia and sensory nerves. In this work, we examine decoding joint angle trajectories from single-electrode extracellular recordings of the dorsal horn gray matter of the spinal cord during passive limb movement in anesthetized cats. A processing framework based on an ensemble learning approach is proposed to combine firing rate (FR) and interspike interval (ISI) information of the neuronal activity...
January 12, 2018: Scientific Reports
https://www.readbyqxmd.com/read/29319225/de-novo-design-of-bioactive-small-molecules-by-artificial-intelligence
#4
Daniel Merk, Lukas Friedrich, Francesca Grisoni, Gisbert Schneider
Generative artificial intelligence offers a fresh view on molecular design. We present the first-time prospective application of a deep learning model for designing new druglike compounds with desired activities. For this purpose, we trained a recurrent neural network to capture the constitution of a large set of known bioactive compounds represented as SMILES strings. By transfer learning, this general model was fine-tuned on recognizing retinoid X and peroxisome proliferator-activated receptor agonists. We synthesized five top-ranking compounds designed by the generative model...
January 10, 2018: Molecular Informatics
https://www.readbyqxmd.com/read/29318865/-advances-in-acupuncture-mechanism-research-on-the-changes-of-synaptic-plasticity-pain-memory-for-chronic-pain
#5
Yi-Ling Yang, Jian-Peng Huang, Li Jiang, Jian-Hua Liu
Previous studies have shown that there are many common structures between the neural network of pain and memory, and the main structure in the pain network is also part of the memory network. Chronic pain is characterized by recurrent attacks and is associated with persistent ectopic impulse, which causes changes in synaptic structure and function based on nerve activity. These changes may induce long-term potentiation of synaptic transmission, and ultimately lead to changes in the central nervous system to produce "pain memory"...
December 25, 2017: Zhen Ci Yan Jiu, Acupuncture Research
https://www.readbyqxmd.com/read/29306756/a-loop-based-neural-architecture-for-structured-behavior-encoding-and-decoding
#6
Thomas Gisiger, Mounir Boukadoum
We present a new type of artificial neural network that generalizes on anatomical and dynamical aspects of the mammal brain. Its main novelty lies in its topological structure which is built as an array of interacting elementary motifs shaped like loops. These loops come in various types and can implement functions such as gating, inhibitory or executive control, or encoding of task elements to name a few. Each loop features two sets of neurons and a control region, linked together by non-recurrent projections...
December 8, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29301111/a-deep-learning-framework-for-causal-shape-transformation
#7
Kin Gwn Lore, Daniel Stoecklein, Michael Davies, Baskar Ganapathysubramanian, Soumik Sarkar
Recurrent neural networks (RNNs) and long short-term memory (LSTM) networks are the common go-to architectures for exploiting sequential information where the output depends on a sequence of inputs. However, in most considered problems, the dependencies typically lie in the latent domain, which may not be suitable for applications involving the prediction of a step-wise transformation sequence that depends on the previous states only in the visible domain, with a known terminal state. We propose a hybrid architecture of convolutional neural networks (CNNs) and stacked autoencoders (SAEs) to learn a sequence of causal actions that nonlinearly transform an input visual pattern or distribution into a target visual pattern or distribution with the same support, and demonstrate its practicality in a real-world engineering problem involving the physics of fluids...
December 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/29300698/laplacian-echo-state-network-for-multivariate-time-series-prediction
#8
Min Han, Meiling Xu
The echo state network is a novel kind of recurrent neural network, with a trainable linear readout layer and a large, fixed, recurrently connected hidden layer, which can be used to map the rich dynamics of complex real-world data sets. It has been extensively studied in time series prediction. However, an ill-posed problem may arise when the number of real-world training samples is smaller than the size of the hidden layer. In this brief, a Laplacian echo state network (LAESN) is proposed to overcome the ill-posed problem and obtain low-dimensional output weights...
January 2018: IEEE Transactions on Neural Networks and Learning Systems
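The split described above, a large fixed random reservoir plus a trainable linear readout, reduces training to ordinary linear regression on the collected reservoir states. A minimal generic ESN sketch assuming NumPy (this is not the LAESN of the paper; sizes, scalings, and the sine-prediction task are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: only the linear readout W_out is ever trained.
n_res = 50
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius < 1 (echo state property)

def run_reservoir(u):
    """Drive the reservoir with scalar input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead prediction of a sine wave with a least-squares readout.
u = np.sin(0.1 * np.arange(300))
X = run_reservoir(u[:-1])           # states x_t ...
y = u[1:]                           # ... used to predict u_{t+1}
washout = 50                        # discard the initial transient
W_out, *_ = np.linalg.lstsq(X[washout:], y[washout:], rcond=None)
err = np.sqrt(np.mean((X[washout:] @ W_out - y[washout:]) ** 2))
print(err)                          # small RMSE: the linear readout suffices
```

The ill-posedness the abstract mentions shows up here when the number of training rows in `X` drops below `n_res`; the LAESN's contribution is a principled way to regularize that regression.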
https://www.readbyqxmd.com/read/29297322/cnn-blpred-a-convolutional-neural-network-based-predictor-for-%C3%AE-lactamases-bl-and-their-classes
#9
Clarence White, Hamid D Ismail, Hiroto Saigo, Dukka B Kc
BACKGROUND: The β-Lactamase (BL) enzyme family is an important class of enzymes that plays a key role in bacterial resistance to antibiotics. As the number of newly identified BL enzymes is increasing daily, it is imperative to develop a computational tool to classify newly identified BL enzymes into one of their classes. There are two types of classification of BL enzymes: Molecular Classification and Functional Classification. Existing computational methods address only Molecular Classification, and their performance is unsatisfactory...
December 28, 2017: BMC Bioinformatics
https://www.readbyqxmd.com/read/29289035/-force-learning-in-recurrent-neural-networks-as-data-assimilation
#10
Gregory S Duane
It is shown that the "FORCE" algorithm for learning in arbitrarily connected networks of simple neuronal units can be cast as a Kalman Filter, with a particular state-dependent form for the background error covariances. The resulting interpretation has implications for initialization of the learning algorithm, leads to an extension to include interactions between the weight updates for different neurons, and can represent relationships within groups of multiple target output signals.
December 2017: Chaos
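The Kalman-filter reading becomes concrete if one writes out the recursive least-squares update at the heart of FORCE, where the matrix P plays the role of the state-dependent error covariance the abstract refers to. A minimal NumPy sketch, using a random feature vector as a stand-in for the reservoir state; the dimensions and the target mapping are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Recursive least squares, the core of FORCE: the readout weights w are the
# filter state, and P is the running inverse-correlation (covariance) matrix.
n = 5
w_true = rng.normal(size=n)      # hypothetical target readout to be learned
w = np.zeros(n)
P = np.eye(n) / 1e-2             # P(0) = I / alpha, alpha = learning-rate regularizer

for _ in range(200):
    r = rng.normal(size=n)       # stand-in for the network's state vector r(t)
    e = w @ r - w_true @ r       # a priori error against the target output
    Pr = P @ r
    P -= np.outer(Pr, Pr) / (1.0 + r @ Pr)   # rank-1 covariance update
    w -= e * (P @ r)             # error-driven, covariance-scaled weight update

print(np.allclose(w, w_true, atol=1e-3))     # weights converge to the target mapping
```

The paper's extension corresponds to letting P couple weight updates across different output units instead of running one such filter per readout.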
https://www.readbyqxmd.com/read/29244814/forecasting-influenza-like-illness-dynamics-for-military-populations-using-neural-networks-and-social-media
#11
Svitlana Volkova, Ellyn Ayton, Katherine Porterfield, Courtney D Corley
This work is the first to take advantage of recurrent neural networks to predict influenza-like illness (ILI) dynamics from various linguistic signals extracted from social media data. Unlike other approaches that rely on timeseries analysis of historical ILI data and the state-of-the-art machine learning models, we build and evaluate the predictive power of neural network architectures based on Long Short Term Memory (LSTMs) units capable of nowcasting (predicting in "real-time") and forecasting (predicting the future) ILI dynamics in the 2011 - 2014 influenza seasons...
2017: PloS One
https://www.readbyqxmd.com/read/29240694/centralized-networks-to-generate-human-body-motions
#12
Sergei Vakulenko, Ovidiu Radulescu, Ivan Morozov, Andres Weber
We consider continuous-time recurrent neural networks as dynamical models for the simulation of human body motions. These networks consist of a few centers and many satellites connected to them. The centers evolve in time as periodical oscillators with different frequencies. The center states define the satellite neurons' states by a radial basis function (RBF) network. To simulate different motions, we adjust the parameters of the RBF networks. Our network includes a switching module that allows for turning from one motion to another...
December 14, 2017: Sensors
https://www.readbyqxmd.com/read/29236678/protein-protein-interaction-article-classification-using-a-convolutional-recurrent-neural-network-with-pre-trained-word-embeddings
#13
Sérgio Matos, Rui Antunes
Curation of protein interactions from scientific articles is an important task, since interaction networks are essential for understanding biological processes associated with disease or pharmacological action, for example. However, the increase in the number of publications that potentially contain relevant information turns this into a very challenging and expensive task. In this work we used a convolutional recurrent neural network to identify relevant articles for extracting information regarding protein interactions...
December 13, 2017: Journal of Integrative Bioinformatics
https://www.readbyqxmd.com/read/29232710/stabilizing-patterns-in-time-neural-network-approach
#14
Nadav Ben-Shushan, Misha Tsodyks
Recurrent and feedback networks are capable of holding dynamic memories. Nonetheless, training a network for that task is challenging, because one must contend with non-linear propagation of errors through the system: small deviations from the desired dynamics due to error or inherent noise can have a dramatic effect later on. A method to cope with these difficulties is thus needed. In this work we focus on recurrent networks with linear activation functions and a binary output unit. We characterize their ability to reproduce a temporal sequence of actions over the output unit...
December 12, 2017: PLoS Computational Biology
https://www.readbyqxmd.com/read/29223869/nonlinear-recurrent-neural-networks-for-finite-time-solution-of-general-time-varying-linear-matrix-equations
#15
Lin Xiao, Bolin Liao, Shuai Li, Ke Chen
In order to solve general time-varying linear matrix equations (LMEs) more efficiently, this paper proposes two nonlinear recurrent neural networks based on two nonlinear activation functions. According to Lyapunov theory, the two nonlinear recurrent neural networks are proved to converge in finite time. Besides, by solving a differential equation, the upper bounds of the finite convergence time are determined analytically. Compared with existing recurrent neural networks, the two proposed nonlinear recurrent neural networks have a better convergence property (i...
December 2, 2017: Neural Networks: the Official Journal of the International Neural Network Society
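For the time-invariant special case, the idea of a recurrent network whose dynamics drive the residual of A X = B to zero can be sketched with plain gradient dynamics. The paper's networks apply nonlinear activation functions to the error term to obtain finite-time rather than exponential convergence; this NumPy toy keeps the linear version, and the matrices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(3, 3))
A = np.eye(3) + 0.25 * M @ M.T     # hypothetical well-conditioned coefficient matrix
B = rng.normal(size=(3, 3))

# Euler-integrate the neural dynamics dX/dt = -gamma * A^T (A X - B),
# whose unique equilibrium is the solution of A X = B.
X = np.zeros((3, 3))
gamma, dt = 1.0, 0.05              # design gain and integration step size
for _ in range(4000):
    E = A @ X - B                  # residual of the matrix equation
    X -= dt * gamma * (A.T @ E)

print(np.allclose(A @ X, B, atol=1e-6))   # X has converged to the solution
```

Replacing `E` above with a nonlinear activation such as a signed power of the residual is what yields the finite-time bounds derived in the paper.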
https://www.readbyqxmd.com/read/29218894/mri-to-mgmt-predicting-methylation-status-in-glioblastoma-patients-using-convolutional-recurrent-neural-networks
#16
Lichy Han, Maulik R Kamdar
Glioblastoma Multiforme (GBM), a malignant brain tumor, is among the most lethal of all cancers. Temozolomide is the primary chemotherapy treatment for patients diagnosed with GBM. The methylation status of the promoter or the enhancer regions of the O6-methylguanine methyltransferase (MGMT) gene may impact the efficacy and sensitivity of temozolomide, and hence may affect overall patient survival. Microscopic genetic changes may manifest as macroscopic morphological changes in the brain tumors that can be detected using magnetic resonance imaging (MRI), which can serve as noninvasive biomarkers for determining methylation of MGMT regulatory regions...
2018: Pacific Symposium on Biocomputing
https://www.readbyqxmd.com/read/29214684/pharmacotherapies-for-apnoea-of-prematurity-time-to-pause-and-consider-targeted-sex-specific-strategies
#17
Ken D O'Halloran, Fiona B McDonald
Developmental plasticity in the neural network orchestrating respiratory control is such that pre-term birth is associated with the elaboration of dysrhythmic breathing patterns characterized by periodic suppression of the central drive to breathe, evoking recurrent respiratory pauses termed central apnoeas.
December 6, 2017: Experimental Physiology
https://www.readbyqxmd.com/read/29203897/flexible-timing-by-temporal-scaling-of-cortical-responses
#18
Jing Wang, Devika Narain, Eghbal A Hosseini, Mehrdad Jazayeri
Musicians can perform at different tempos, speakers can control the cadence of their speech, and children can flexibly vary their temporal expectations of events. To understand the neural basis of such flexibility, we recorded from the medial frontal cortex of nonhuman primates trained to produce different time intervals with different effectors. Neural responses were heterogeneous, nonlinear, and complex, and they exhibited a remarkable form of temporal invariance: firing rate profiles were temporally scaled to match the produced intervals...
December 4, 2017: Nature Neuroscience
https://www.readbyqxmd.com/read/29203867/sensory-stream-adaptation-in-chaotic-networks
#19
Adam Ponzi
Implicit expectations induced by predictable stimuli sequences affect neuronal response to upcoming stimuli at both single cell and neural population levels. Temporally regular sensory streams also phase entrain ongoing low frequency brain oscillations but how and why this happens is unknown. Here we investigate how random recurrent neural networks without plasticity respond to stimuli streams containing oddballs. We found the neuronal correlates of sensory stream adaptation emerge if networks generate chaotic oscillations which can be phase entrained by stimulus streams...
December 4, 2017: Scientific Reports
https://www.readbyqxmd.com/read/29191207/medical-subdomain-classification-of-clinical-notes-using-a-machine-learning-based-natural-language-processing-approach
#20
Wei-Hung Weng, Kavishwar B Wagholikar, Alexa T McCray, Peter Szolovits, Henry C Chueh
BACKGROUND: The medical subdomain of a clinical note, such as cardiology or neurology, is useful content-derived metadata for developing machine learning downstream applications. To classify the medical subdomain of a note accurately, we have constructed a machine learning-based natural language processing (NLP) pipeline and developed medical subdomain classifiers based on the content of the note. METHODS: We constructed the pipeline using the clinical NLP system, clinical Text Analysis and Knowledge Extraction System (cTAKES), the Unified Medical Language System (UMLS) Metathesaurus, Semantic Network, and learning algorithms to extract features from two datasets - clinical notes from Integrating Data for Analysis, Anonymization, and Sharing (iDASH) data repository (n = 431) and Massachusetts General Hospital (MGH) (n = 91,237), and built medical subdomain classifiers with different combinations of data representation methods and supervised learning algorithms...
December 1, 2017: BMC Medical Informatics and Decision Making