Neural Networks: the Official Journal of the International Neural Network Society

https://www.readbyqxmd.com/read/28410513/representation-learning-via-dual-autoencoder-for-recommendation
#1
Fuzhen Zhuang, Zhiqiang Zhang, Mingda Qian, Chuan Shi, Xing Xie, Qing He
Recommendation has attracted a vast amount of attention and research in recent decades. Most previous works employ matrix factorization techniques to learn the latent factors of users and items, and many subsequent works incorporate external information, e.g., users' social relationships and items' attributes, to improve recommendation performance within the matrix factorization framework. However, matrix factorization methods may not make full use of the limited information in rating or check-in matrices, and so achieve unsatisfactory results...
March 27, 2017: Neural Networks: the Official Journal of the International Neural Network Society
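For context, a minimal sketch of the matrix factorization baseline the abstract refers to (not the authors' dual-autoencoder model): each user and item is mapped to a latent factor vector, and the inner product of the two vectors approximates the observed rating. Function names and hyperparameters below are illustrative.

import numpy as np

def factorize(ratings, k=16, lr=0.01, reg=0.05, epochs=50, seed=0):
    # Plain matrix factorization trained by SGD over observed (user, item, rating)
    # triples; P and Q hold the latent factors of users and items.
    rng = np.random.default_rng(seed)
    n_users = 1 + max(u for u, _, _ in ratings)
    n_items = 1 + max(i for _, i, _ in ratings)
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:
            pu = P[u].copy()
            err = r - pu @ Q[i]                    # error on one observed rating
            P[u] += lr * (err * Q[i] - reg * pu)   # gradient steps with L2 regularization
            Q[i] += lr * (err * pu - reg * Q[i])
    return P, Q

# toy usage: predict an unobserved rating as a dot product of latent factors
P, Q = factorize([(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0)])
print(P[1] @ Q[1])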
https://www.readbyqxmd.com/read/28388472/robust-fixed-time-synchronization-for-uncertain-complex-valued-neural-networks-with-discontinuous-activation-functions
#2
Xiaoshuai Ding, Jinde Cao, Ahmed Alsaedi, Fuad E Alsaadi, Tasawar Hayat
This paper is concerned with fixed-time synchronization for a class of complex-valued neural networks in the presence of discontinuous activation functions and parameter uncertainties. Fixed-time synchronization not only requires that the considered master-slave system achieve synchronization within a finite time, but also requires a uniform upper bound on that time for all initial synchronization errors. To accomplish fixed-time synchronization, a novel feedback control procedure is designed for the slave neural networks...
March 23, 2017: Neural Networks: the Official Journal of the International Neural Network Society
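As a reading aid, the generic definition of fixed-time synchronization alluded to above can be stated as follows (a standard formulation, not the paper's specific criteria): the synchronization error e(t) of the master-slave system must vanish within a settling time T(e(0)) that is bounded by a constant independent of the initial error,

\lim_{t \to T(e(0))} \|e(t)\| = 0, \qquad e(t) = 0 \ \text{for } t \ge T(e(0)), \qquad \sup_{e(0)} T(e(0)) \le T_{\max} < \infty.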
https://www.readbyqxmd.com/read/28390225/persistent-irregular-activity-is-a-result-of-rebound-and-coincident-detection-mechanisms-a-computational-study
#3
Mustafa Zeki, Ahmed A Moustafa
Persistent irregular activity is defined as elevated irregular neural discharge in the brain such that, while the average network activity displays high-frequency oscillations, the participating neurons display irregular, low-frequency oscillations. This type of activity is observed in many brain regions, such as the prefrontal cortex, which plays a role in working memory. Previous studies have shown that large networks with sparse connections, networks with strong noise and persistent inhibition, and networks with structured synaptic connections display persistent irregular activity...
March 22, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28385624/weighted-spatial-based-geometric-scheme-as-an-efficient-algorithm-for-analyzing-single-trial-eegs-to-improve-cue-based-bci-classification
#4
Fatemeh Alimardani, Reza Boostani, Benjamin Blankertz
There is growing interest in analyzing the geometric behavior of the electroencephalogram (EEG) covariance matrix in the context of brain-computer interfaces (BCI). The bottleneck of the current Riemannian framework is the bias of the mean vector of the EEG signals toward noisy trials, which degrades the covariance matrix in the manifold space. This study presents a spatial weighting scheme to reduce the effect of noisy trials on the mean vector. To assess the proposed method, dataset IIa from BCI competition IV, containing EEG trials of 9 subjects performing four mental tasks, was used...
March 22, 2017: Neural Networks: the Official Journal of the International Neural Network Society
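A hedged illustration of the kind of weighting described above: down-weighting noisy trials when averaging per-trial covariance matrices. This sketch uses a simple Euclidean weighted mean with illustrative weights; the paper itself works within the Riemannian framework, so treat this only as a toy baseline.

import numpy as np

def weighted_mean_covariance(trials, weights=None):
    # trials: array of shape (n_trials, n_channels, n_samples)
    # weights: per-trial weights (e.g. smaller for noisy trials); uniform if None.
    covs = np.array([np.cov(t) for t in trials])   # one covariance matrix per trial
    if weights is None:
        weights = np.ones(len(covs))
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                 # normalize weights
    return np.tensordot(w, covs, axes=1)            # sum_i w_i * C_i

# toy usage: 5 trials, 3 channels, 100 samples, last trial heavily down-weighted
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3, 100))
C = weighted_mean_covariance(X, weights=[1, 1, 1, 1, 0.1])
print(C.shape)  # (3, 3)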
https://www.readbyqxmd.com/read/28396068/evaluating-deep-learning-architectures-for-speech-emotion-recognition
#5
Haytham M Fayek, Margaret Lech, Lawrence Cavedon
Speech Emotion Recognition (SER) can be regarded as a static or dynamic classification problem, which makes SER an excellent test bed for investigating and comparing various deep learning architectures. We describe a frame-based formulation of SER that relies on minimal speech processing and end-to-end deep learning to model intra-utterance dynamics. We use the proposed SER system to empirically explore feed-forward and recurrent neural network architectures and their variants. The experiments conducted illuminate the advantages and limitations of these architectures in paralinguistic speech recognition and emotion recognition in particular...
March 21, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28385623/how-can-a-recurrent-neurodynamic-predictive-coding-model-cope-with-fluctuation-in-temporal-patterns-robotic-experiments-on-imitative-interaction
#6
Ahmadreza Ahmadi, Jun Tani
The current paper examines how a recurrent neural network (RNN) model using a dynamic predictive coding scheme can cope with fluctuations in temporal patterns through generalization in learning. The conjecture driving the present inquiry is that an RNN model with multiple timescales (MTRNN) learns by extracting patterns of change from observed temporal patterns, developing an internal dynamic structure such that variance in the initial internal states accounts for modulations in the corresponding observed patterns. We trained an MTRNN with low-dimensional temporal patterns and assessed performance on an imitation task employing these patterns...
March 21, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28364676/a-time-delay-neural-network-for-solving-time-dependent-shortest-path-problem
#7
Wei Huang, Chunwang Yan, Jinsong Wang, Wei Wang
This paper concerns the time-dependent shortest path problem, for which classical shortest path approaches such as Dijkstra's algorithm and the pulse-coupled neural network (PCNN) have difficulty producing a globally optimal solution. In this study, we propose a time-delay neural network (TDNN) framework that yields the globally optimal solution to the time-dependent shortest path problem. The underlying idea of the TDNN comes from the following mechanism: the shortest path is determined by the earliest auto-wave (from the start node) that arrives at the destination node...
March 21, 2017: Neural Networks: the Official Journal of the International Neural Network Society
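For reference, a classical (non-neural) baseline for the time-dependent shortest path problem is a Dijkstra-style earliest-arrival search with time-dependent edge costs, which is exact when travel times satisfy the FIFO (non-overtaking) property. This is not the authors' TDNN; the edge representation below is illustrative.

import heapq

def td_dijkstra(graph, source, target, t0=0.0):
    # graph[u] is a list of (v, travel_time_fn), where travel_time_fn(t) gives the
    # travel time on edge (u, v) when departing u at time t.
    best = {source: t0}
    heap = [(t0, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            return t                          # earliest arrival time at the target
        if t > best.get(u, float("inf")):
            continue                          # stale heap entry, skip
        for v, travel_time in graph.get(u, []):
            arrival = t + travel_time(t)
            if arrival < best.get(v, float("inf")):
                best[v] = arrival
                heapq.heappush(heap, (arrival, v))
    return float("inf")

# toy usage: edge A->B becomes slower after time 5
g = {"A": [("B", lambda t: 2.0 if t < 5 else 6.0)], "B": [("C", lambda t: 1.0)]}
print(td_dijkstra(g, "A", "C"))  # 3.0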
https://www.readbyqxmd.com/read/28388471/extending-the-stabilized-supralinear-network-model-for-binocular-image-processing
#8
Ben Selby, Bryan Tripp
The visual cortex is both extensive and intricate. Computational models are needed to clarify the relationships between its local mechanisms and high-level functions. The Stabilized Supralinear Network (SSN) model was recently shown to account for many receptive field phenomena in V1, and also to predict subtle receptive field properties that were subsequently confirmed in vivo. In this study, we performed a preliminary exploration of whether the SSN is suitable for incorporation into large, functional models of the visual cortex, considering both its extensibility and computational tractability...
March 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
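For readers unfamiliar with the model, the stabilized supralinear network is a rate network whose units pass their recurrent plus external input through a supralinear power-law nonlinearity; a commonly used form (not necessarily the exact variant studied here) is

\tau_i \frac{dr_i}{dt} = -r_i + k \left\lfloor \sum_j W_{ij} r_j + h_i \right\rfloor_{+}^{\,n},

where r_i is the firing rate of unit i, W the recurrent weights, h_i the external input, and k > 0, n > 1 set the power law.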
https://www.readbyqxmd.com/read/28364677/forecasting-stochastic-neural-network-based-on-financial-empirical-mode-decomposition
#9
Jie Wang, Jun Wang
In an attempt to improve the forecasting accuracy of stock price fluctuations, a new one-step-ahead model is developed in this paper which combines empirical mode decomposition (EMD) with a stochastic time strength neural network (STNN). EMD is a processing technique introduced to extract all the oscillatory modes embedded in a series, and the STNN model is established to account for the weight of the occurrence time of the historical data. Linear regression is used to assess the predictive ability of the proposed model, and the effectiveness of EMD-STNN is revealed clearly by comparing its predictions with those of traditional models...
March 18, 2017: Neural Networks: the Official Journal of the International Neural Network Society
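A minimal sketch of the decompose-then-forecast idea described above: split the series into intrinsic mode functions with EMD, forecast each component, and sum the component forecasts. The PyEMD package and the per-component linear autoregression used below are assumptions for illustration, not the authors' STNN.

import numpy as np
from PyEMD import EMD  # assumed dependency: pip install EMD-signal

def emd_forecast(series, lags=5):
    # One-step-ahead forecast by summing per-IMF autoregressive predictions.
    imfs = EMD().emd(np.asarray(series, dtype=float))  # intrinsic mode functions (+ residue)
    forecast = 0.0
    for comp in imfs:
        # least-squares AR(lags) fit on one component
        X = np.array([comp[i:i + lags] for i in range(len(comp) - lags)])
        y = comp[lags:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        forecast += comp[-lags:] @ coef                # next value of this component
    return forecast

# toy usage on a noisy sine wave
t = np.linspace(0, 20, 400)
s = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
print(emd_forecast(s))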
https://www.readbyqxmd.com/read/28388473/collective-mutual-information-maximization-to-unify-passive-and-positive-approaches-for-improving-interpretation-and-generalization
#10
Ryotaro Kamimura
The present paper aims to propose a simple method to realize mutual information maximization for better interpretation and generalization. To train neural networks and obtain better performance, neurons should impartially consider as many input patterns as possible. Simultaneously, and especially for ease of interpretation, they should represent characteristics specific to certain input patterns as faithfully as possible. This contradiction can be solved by introducing mutual information between neurons and input patterns...
March 16, 2017: Neural Networks: the Official Journal of the International Neural Network Society
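The quantity being maximized can be written generically as the mutual information between hidden neurons j and input patterns s, computed from normalized firing probabilities (the paper's exact estimator may differ):

I = \sum_{s}\sum_{j} p(s)\, p(j \mid s) \log \frac{p(j \mid s)}{p(j)}, \qquad p(j) = \sum_{s} p(s)\, p(j \mid s).

Maximizing I pushes each neuron to respond specifically to some patterns while the population as a whole covers all patterns impartially, which is the contradiction-resolving property the abstract mentions.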
https://www.readbyqxmd.com/read/28365399/synchronised-firing-patterns-in-a-random-network-of-adaptive-exponential-integrate-and-fire-neuron-model
#11
F S Borges, P R Protachevicz, E L Lameu, R C Bonetti, K C Iarosz, I L Caldas, M S Baptista, A M Batista
We have studied neuronal synchronisation in a random network of adaptive exponential integrate-and-fire neurons. We study how spiking or bursting synchronous behaviour appears as a function of the coupling strength and the probability of connections, by constructing parameter spaces that identify these synchronous behaviours from measurements of the inter-spike interval and the calculation of the order parameter. Moreover, we verify the robustness of synchronisation by applying an external perturbation to each neuron...
March 16, 2017: Neural Networks: the Official Journal of the International Neural Network Society
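For reference, the adaptive exponential integrate-and-fire (AdEx) neuron underlying such network studies obeys the standard membrane and adaptation equations (the network coupling and reset details of this particular paper are omitted):

C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) - w + I, \qquad \tau_w \frac{dw}{dt} = a (V - E_L) - w,

with V reset to V_r and w incremented by b whenever V reaches its peak; the adaptation parameters a and b determine whether the neuron fires tonic spikes or bursts.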
https://www.readbyqxmd.com/read/28318904/dual-memory-neural-networks-for-modeling-cognitive-activities-of-humans-via-wearable-sensors
#12
Sang-Woo Lee, Chung-Yeon Lee, Dong-Hyun Kwak, Jung-Woo Ha, Jeonghee Kim, Byoung-Tak Zhang
Wearable devices, such as smart glasses and watches, allow for continuous recording of everyday life in the real world over an extended period of time, or even a lifetime. This possibility helps us better understand the cognitive behavior of humans in real life as well as build human-aware intelligent agents for practical purposes. However, modeling human cognitive activity from wearable-sensor data streams is challenging because learning new information often results in the loss of previously acquired information, causing a problem known as catastrophic forgetting...
February 20, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28342724/a-hypergraph-and-arithmetic-residue-based-probabilistic-neural-network-for-classification-in-intrusion-detection-systems
#13
M R Gauthama Raman, Nivethitha Somu, Kannan Kirthivasan, V S Shankar Sriram
Over the past few decades, the design of an intelligent Intrusion Detection System (IDS) has remained an open challenge to the research community. Continuous efforts by researchers have resulted in the development of several learning models based on Artificial Neural Networks (ANN) to improve the performance of IDSs. However, there exists a tradeoff between the stability of the ANN architecture and the detection rate for less frequent attacks. This paper presents a novel approach based on the Helly property of hypergraphs and an Arithmetic Residue-based Probabilistic Neural Network (HG AR-PNN) to address the classification problem in IDS...
February 17, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28433431/application-of-structured-support-vector-machine-backpropagation-to-a-convolutional-neural-network-for-human-pose-estimation
#14
Peerajak Witoonchart, Prabhas Chongstitvatana
In this study, for the first time, we show how to formulate a structured support vector machine (SSVM) as two layers in a convolutional neural network, where the top layer is a loss augmented inference layer and the bottom layer is the normal convolutional layer. We show that a deformable part model can be learned with the proposed structured SVM neural network by backpropagating the error of the deformable part model to the convolutional neural network. The forward propagation calculates the loss augmented inference and the backpropagation calculates the gradient from the loss augmented inference layer to the convolutional layer...
February 16, 2017: Neural Networks: the Official Journal of the International Neural Network Society
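The loss-augmented inference layer mentioned above corresponds to the structured hinge loss of an SSVM; in generic notation (Δ the task loss, s(x, y) the score of structure y, y* the ground truth),

\ell(x, y^{*}) = \max_{y}\big[\Delta(y, y^{*}) + s(x, y)\big] - s(x, y^{*}),

so the forward pass solves the inner maximization (loss-augmented inference) and the backward pass propagates gradients through the scores of the maximizing structure and of y*.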
https://www.readbyqxmd.com/read/28254237/prediction-of-advertisement-preference-by-fusing-eeg-response-and-sentiment-analysis
#15
Himaanshu Gauba, Pradeep Kumar, Partha Pratim Roy, Priyanka Singh, Debi Prosad Dogra, Balasubramanian Raman
This paper presents a novel approach to predicting the rating of video advertisements based on a multimodal framework combining physiological analysis of the user with the global sentiment rating available on the internet. We fuse the user's electroencephalogram (EEG) waves with the corresponding global textual comments on the video to understand the user's preference more precisely. In our framework, users were asked to watch a video advertisement while EEG signals were recorded simultaneously. Valence scores were obtained using self-report for each video...
February 16, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28318903/understanding-human-intention-by-connecting-perception-and-action-learning-in-artificial-agents
#16
Sangwook Kim, Zhibin Yu, Minho Lee
To develop an advanced human-robot interaction system, it is important to first understand how human beings learn to perceive, think, and act in an ever-changing world. In this paper, we propose an intention understanding system that uses an Object Augmented-Supervised Multiple Timescale Recurrent Neural Network (OA-SMTRNN) and demonstrate the effects of perception-action connected learning in an artificial agent, which is inspired by psychological and neurological phenomena in humans. We believe that action and perception are not isolated processes in human mental development, and argue that these psychological and neurological interactions can be replicated in a human-machine scenario...
February 11, 2017: Neural Networks: the Official Journal of the International Neural Network Society
https://www.readbyqxmd.com/read/28236678/a-perturbative-approach-for-enhancing-the-performance-of-time-series-forecasting
#17
Paulo S G de Mattos Neto, Tiago A E Ferreira, Aranildo R Lima, Germano C Vasconcelos, George D C Cavalcanti
This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast of a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is fitted to the residual series...
February 10, 2017: Neural Networks: the Official Journal of the International Neural Network Society
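A minimal sketch of the residual-correction loop described above: fit a base forecaster, model its residual series, and add the residual forecast back. The stopping rule and the simple least-squares autoregressions below are illustrative stand-ins for the paper's components, not the authors' models.

import numpy as np

def ar_fit_predict(series, lags=3):
    # Least-squares AR(lags): returns (in-sample fitted values, one-step forecast).
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = series[lags:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = np.concatenate([series[:lags], X @ coef])  # pad the first `lags` points
    return fitted, series[-lags:] @ coef

def perturbative_forecast(series, rounds=3, lags=3):
    # Iteratively correct the forecast with models of the remaining residual.
    series = np.asarray(series, dtype=float)
    fitted, forecast = ar_fit_predict(series, lags)
    residual = series - fitted
    for _ in range(rounds):                  # the paper stops once the residual is white noise
        if np.allclose(residual, 0.0):
            break
        res_fitted, res_forecast = ar_fit_predict(residual, lags)
        forecast += res_forecast             # add the residual correction
        residual = residual - res_fitted
    return forecast

t = np.arange(200)
print(perturbative_forecast(np.sin(0.2 * t) + 0.05 * t))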
https://www.readbyqxmd.com/read/28222299/adaptive-low-rank-subspace-learning-with-online-optimization-for-robust-visual-tracking
#18
Risheng Liu, Di Wang, Yuzhuo Han, Xin Fan, Zhongxuan Luo
In recent years, sparse and low-rank models have been widely used to formulate the appearance subspace for visual tracking. However, most existing methods only consider the sparsity or low-rankness of the coefficients, which is not sufficient for appearance subspace learning on complex video sequences. Moreover, as both the low-rank and the column-sparse measures are tightly coupled to all the samples in the sequences, it is challenging to incrementally solve optimization problems with both the nuclear norm and the column-sparse norm on sequentially obtained video data...
February 10, 2017: Neural Networks: the Official Journal of the International Neural Network Society
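A generic (batch, non-incremental) instance of the coupled nuclear-norm/column-sparse objective referred to above, not necessarily the paper's exact model, is

\min_{Z, E}\; \|Z\|_{*} + \lambda \|E\|_{2,1} \quad \text{s.t.} \quad X = DZ + E,

where X holds the observed samples, D a dictionary of templates, \|Z\|_{*} is the nuclear norm promoting a low-rank representation, and \|E\|_{2,1} sums the column norms of E so that whole outlier columns are absorbed by the error term.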
https://www.readbyqxmd.com/read/28214692/multi-view-clustering-via-multi-manifold-regularized-non-negative-matrix-factorization
#19
Linlin Zong, Xianchao Zhang, Long Zhao, Hong Yu, Qianli Zhao
Non-negative matrix factorization based multi-view clustering algorithms have shown their competitiveness among multi-view clustering algorithms. However, non-negative matrix factorization fails to preserve the locally geometrical structure of the data space. In this paper, we propose a multi-manifold regularized non-negative matrix factorization framework (MMNMF) which preserves the locally geometrical structure of the manifolds for multi-view clustering. MMNMF incorporates a consensus manifold and a consensus coefficient matrix with multi-manifold regularization to preserve the locally geometrical structure of the multi-view data space...
February 8, 2017: Neural Networks: the Official Journal of the International Neural Network Society
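Single-view manifold-regularized NMF, which MMNMF generalizes to multiple views and manifolds, adds a graph-Laplacian smoothness term to the usual reconstruction objective (the notation below is illustrative):

\min_{U \ge 0,\, V \ge 0}\; \|X - U V^{\top}\|_F^2 + \lambda\, \mathrm{Tr}(V^{\top} L V),

where L is the Laplacian of a nearest-neighbor graph on the data, so the coefficient vectors of neighboring points are encouraged to stay close and the locally geometrical structure is preserved.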
https://www.readbyqxmd.com/read/28254393/global-dissipativity-of-memristor-based-neutral-type-inertial-neural-networks
#20
Zhengwen Tu, Jinde Cao, Ahmed Alsaedi, Fuad Alsaadi
The problem of global dissipativity for memristor-based inertial networks with time-varying delay of neutral type is investigated in this paper. Based on a proper variable substitution, the inertial system is transformed into a conventional first-order system. Some sufficient criteria are established to ascertain the global dissipativity of the aforementioned inertial neural networks by employing analytical techniques and the Lyapunov method. Meanwhile, the globally exponentially attractive sets and positive invariant sets are also presented here...
February 3, 2017: Neural Networks: the Official Journal of the International Neural Network Society
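The "proper variable substitution" mentioned above is the standard reduction of a second-order (inertial) system to first-order form; schematically, omitting the memristive and neutral-type terms,

\frac{d^{2} x}{dt^{2}} = -a \frac{dx}{dt} - b x + f(\cdot) \quad \xrightarrow{\; y = \frac{dx}{dt} + \xi x \;} \quad \frac{dx}{dt} = y - \xi x, \qquad \frac{dy}{dt} = -(a - \xi)\, y - \big(b - \xi (a - \xi)\big)\, x + f(\cdot),

after which conventional Lyapunov arguments for first-order systems apply.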