A recurrent multiscale architecture for long-term memory prediction task
  In the past few years, researchers have extensively studied the application of recurrent neural networks (RNNs) to tasks that require detecting long-term dependencies. This paper proposes an original architecture, termed the Recurrent Multiscale Network (RMN), to deal with such problems. Its key property is that it retains conventional RNNs' ability to store information while reducing their typical drawback under gradient-descent training, namely the vanishing gradient effect. The RMN achieves this by preprocessing the original signal with a suitable DSP tool (a discrete wavelet transform) that separates information at different temporal scales, handling each scale with an autonomous recurrent architecture, and combining the results in a nonlinear reconstruction section. Applied to time series prediction tasks involving long-range dependencies, the network has shown markedly improved generalization performance over conventional RNNs.
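
The abstract describes a three-stage pipeline: a wavelet decomposition that separates the signal into temporal scales, one autonomous recurrent network per scale, and a nonlinear reconstruction section that merges the per-scale outputs. The Python sketch below illustrates that pipeline under stated assumptions; the wavelet family ('db4'), decomposition depth, RNN cell type and sizes, and the MLP used for reconstruction are illustrative choices, not the paper's exact design.

    # Sketch of the RMN idea: wavelet decomposition -> per-scale RNNs ->
    # nonlinear reconstruction. All hyperparameters here are assumptions.
    import numpy as np
    import pywt
    import torch
    import torch.nn as nn

    class RecurrentMultiscaleNet(nn.Module):
        def __init__(self, n_scales, hidden=16):
            super().__init__()
            # One autonomous recurrent module per temporal scale
            # (assumption: plain RNN cells; the paper may differ).
            self.rnns = nn.ModuleList(
                nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
                for _ in range(n_scales)
            )
            # Nonlinear reconstruction section combining per-scale summaries.
            self.reconstruct = nn.Sequential(
                nn.Linear(n_scales * hidden, hidden),
                nn.Tanh(),
                nn.Linear(hidden, 1),
            )

        def forward(self, scales):
            # scales[i]: (batch, time_i, 1) coefficient sequence for scale i.
            summaries = []
            for rnn, seq in zip(self.rnns, scales):
                _, h = rnn(seq)               # h: (1, batch, hidden) final state
                summaries.append(h.squeeze(0))
            return self.reconstruct(torch.cat(summaries, dim=-1))

    # Wavelet preprocessing: split the series into per-scale coefficients.
    signal = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.1 * np.random.randn(512)
    coeffs = pywt.wavedec(signal, "db4", level=3)   # [cA3, cD3, cD2, cD1]
    scales = [torch.tensor(c, dtype=torch.float32).view(1, -1, 1) for c in coeffs]

    model = RecurrentMultiscaleNet(n_scales=len(scales))
    prediction = model(scales)                      # one-step-ahead estimate
    print(prediction.shape)                         # torch.Size([1, 1])

Because each recurrent module sees a shorter, smoother coefficient sequence than the raw signal, gradients at the coarse scales traverse far fewer time steps, which is the intuition behind the claimed mitigation of the vanishing gradient effect.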

Citation

Squartini, S., Hussain, A., & Piazza, F. (2003). A recurrent multiscale architecture for long-term memory prediction task. In Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03) (pp. 789-792). https://doi.org/10.1109/ICASSP.2003.1202485

Authors

S. Squartini, A. Hussain, F. Piazza

Keywords

recurrent neural nets, neural net architecture, time series, prediction theory, discrete wavelet transforms
