Article: On memory-augmented gated recurrent unit network

Title: On memory-augmented gated recurrent unit network
Authors: Yang, Maolin; Li, Muyi; Li, Guodong
Keywords: Long memory effect; Long memory network process; Memory-augmented GRU; Sentiment analysis; Volatility forecasting
Issue Date: 1-Apr-2025
Publisher: Elsevier
Citation: International Journal of Forecasting, 2025, v. 41, n. 2, p. 844-858
Abstract: This paper addresses the challenge of forecasting multivariate long-memory time series. While statistical models such as the autoregressive fractionally integrated moving average (ARFIMA) and hyperbolic generalized autoregressive conditional heteroscedasticity (HYGARCH) models can capture long-memory effects in time series data, they are often limited by dimensionality and parametric specification. Alternatively, recurrent neural networks (RNNs) are popular tools for approximating complex structures in sequential data; however, the absence of a long-memory effect in these networks has been established from a statistical perspective. In this paper, we propose a new network process, the memory-augmented gated recurrent unit (MGRU), which incorporates a fractionally integrated filter into the original GRU structure. We investigate the long-memory effect of the MGRU process and demonstrate its effectiveness at capturing long-range dependence in real applications. Our findings illustrate that the proposed MGRU network outperforms existing models, indicating its potential as a promising tool for long-memory time series forecasting.
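The fractionally integrated filter mentioned in the abstract is, in ARFIMA-style models, the operator (1 - L)^d, whose binomial expansion produces hyperbolically decaying lag weights — the signature of long memory. The sketch below is illustrative only, not the paper's MGRU implementation: the function names, the truncation length, and the standalone filter application are assumptions for exposition.

```python
import numpy as np

def frac_diff_weights(d: float, n: int) -> np.ndarray:
    """Coefficients of the binomial expansion of (1 - L)^d,
    truncated at lag n - 1, via the standard recurrence
    w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def apply_frac_filter(x: np.ndarray, d: float, trunc: int = 100) -> np.ndarray:
    """Apply the truncated filter (1 - L)^d to a 1-D series x.
    The weights decay hyperbolically rather than geometrically,
    so distant past values retain non-negligible influence."""
    w = frac_diff_weights(d, trunc)
    y = np.zeros(len(x))
    for t in range(len(x)):
        k_max = min(t + 1, trunc)
        # y_t = sum_{k=0}^{k_max-1} w_k * x_{t-k}
        y[t] = np.dot(w[:k_max], x[t::-1][:k_max])
    return y
```

With d = 0 the filter reduces to the identity, and for 0 < d < 1 the weight on lag k behaves like k^(-d-1), which is the slow decay that standard GRU gates (geometric forgetting) cannot reproduce.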
Persistent Identifier: http://hdl.handle.net/10722/355991
ISSN: 0169-2070
2023 Impact Factor: 6.9
2023 SCImago Journal Rankings: 2.691
ISI Accession Number ID: WOS:001444629900001

 

Dublin Core metadata (field: value)

dc.contributor.author: Yang, Maolin
dc.contributor.author: Li, Muyi
dc.contributor.author: Li, Guodong
dc.date.accessioned: 2025-05-20T00:35:11Z
dc.date.available: 2025-05-20T00:35:11Z
dc.date.issued: 2025-04-01
dc.identifier.citation: International Journal of Forecasting, 2025, v. 41, n. 2, p. 844-858
dc.identifier.issn: 0169-2070
dc.identifier.uri: http://hdl.handle.net/10722/355991
dc.description.abstract: This paper addresses the challenge of forecasting multivariate long-memory time series. While statistical models such as the autoregressive fractionally integrated moving average (ARFIMA) and hyperbolic generalized autoregressive conditional heteroscedasticity (HYGARCH) models can capture long-memory effects in time series data, they are often limited by dimensionality and parametric specification. Alternatively, recurrent neural networks (RNNs) are popular tools for approximating complex structures in sequential data; however, the absence of a long-memory effect in these networks has been established from a statistical perspective. In this paper, we propose a new network process, the memory-augmented gated recurrent unit (MGRU), which incorporates a fractionally integrated filter into the original GRU structure. We investigate the long-memory effect of the MGRU process and demonstrate its effectiveness at capturing long-range dependence in real applications. Our findings illustrate that the proposed MGRU network outperforms existing models, indicating its potential as a promising tool for long-memory time series forecasting.
dc.language: eng
dc.publisher: Elsevier
dc.relation.ispartof: International Journal of Forecasting
dc.subject: Long memory effect
dc.subject: Long memory network process
dc.subject: Memory-augmented GRU
dc.subject: Sentiment analysis
dc.subject: Volatility forecasting
dc.title: On memory-augmented gated recurrent unit network
dc.type: Article
dc.identifier.doi: 10.1016/j.ijforecast.2024.07.008
dc.identifier.scopus: eid_2-s2.0-85202854377
dc.identifier.volume: 41
dc.identifier.issue: 2
dc.identifier.spage: 844
dc.identifier.epage: 858
dc.identifier.eissn: 1872-8200
dc.identifier.isi: WOS:001444629900001
dc.identifier.issnl: 0169-2070
