
Conference Paper: Compact Autoregressive Network

Title: Compact Autoregressive Network
Authors: Wang, D; Huang, F; Zhao, J; Li, G; Tian, G
Issue Date: 2020
Publisher: AAAI Press. The journal's web site is located at https://aaai.org/Library/AAAI/aaai-library.php
Citation: Proceedings of the 34th AAAI Conference on Artificial Intelligence, New York, NY, USA, 7-12 February 2020, v. 34 n. 4, p. 6145-6152
Abstract: Autoregressive networks can achieve promising performance in many sequence modeling tasks with short-range dependence. However, when handling high-dimensional inputs and outputs, the massive number of parameters in the network leads to high computational cost and low learning efficiency. The problem can be alleviated slightly by introducing an additional narrow hidden layer, but the sample size required to achieve a given training error remains substantial. To address this challenge, we rearrange the weight matrices of a linear autoregressive network into a tensor form and then apply Tucker decomposition to represent its low-rank structure. This leads to a novel compact autoregressive network, called the Tucker AutoRegressive (TAR) net. Interestingly, the TAR net can be applied to sequences with long-range dependence, since the dimension along the sequential order is reduced. Theoretical studies show that the TAR net improves learning efficiency and requires far fewer samples for model training. Experiments on synthetic and real-world datasets demonstrate the promising performance of the proposed compact network.
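The parameter saving the abstract describes can be sketched numerically. The following is an illustrative NumPy snippet, not the authors' code: it assumes a linear autoregressive model y_t = Σ_p A_p x_{t-p} whose N x N weight matrices over P lags are stacked into an N x N x P tensor, and it uses hypothetical dimensions (N, P) and Tucker ranks (r1, r2, r3) to compare parameter counts and reconstruct the weight tensor from a Tucker factorization.

```python
import numpy as np

# Hypothetical sizes: N-dimensional series, lag order P, Tucker ranks r1, r2, r3.
N, P = 50, 20
r1, r2, r3 = 5, 5, 3

# Unconstrained weight tensor: one N x N matrix per lag.
full_params = N * N * P

# Tucker format: core G (r1 x r2 x r3) plus factor matrices
# U1 (N x r1), U2 (N x r2), U3 (P x r3).
tucker_params = r1 * r2 * r3 + N * r1 + N * r2 + P * r3

print(full_params, tucker_params)  # 50000 vs. 635

# Reconstruct the full weight tensor from a (random) Tucker factorization.
G = np.random.randn(r1, r2, r3)
U1, U2, U3 = (np.random.randn(n, r) for n, r in [(N, r1), (N, r2), (P, r3)])
W = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)
assert W.shape == (N, N, P)
```

Note how the third factor, U3, compresses the lag dimension P; this is consistent with the abstract's remark that reducing the dimension along the sequential order lets the model accommodate long-range dependence.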
Description: AAAI Technical Track 4: Machine Learning
Persistent Identifier: http://hdl.handle.net/10722/286649
ISSN: 2159-5399

 

DC Field: Value
dc.contributor.author: Wang, D
dc.contributor.author: Huang, F
dc.contributor.author: Zhao, J
dc.contributor.author: Li, G
dc.contributor.author: Tian, G
dc.date.accessioned: 2020-09-04T13:28:32Z
dc.date.available: 2020-09-04T13:28:32Z
dc.date.issued: 2020
dc.identifier.citation: Proceedings of the 34th AAAI Conference on Artificial Intelligence, New York, NY, USA, 7-12 February 2020, v. 34 n. 4, p. 6145-6152
dc.identifier.issn: 2159-5399
dc.identifier.uri: http://hdl.handle.net/10722/286649
dc.description: AAAI Technical Track 4: Machine Learning
dc.description.abstract: Autoregressive networks can achieve promising performance in many sequence modeling tasks with short-range dependence. However, when handling high-dimensional inputs and outputs, the massive number of parameters in the network leads to high computational cost and low learning efficiency. The problem can be alleviated slightly by introducing an additional narrow hidden layer, but the sample size required to achieve a given training error remains substantial. To address this challenge, we rearrange the weight matrices of a linear autoregressive network into a tensor form and then apply Tucker decomposition to represent its low-rank structure. This leads to a novel compact autoregressive network, called the Tucker AutoRegressive (TAR) net. Interestingly, the TAR net can be applied to sequences with long-range dependence, since the dimension along the sequential order is reduced. Theoretical studies show that the TAR net improves learning efficiency and requires far fewer samples for model training. Experiments on synthetic and real-world datasets demonstrate the promising performance of the proposed compact network.
dc.language: eng
dc.publisher: AAAI Press. The journal's web site is located at https://aaai.org/Library/AAAI/aaai-library.php
dc.relation.ispartof: Proceedings of the AAAI Conference on Artificial Intelligence
dc.title: Compact Autoregressive Network
dc.type: Conference_Paper
dc.identifier.email: Li, G: gdli@hku.hk
dc.identifier.authority: Li, G=rp00738
dc.identifier.doi: 10.1609/aaai.v34i04.6079
dc.identifier.hkuros: 313960
dc.identifier.volume: 34
dc.identifier.issue: 4
dc.identifier.spage: 6145
dc.identifier.epage: 6152
dc.publisher.place: United States
