Conference Paper: Learning deep architectures via generalized whitened neural networks

Title: Learning deep architectures via generalized whitened neural networks
Authors: Luo, Ping
Issue Date: 2017
Publisher: International Machine Learning Society. The conference proceedings' web site is located at http://proceedings.mlr.press/v70/
Citation: 34th International Conference on Machine Learning, ICML 2017, 2017, v. 5, p. 3500-3508
Abstract: © 2017 by the author(s). The Whitened Neural Network (WNN) is a recent deep architecture that improves the convergence and generalization of canonical neural networks by whitening their internal hidden representations. However, the whitening transformation increases computation time. Unlike WNN, which reduces runtime by performing whitening only once every thousand iterations and thereby degrades convergence due to ill conditioning, we present the generalized WNN (GWNN), which has three appealing properties. First, GWNN learns compact representations to reduce computation. Second, it enables the whitening transformation to be performed at short intervals, preserving good conditioning. Third, we propose a data-independent estimation of the covariance matrix to further improve computational efficiency. Extensive experiments on various datasets demonstrate the benefits of GWNN.
Persistent Identifier: http://hdl.handle.net/10722/273626
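
Note: to make the whitening idea described in the abstract concrete, the following is a minimal, illustrative Python/NumPy sketch of periodically re-estimating a ZCA whitening transform for a hidden layer's activations. It is not the authors' implementation; names such as zca_whitening_matrix and whitening_interval are hypothetical, and the activations below are random stand-ins.

    import numpy as np

    def zca_whitening_matrix(x, eps=1e-5):
        # Estimate a ZCA whitening matrix from a batch of activations x with shape (n, d).
        x_centered = x - x.mean(axis=0, keepdims=True)
        cov = x_centered.T @ x_centered / x.shape[0]           # d x d sample covariance
        eigvals, eigvecs = np.linalg.eigh(cov)                  # symmetric eigendecomposition
        # W = U diag(1/sqrt(lambda + eps)) U^T maps activations to roughly identity covariance.
        return eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T

    rng = np.random.default_rng(0)
    d, n, steps, whitening_interval = 32, 256, 1000, 50         # hypothetical sizes and schedule
    W_white = np.eye(d)                                         # identity until the first estimate

    for step in range(steps):
        h = rng.normal(size=(n, d)) @ rng.normal(size=(d, d))   # stand-in for hidden activations
        if step % whitening_interval == 0:                       # refresh the transform periodically
            W_white = zca_whitening_matrix(h)
        h_white = (h - h.mean(axis=0, keepdims=True)) @ W_white.T
        # h_white would feed the next layer; training updates are omitted in this sketch.

In the paper's setting, a data-independent covariance estimate would stand in for the per-batch estimate above, so the transform can be refreshed cheaply and at short intervals, which is the conditioning benefit the abstract refers to.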

 

DC Field: Value
dc.contributor.author: Luo, Ping
dc.date.accessioned: 2019-08-12T09:56:11Z
dc.date.available: 2019-08-12T09:56:11Z
dc.date.issued: 2017
dc.identifier.citation: 34th International Conference on Machine Learning, ICML 2017, 2017, v. 5, p. 3500-3508
dc.identifier.uri: http://hdl.handle.net/10722/273626
dc.description.abstract: © 2017 by the author(s). The Whitened Neural Network (WNN) is a recent deep architecture that improves the convergence and generalization of canonical neural networks by whitening their internal hidden representations. However, the whitening transformation increases computation time. Unlike WNN, which reduces runtime by performing whitening only once every thousand iterations and thereby degrades convergence due to ill conditioning, we present the generalized WNN (GWNN), which has three appealing properties. First, GWNN learns compact representations to reduce computation. Second, it enables the whitening transformation to be performed at short intervals, preserving good conditioning. Third, we propose a data-independent estimation of the covariance matrix to further improve computational efficiency. Extensive experiments on various datasets demonstrate the benefits of GWNN.
dc.language: eng
dc.publisher: International Machine Learning Society. The conference proceedings' web site is located at http://proceedings.mlr.press/v70/
dc.relation.ispartof: 34th International Conference on Machine Learning, ICML 2017
dc.title: Learning deep architectures via generalized whitened neural networks
dc.type: Conference_Paper
dc.description.nature: link_to_OA_fulltext
dc.identifier.scopus: eid_2-s2.0-85048479332
dc.identifier.volume: 5
dc.identifier.spage: 3500
dc.identifier.epage: 3508
