Conference Paper: Learning deep architectures via generalized whitened neural networks
Title | Learning deep architectures via generalized whitened neural networks |
---|---|
Authors | Luo, Ping |
Issue Date | 2017 |
Publisher | International Machine Learning Society. The conference proceedings' web site is located at http://proceedings.mlr.press/v70/ |
Citation | 34th International Conference on Machine Learning, ICML 2017, 2017, v. 5, p. 3500-3508 |
Abstract | © 2017 by the author(s). Whitened Neural Network (WNN) is a recent advanced deep architecture that improves the convergence and generalization of canonical neural networks by whitening their internal hidden representations. However, the whitening transformation increases computation time. Unlike WNN, which reduces runtime by performing whitening only once every thousand iterations at the cost of degraded convergence due to ill-conditioning, we present generalized WNN (GWNN), which has three appealing properties. First, GWNN is able to learn compact representations, reducing computation. Second, it enables the whitening transformation to be performed at short intervals, preserving good conditioning. Third, we propose a data-independent estimation of the covariance matrix to further improve computational efficiency. Extensive experiments on various datasets demonstrate the benefits of GWNN. (An illustrative sketch of the whitening operation follows this table.) |
Persistent Identifier | http://hdl.handle.net/10722/273626 |
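
The abstract's central operation is whitening: linearly transforming a layer's hidden activations so that their covariance is approximately the identity, which improves the conditioning of optimization. The snippet below is a minimal NumPy sketch of batch ZCA whitening on assumed toy data; the function name, the epsilon regularizer, and the synthetic activations are illustrative assumptions, and the code shows only the whitening transformation itself, not GWNN's compact representations or its data-independent covariance estimate.

```python
# Minimal, illustrative sketch of ZCA whitening applied to a batch of
# hidden activations -- the core operation behind whitened neural networks.
# NOT the paper's GWNN algorithm; names and toy data are assumptions.
import numpy as np

def zca_whitening_matrix(h, eps=1e-5):
    """Return W such that (h - mean(h)) @ W.T has ~identity covariance.

    h: activations of shape (batch, features).
    """
    h_centered = h - h.mean(axis=0, keepdims=True)
    cov = h_centered.T @ h_centered / h.shape[0]   # empirical feature covariance
    eigvals, eigvecs = np.linalg.eigh(cov)         # symmetric eigendecomposition
    # ZCA: W = U diag(1 / sqrt(lambda + eps)) U^T decorrelates and rescales.
    return eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T

# Toy demonstration on correlated synthetic activations.
rng = np.random.default_rng(0)
h = rng.normal(size=(256, 64)) @ rng.normal(size=(64, 64))
W = zca_whitening_matrix(h)
h_white = (h - h.mean(axis=0, keepdims=True)) @ W.T
cov_after = h_white.T @ h_white / h.shape[0]
print(np.allclose(cov_after, np.eye(64), atol=1e-2))  # True: ~whitened
```

The eigendecomposition above is the expensive step, which is why WNN recomputes the whitening matrix only periodically; per the abstract, GWNN's contribution is making that recomputation cheap enough to run at short intervals without sacrificing conditioning.
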
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Luo, Ping | - |
dc.date.accessioned | 2019-08-12T09:56:11Z | - |
dc.date.available | 2019-08-12T09:56:11Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | 34th International Conference on Machine Learning, ICML 2017, 2017, v. 5, p. 3500-3508 | - |
dc.identifier.uri | http://hdl.handle.net/10722/273626 | - |
dc.description.abstract | © 2017 by the author(s). Whitened Neural Network (WNN) is a recent advanced deep architecture that improves the convergence and generalization of canonical neural networks by whitening their internal hidden representations. However, the whitening transformation increases computation time. Unlike WNN, which reduces runtime by performing whitening only once every thousand iterations at the cost of degraded convergence due to ill-conditioning, we present generalized WNN (GWNN), which has three appealing properties. First, GWNN is able to learn compact representations, reducing computation. Second, it enables the whitening transformation to be performed at short intervals, preserving good conditioning. Third, we propose a data-independent estimation of the covariance matrix to further improve computational efficiency. Extensive experiments on various datasets demonstrate the benefits of GWNN. | - |
dc.language | eng | - |
dc.publisher | International Machine Learning Society. The conference proceedings' web site is located at http://proceedings.mlr.press/v70/ | - |
dc.relation.ispartof | 34th International Conference on Machine Learning, ICML 2017 | - |
dc.title | Learning deep architectures via generalized whitened neural networks | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85048479332 | - |
dc.identifier.volume | 5 | - |
dc.identifier.spage | 3500 | - |
dc.identifier.epage | 3508 | - |