Conference Paper: Incremental Learning via Rate Reduction

Title: Incremental Learning via Rate Reduction
Authors: Wu, Ziyang; Baek, Christina; You, Chong; Ma, Yi
Issue Date: 2021
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2021, p. 1125-1133
Abstract: Current deep learning architectures suffer from catastrophic forgetting, a failure to retain knowledge of previously learned classes when incrementally trained on new classes. The fundamental roadblock faced by deep learning methods is that the models are optimized as “black boxes,” making it difficult to properly adjust the model parameters to preserve knowledge about previously seen data. To overcome the problem of catastrophic forgetting, we propose utilizing an alternative “white box” architecture derived from the principle of rate reduction, where each layer of the network is explicitly computed without back propagation. Under this paradigm, we demonstrate that, given a pretrained network and new data classes, our approach can provably construct a new network that emulates joint training with all past and new classes. Finally, our experiments show that our proposed learning algorithm observes significantly less decay in classification performance, outperforming state-of-the-art methods on MNIST and CIFAR-10 by a large margin and justifying the use of “white box” algorithms for incremental learning even for sufficiently complex image data.
Persistent Identifier: http://hdl.handle.net/10722/327776
ISSN: 1063-6919
2023 SCImago Journal Rankings: 10.331
ISI Accession Number ID: WOS:000739917301032

 

DC Field                 | Value                                                                                                                | Language
dc.contributor.author    | Wu, Ziyang                                                                                                           | -
dc.contributor.author    | Baek, Christina                                                                                                      | -
dc.contributor.author    | You, Chong                                                                                                           | -
dc.contributor.author    | Ma, Yi                                                                                                               | -
dc.date.accessioned      | 2023-05-08T02:26:44Z                                                                                                 | -
dc.date.available        | 2023-05-08T02:26:44Z                                                                                                 | -
dc.date.issued           | 2021                                                                                                                 | -
dc.identifier.citation   | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2021, p. 1125-1133   | -
dc.identifier.issn       | 1063-6919                                                                                                            | -
dc.identifier.uri        | http://hdl.handle.net/10722/327776                                                                                   | -
dc.language              | eng                                                                                                                  | -
dc.relation.ispartof     | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition                       | -
dc.title                 | Incremental Learning via Rate Reduction                                                                              | -
dc.type                  | Conference_Paper                                                                                                     | -
dc.description.nature    | link_to_subscribed_fulltext                                                                                          | -
dc.identifier.doi        | 10.1109/CVPR46437.2021.00118                                                                                         | -
dc.identifier.scopus     | eid_2-s2.0-85112079145                                                                                               | -
dc.identifier.spage      | 1125                                                                                                                 | -
dc.identifier.epage      | 1133                                                                                                                 | -
dc.identifier.isi        | WOS:000739917301032                                                                                                  | -
