Conference Paper: Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning

Title: Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning
Authors: Qu, Liangqiong; Zhou, Yuyin; Liang, Paul Pu; Xia, Yingda; Wang, Feifei; Adeli, Ehsan; Fei-Fei, Li; Rubin, Daniel
Keywords: Privacy and federated learning
Issue Date: 2022
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2022, v. 2022-June, p. 10051-10061
Abstract: Federated learning is an emerging research paradigm enabling collaborative training of machine learning models among different organizations while keeping data private at each institution. Despite recent progress, there remain fundamental challenges such as the lack of convergence and the potential for catastrophic forgetting across real-world heterogeneous devices. In this paper, we demonstrate that self-attention-based architectures (e.g., Transformers) are more robust to distribution shifts and hence improve federated learning over heterogeneous data. Concretely, we conduct the first rigorous empirical investigation of different neural architectures across a range of federated algorithms, real-world benchmarks, and heterogeneous data splits. Our experiments show that simply replacing convolutional networks with Transformers can greatly reduce catastrophic forgetting of previous devices, accelerate convergence, and reach a better global model, especially when dealing with heterogeneous data. We release our code and pretrained models to encourage future exploration in robust architectures as an alternative to current research efforts on the optimization front.
Persistent Identifier: http://hdl.handle.net/10722/325581
ISSN: 1063-6919
2023 SCImago Journal Rankings: 10.331
ISI Accession Number ID: WOS:000870759103013

 

DC Field                    Value
dc.contributor.author       Qu, Liangqiong
dc.contributor.author       Zhou, Yuyin
dc.contributor.author       Liang, Paul Pu
dc.contributor.author       Xia, Yingda
dc.contributor.author       Wang, Feifei
dc.contributor.author       Adeli, Ehsan
dc.contributor.author       Fei-Fei, Li
dc.contributor.author       Rubin, Daniel
dc.date.accessioned         2023-02-27T07:34:32Z
dc.date.available           2023-02-27T07:34:32Z
dc.date.issued              2022
dc.identifier.citation      Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2022, v. 2022-June, p. 10051-10061
dc.identifier.issn          1063-6919
dc.identifier.uri           http://hdl.handle.net/10722/325581
dc.description.abstract     Federated learning is an emerging research paradigm enabling collaborative training of machine learning models among different organizations while keeping data private at each institution. Despite recent progress, there remain fundamental challenges such as the lack of convergence and the potential for catastrophic forgetting across real-world heterogeneous devices. In this paper, we demonstrate that self-attention-based architectures (e.g., Transformers) are more robust to distribution shifts and hence improve federated learning over heterogeneous data. Concretely, we conduct the first rigorous empirical investigation of different neural architectures across a range of federated algorithms, real-world benchmarks, and heterogeneous data splits. Our experiments show that simply replacing convolutional networks with Transformers can greatly reduce catastrophic forgetting of previous devices, accelerate convergence, and reach a better global model, especially when dealing with heterogeneous data. We release our code and pretrained models to encourage future exploration in robust architectures as an alternative to current research efforts on the optimization front.
dc.language                 eng
dc.relation.ispartof        Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.subject                  Privacy and federated learning
dc.title                    Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning
dc.type                     Conference_Paper
dc.description.nature       link_to_subscribed_fulltext
dc.identifier.doi           10.1109/CVPR52688.2022.00982
dc.identifier.scopus        eid_2-s2.0-85138843349
dc.identifier.volume        2022-June
dc.identifier.spage         10051
dc.identifier.epage         10061
dc.identifier.isi           WOS:000870759103013
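
The abstract describes a concrete recipe: keep the standard federated optimization loop (e.g., FedAvg) and swap the convolutional client model for a self-attention one. Below is a rough, hypothetical PyTorch sketch of that setup. This is not the authors' released code; the toy Transformer, client loaders, and hyperparameters are assumptions for illustration only.

# Minimal FedAvg round with a small self-attention classifier (illustrative
# sketch only; the authors' actual architectures and training setup differ).
import copy
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    """Toy self-attention classifier: patch embedding + Transformer encoder."""
    def __init__(self, image_size=32, patch=4, dim=64, depth=2, heads=4, classes=10):
        super().__init__()
        n_patches = (image_size // patch) ** 2
        self.embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, classes)

    def forward(self, x):
        # (B, 3, H, W) -> (B, n_patches, dim) token sequence
        z = self.embed(x).flatten(2).transpose(1, 2) + self.pos
        # Mean-pool tokens, then classify
        return self.head(self.encoder(z).mean(dim=1))

def local_update(global_model, loader, epochs=1, lr=1e-3):
    """One client's local training pass; returns its weights and sample count.
    `loader` is assumed to be a re-iterable DataLoader of (image, label) batches."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    n_samples = sum(len(y) for _, y in loader)
    return model.state_dict(), n_samples

def fedavg_round(global_model, client_loaders):
    """One communication round: average client weights, weighted by data size."""
    results = [local_update(global_model, dl) for dl in client_loaders]
    total = sum(n for _, n in results)
    avg = {k: sum(sd[k].float() * (n / total) for sd, n in results)
           for k in results[0][0]}
    global_model.load_state_dict(avg)
    return global_model

# Hypothetical usage, with one DataLoader per client holding its private split:
#   global_model = TinyViT()
#   for _ in range(num_rounds):
#       global_model = fedavg_round(global_model, client_loaders)

The aggregation rule is unchanged from standard FedAvg; per the abstract, the claimed gains on heterogeneous splits come from the client model being self-attention-based rather than convolutional.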
