Conference Paper: Scalable Federated Unlearning via Isolated and Coded Sharding

Title: Scalable Federated Unlearning via Isolated and Coded Sharding
Authors: Lin, Yijing; Gao, Zhipeng; Du, Hongyang; Niyato, Dusit; Gui, Gui; Cui, Shuguang; Ren, Jinke
Issue Date: 2024
Citation: IJCAI International Joint Conference on Artificial Intelligence, 2024, p. 4551-4559
Abstract: Federated unlearning has emerged as a promising paradigm to erase the client-level data effect without affecting the performance of collaborative learning models. However, the federated unlearning process often introduces extensive storage overhead and consumes substantial computational resources, hindering its practical implementation. To address this issue, this paper proposes a scalable federated unlearning framework based on isolated sharding and coded computing. We first divide distributed clients into multiple isolated shards across stages to reduce the number of clients affected by unlearning. Then, to reduce the storage overhead of the central server, we develop a coded computing mechanism that compresses the model parameters across different shards. In addition, we provide a theoretical analysis of the time efficiency and storage effectiveness of isolated and coded sharding. Finally, extensive experiments on two typical learning tasks, i.e., classification and generation, demonstrate that our proposed framework outperforms three state-of-the-art frameworks in terms of accuracy, retraining time, storage overhead, and F1 scores for resisting membership inference attacks.
Persistent Identifier: http://hdl.handle.net/10722/353217
ISSN: 1045-0823
2020 SCImago Journal Rankings: 0.649
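
The abstract describes two mechanisms: isolated sharding, which confines an unlearning request to retraining the single shard that held the departing client, and coded computing, which has the server store linear combinations of per-shard checkpoints rather than every checkpoint. The toy sketch below illustrates both ideas under stated assumptions; it is not the paper's actual algorithm. The shard sizes, the random coding matrix G, and the stand-in training routine are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_SHARDS, DIM = 12, 3, 4

# (1) Isolated sharding: partition clients into disjoint shards so that
# unlearning one client only touches the shard that held it.
shards = [list(s) for s in np.array_split(np.arange(NUM_CLIENTS), NUM_SHARDS)]

def train_shard(clients):
    # Stand-in for federated training inside one shard: average one fake
    # (random) local model per client. Real local training would go here.
    return np.mean([rng.normal(size=DIM) for _ in clients], axis=0)

shard_models = np.stack([train_shard(s) for s in shards])  # (NUM_SHARDS, DIM)

# (2) Coded storage: the server keeps k < NUM_SHARDS linear combinations
# of the shard checkpoints instead of all of them. Making the first row
# of G all ones keeps the global average directly recoverable.
k = 2
G = np.vstack([np.ones(NUM_SHARDS), rng.normal(size=NUM_SHARDS)])  # (k, NUM_SHARDS)
coded_blocks = G @ shard_models                                    # (k, DIM)
assert np.allclose(coded_blocks[0] / NUM_SHARDS, shard_models.mean(axis=0))

# Unlearning client 5: retrain only its shard, then patch the coded
# blocks with the delta. Because the code is linear, the update is a
# cheap rank-one correction and every other shard is untouched.
target = 5
i = next(j for j, s in enumerate(shards) if target in s)
shards[i].remove(target)
new_model = train_shard(shards[i])
coded_blocks += np.outer(G[:, i], new_model - shard_models[i])
shard_models[i] = new_model
assert np.allclose(coded_blocks[0] / NUM_SHARDS, shard_models.mean(axis=0))
```

Under this toy code, server storage drops from NUM_SHARDS parameter vectors to k while the global aggregate remains exactly recoverable; the coding scheme actually used in the paper, and its recovery and storage guarantees, are specified in the full text.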


DC Field | Value | Language
dc.contributor.author | Lin, Yijing | -
dc.contributor.author | Gao, Zhipeng | -
dc.contributor.author | Du, Hongyang | -
dc.contributor.author | Niyato, Dusit | -
dc.contributor.author | Gui, Gui | -
dc.contributor.author | Cui, Shuguang | -
dc.contributor.author | Ren, Jinke | -
dc.date.accessioned | 2025-01-13T03:02:41Z | -
dc.date.available | 2025-01-13T03:02:41Z | -
dc.date.issued | 2024 | -
dc.identifier.citation | IJCAI International Joint Conference on Artificial Intelligence, 2024, p. 4551-4559 | -
dc.identifier.issn | 1045-0823 | -
dc.identifier.uri | http://hdl.handle.net/10722/353217 | -
dc.description.abstract | Federated unlearning has emerged as a promising paradigm to erase the client-level data effect without affecting the performance of collaborative learning models. However, the federated unlearning process often introduces extensive storage overhead and consumes substantial computational resources, hindering its practical implementation. To address this issue, this paper proposes a scalable federated unlearning framework based on isolated sharding and coded computing. We first divide distributed clients into multiple isolated shards across stages to reduce the number of clients affected by unlearning. Then, to reduce the storage overhead of the central server, we develop a coded computing mechanism that compresses the model parameters across different shards. In addition, we provide a theoretical analysis of the time efficiency and storage effectiveness of isolated and coded sharding. Finally, extensive experiments on two typical learning tasks, i.e., classification and generation, demonstrate that our proposed framework outperforms three state-of-the-art frameworks in terms of accuracy, retraining time, storage overhead, and F1 scores for resisting membership inference attacks. | -
dc.language | eng | -
dc.relation.ispartof | IJCAI International Joint Conference on Artificial Intelligence | -
dc.title | Scalable Federated Unlearning via Isolated and Coded Sharding | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85204304292 | -
dc.identifier.spage | 4551 | -
dc.identifier.epage | 4559 | -
