Article: Trusted Clustering Based Federated Learning in Edge Networks

Title: Trusted Clustering Based Federated Learning in Edge Networks
Authors: Liu, Yi Jing; Zhang, Long; Li, Xiaoqian; Du, Hongyang; Feng, Gang; Qin, Shuang; Wang, Jiacheng
Keywords: Edge networks; federated learning; sharding distributed ledger technique
Issue Date: 1-Jan-2025
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Mobile Computing, 2025, v. 24, n. 10, p. 9726-9742
Abstract: Federated learning (FL) is integral to advancing edge intelligence by enabling collaborative machine learning. In FL-empowered edge networks, computing nodes first train local models and then send them to one or multiple aggregation nodes for global model aggregation. However, the trustworthiness of both local and global models in conventional FL frameworks is compromised due to inadequate model security and transparency. The distributed ledger technique (DLT) can address this issue by leveraging multi-node trust capabilities to support distributed consensus. However, the model training and consensus performance of DLT may significantly degrade due to the instability and resource constraints of edge networks. The sharding technique provides an effective remedy by dividing the ledger into smaller, more manageable shards. In this paper, to improve model training and consensus performance, we propose a trusted FL framework that incorporates sharding DLT into FL. We construct a theoretical model to investigate the relationship between model training performance, consensus efficiency, and the storage, computing, and communication capacities of edge nodes. Based on this theoretical model, we propose a trusted clustering scheme to aggregate local models. Numerical results show that our proposed scheme significantly improves network throughput for transmitting models while guaranteeing model learning performance compared with classical baselines.
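The abstract describes a two-tier aggregation pattern: local models are first aggregated within clusters (shards), and the cluster-level models are then combined into a global model. The following is a minimal illustrative sketch of such clustered federated averaging; the weighting by local dataset size, the cluster assignments, and the function names are assumptions for illustration, not the paper's actual scheme.

```python
# Sketch of two-tier (clustered) federated averaging.
# Assumption: models are flat numpy weight vectors and aggregation
# weights are proportional to local dataset sizes (FedAvg-style).
from typing import Dict, List
import numpy as np

def fedavg(models: List[np.ndarray], sizes: List[int]) -> np.ndarray:
    """Weighted average of local models by local dataset size."""
    total = sum(sizes)
    return sum(m * (n / total) for m, n in zip(models, sizes))

def clustered_fedavg(models: List[np.ndarray], sizes: List[int],
                     clusters: Dict[int, List[int]]) -> np.ndarray:
    """Tier 1: FedAvg within each cluster (shard).
    Tier 2: FedAvg over cluster models, weighted by cluster data volume."""
    cluster_models, cluster_sizes = [], []
    for members in clusters.values():
        cluster_models.append(fedavg([models[i] for i in members],
                                     [sizes[i] for i in members]))
        cluster_sizes.append(sum(sizes[i] for i in members))
    return fedavg(cluster_models, cluster_sizes)

# Hypothetical example: 4 edge nodes partitioned into 2 clusters.
models = [np.array([1.0, 2.0]), np.array([3.0, 4.0]),
          np.array([5.0, 6.0]), np.array([7.0, 8.0])]
sizes = [10, 10, 20, 20]
clusters = {0: [0, 1], 1: [2, 3]}
global_model = clustered_fedavg(models, sizes, clusters)
```

With size-proportional weights at both tiers, the clustered result coincides with flat FedAvg over all nodes; the paper's contribution lies in *which* nodes are clustered together (trust- and capacity-aware clustering), which this sketch does not model.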
Persistent Identifier: http://hdl.handle.net/10722/362169
ISSN: 1536-1233
2023 Impact Factor: 7.7
2023 SCImago Journal Rankings: 2.755


DC Field: Value
dc.contributor.author: Liu, Yi Jing
dc.contributor.author: Zhang, Long
dc.contributor.author: Li, Xiaoqian
dc.contributor.author: Du, Hongyang
dc.contributor.author: Feng, Gang
dc.contributor.author: Qin, Shuang
dc.contributor.author: Wang, Jiacheng
dc.date.accessioned: 2025-09-19T00:33:27Z
dc.date.available: 2025-09-19T00:33:27Z
dc.date.issued: 2025-01-01
dc.identifier.citation: IEEE Transactions on Mobile Computing, 2025, v. 24, n. 10, p. 9726-9742
dc.identifier.issn: 1536-1233
dc.identifier.uri: http://hdl.handle.net/10722/362169
dc.description.abstract: Federated learning (FL) is integral to advancing edge intelligence by enabling collaborative machine learning. In FL-empowered edge networks, computing nodes first train local models and then send them to one or multiple aggregation nodes for global model aggregation. However, the trustworthiness of both local and global models in conventional FL frameworks is compromised due to inadequate model security and transparency. The distributed ledger technique (DLT) can address this issue by leveraging multi-node trust capabilities to support distributed consensus. However, the model training and consensus performance of DLT may significantly degrade due to the instability and resource constraints of edge networks. The sharding technique provides an effective remedy by dividing the ledger into smaller, more manageable shards. In this paper, to improve model training and consensus performance, we propose a trusted FL framework that incorporates sharding DLT into FL. We construct a theoretical model to investigate the relationship between model training performance, consensus efficiency, and the storage, computing, and communication capacities of edge nodes. Based on this theoretical model, we propose a trusted clustering scheme to aggregate local models. Numerical results show that our proposed scheme significantly improves network throughput for transmitting models while guaranteeing model learning performance compared with classical baselines.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Mobile Computing
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Edge networks
dc.subject: federated learning
dc.subject: sharding distributed ledger technique
dc.title: Trusted Clustering Based Federated Learning in Edge Networks
dc.type: Article
dc.identifier.doi: 10.1109/TMC.2025.3566492
dc.identifier.scopus: eid_2-s2.0-105004324950
dc.identifier.volume: 24
dc.identifier.issue: 10
dc.identifier.spage: 9726
dc.identifier.epage: 9742
dc.identifier.eissn: 1558-0660
dc.identifier.issnl: 1536-1233
