Article: Knowledge-Aided Federated Learning for Energy-Limited Wireless Networks

Title: Knowledge-Aided Federated Learning for Energy-Limited Wireless Networks
Authors: Chen, Zhixiong; Yi, Wenqiang; Liu, Yuanwei; Nallanathan, Arumugam
Keywords: Device scheduling; Lyapunov optimization; personalized federated learning; resource allocation
Issue Date: 2023
Citation: IEEE Transactions on Communications, 2023, v. 71, n. 6, p. 3368-3386
Abstract: The conventional model aggregation-based federated learning (FL) approach requires all local models to share the same architecture, so it fails to support practical scenarios with heterogeneous local models. Moreover, frequent model exchange is costly for resource-limited wireless networks, since modern deep neural networks typically have over a million parameters. To tackle these challenges, we first propose a novel knowledge-aided FL (KFL) framework, which aggregates lightweight high-level data features, namely knowledge, in the per-round learning process. This framework allows devices to design their machine-learning models independently and reduces the communication overhead of training. We then theoretically analyze the convergence bound of the proposed framework under a non-convex loss function setting, revealing that scheduling more data volume in each round improves learning performance; moreover, when the total scheduled data volume over the entire learning course is fixed, larger data volumes should be scheduled in early rounds. Inspired by this, we define a new objective function, the weighted scheduled data sample volume, to transform the implicit global loss minimization problem into a tractable one for device scheduling, bandwidth allocation, and power control. To deal with unknown, time-varying wireless channels, we use the Lyapunov optimization framework to transform the considered problem into a deterministic per-round problem. We then derive the optimal bandwidth allocation and power control solution via convex optimization techniques, and develop an efficient online device scheduling algorithm that achieves an energy-learning trade-off in the learning process.

Experimental results on two typical datasets (MNIST and CIFAR-10) under highly heterogeneous local data distributions show that the proposed KFL reduces communication overhead by over 99% while achieving better learning performance than conventional model aggregation-based algorithms. In addition, the proposed device scheduling algorithm converges faster than the benchmark scheduling schemes.
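The core idea of the abstract, exchanging compact per-class feature summaries ("knowledge") instead of full model weights, can be illustrated with a minimal sketch. This is not the paper's implementation; the names (`local_knowledge`, `aggregate`), the use of per-class feature centroids, and the data-volume weighting are illustrative assumptions.

```python
# Hedged sketch of KFL-style knowledge aggregation: each device keeps its own
# (possibly heterogeneous) model private and uploads only per-class feature
# centroids, which are far smaller than a million-parameter model.
import random

FEATURE_DIM = 4   # dimension of the high-level feature vectors (assumed)
NUM_CLASSES = 2

def local_knowledge(features, labels):
    """Device side: per-class mean feature vector ("knowledge")."""
    sums = [[0.0] * FEATURE_DIM for _ in range(NUM_CLASSES)]
    counts = [0] * NUM_CLASSES
    for f, y in zip(features, labels):
        counts[y] += 1
        for i in range(FEATURE_DIM):
            sums[y][i] += f[i]
    # Guard against classes absent from this device's local data.
    return [[s / counts[c] if counts[c] else 0.0 for s in sums[c]]
            for c in range(NUM_CLASSES)]

def aggregate(knowledge_list, volumes):
    """Server side: data-volume-weighted average of the uploaded knowledge."""
    total = sum(volumes)
    agg = [[0.0] * FEATURE_DIM for _ in range(NUM_CLASSES)]
    for know, v in zip(knowledge_list, volumes):
        for c in range(NUM_CLASSES):
            for i in range(FEATURE_DIM):
                agg[c][i] += (v / total) * know[c][i]
    return agg

random.seed(0)
# Two devices with different data volumes; the random features stand in for
# the output of each device's own local feature extractor.
devices = []
for n in (30, 90):
    feats = [[random.gauss(0, 1) for _ in range(FEATURE_DIM)] for _ in range(n)]
    labels = [random.randrange(NUM_CLASSES) for _ in range(n)]
    devices.append((local_knowledge(feats, labels), n))

global_knowledge = aggregate([k for k, _ in devices], [n for _, n in devices])
```

Per round, each device uploads only NUM_CLASSES × FEATURE_DIM floats, which is the source of the large communication savings the abstract reports.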
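The Lyapunov-based scheduling described in the abstract can likewise be sketched with a virtual energy queue that turns a long-term energy budget into a per-round deterministic rule. The queue update follows the standard Lyapunov form; the greedy drift-plus-penalty rule, the parameter names (`v`, `e_budget`), and the toy numbers are assumptions for illustration, not the paper's exact algorithm.

```python
# Hedged sketch of Lyapunov-style online device scheduling: a virtual queue q
# tracks cumulative energy overspend, and the scheduler trades off learning
# benefit (scheduled data volume, weighted by v) against queue-weighted
# energy cost each round.

def update_queue(q, energy_used, e_budget):
    """Standard virtual-queue update: grows when a round exceeds the budget."""
    return max(q + energy_used - e_budget, 0.0)

def schedule(devices, q, v):
    """Drift-plus-penalty-style rule (illustrative): keep a device if its
    scaled data volume outweighs its queue-weighted energy cost."""
    return [d for d in devices if v * d["data"] >= q * d["energy"]]

devices = [
    {"data": 100, "energy": 2.0},
    {"data": 40,  "energy": 0.5},
    {"data": 10,  "energy": 1.5},
]
q, e_budget, v = 0.0, 1.0, 0.05
history = []
for _ in range(5):
    chosen = schedule(devices, q, v)
    spent = sum(d["energy"] for d in chosen)
    q = update_queue(q, spent, e_budget)
    history.append((len(chosen), round(q, 2)))
# history → [(3, 3.0), (1, 2.5), (2, 4.0), (1, 3.5), (1, 3.0)]
```

A larger `v` favors scheduling more data per round (faster learning) at the cost of a larger queue backlog (more energy spent), which is the energy-learning trade-off the abstract refers to.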
Persistent Identifier: http://hdl.handle.net/10722/349899
ISSN: 0090-6778
2023 Impact Factor: 7.2
2020 SCImago Journal Rankings: 1.468

 

DC Field: Value
dc.contributor.author: Chen, Zhixiong
dc.contributor.author: Yi, Wenqiang
dc.contributor.author: Liu, Yuanwei
dc.contributor.author: Nallanathan, Arumugam
dc.date.accessioned: 2024-10-17T07:01:43Z
dc.date.available: 2024-10-17T07:01:43Z
dc.date.issued: 2023
dc.identifier.citation: IEEE Transactions on Communications, 2023, v. 71, n. 6, p. 3368-3386
dc.identifier.issn: 0090-6778
dc.identifier.uri: http://hdl.handle.net/10722/349899
dc.language: eng
dc.relation.ispartof: IEEE Transactions on Communications
dc.subject: Device scheduling
dc.subject: Lyapunov optimization
dc.subject: personalized federated learning
dc.subject: resource allocation
dc.title: Knowledge-Aided Federated Learning for Energy-Limited Wireless Networks
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TCOMM.2023.3261383
dc.identifier.scopus: eid_2-s2.0-85153377806
dc.identifier.volume: 71
dc.identifier.issue: 6
dc.identifier.spage: 3368
dc.identifier.epage: 3386
dc.identifier.eissn: 1558-0857
