Conference Paper: SemiFL: Semi-Federated Learning Empowered by Simultaneously Transmitting and Reflecting Reconfigurable Intelligent Surface
Title | SemiFL: Semi-Federated Learning Empowered by Simultaneously Transmitting and Reflecting Reconfigurable Intelligent Surface
---|---
Authors | Ni, Wanli; Liu, Yuanwei; Tian, Hui; Eldar, Yonina C.; Huang, Kaibin
Issue Date | 2022
Citation | IEEE International Conference on Communications, 2022, v. 2022-May, p. 5104-5109
Abstract | This paper proposes a novel semi-federated learning (SemiFL) paradigm, which integrates centralized learning (CL) and over-the-air federated learning (AirFL) into a unified framework with the aid of a simultaneously transmitting and reflecting reconfigurable intelligent surface (STAR-RIS). In particular, the SemiFL framework allows computing-scarce users to participate in the learning process by using non-orthogonal multiple access (NOMA) to transmit their local datasets to the base station, which performs model computation on their behalf. During uplink communication, scarce spectrum resources are shared among AirFL users and NOMA-based CL users, with a STAR-RIS employed for interference management and coverage enhancement. To analyze the learning behavior of SemiFL, closed-form expressions are derived to quantify the impact of learning rates and noisy fading channels. Our analysis shows that SemiFL can achieve a lower error floor than CL or AirFL schemes with partial users. Simulation results show that SemiFL significantly reduces communication overhead and latency compared to CL, while achieving better learning performance than AirFL.
Persistent Identifier | http://hdl.handle.net/10722/349786
ISSN | 1550-3607
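
To make the SemiFL round described in the abstract more concrete, the following is a minimal, hypothetical sketch in NumPy. It assumes a simple least-squares model and abstracts the NOMA/STAR-RIS physical layer into an additive-noise uplink; the user partitioning, the equal per-user weighting, and all function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, lr, noise_std = 8, 0.1, 0.01   # model dimension, learning rate, receiver noise std (assumed)

def make_user(n=32):
    """Synthetic local dataset for one user (illustrative only)."""
    X = rng.normal(size=(n, dim))
    y = X @ np.ones(dim) + 0.1 * rng.normal(size=n)
    return X, y

def grad(w, X, y):
    """Least-squares gradient on one user's local dataset."""
    return X.T @ (X @ w - y) / len(y)

cl_users = [make_user() for _ in range(3)]    # computing-scarce users: upload raw data via NOMA
air_users = [make_user() for _ in range(5)]   # AirFL users: upload local gradients over the air

w = np.zeros(dim)
for _ in range(200):
    # Centralized part: the base station computes gradients on the uploaded CL datasets.
    g_cl = np.mean([grad(w, X, y) for X, y in cl_users], axis=0)

    # Over-the-air part: AirFL users' gradients superpose on the uplink;
    # the base station receives their sum plus additive noise (fading omitted).
    g_sum = np.sum([grad(w, X, y) for X, y in air_users], axis=0)
    g_air = (g_sum + noise_std * rng.normal(size=dim)) / len(air_users)

    # Unified SemiFL update: combine both contributions, weighted by user counts.
    n_cl, n_air = len(cl_users), len(air_users)
    w -= lr * (n_cl * g_cl + n_air * g_air) / (n_cl + n_air)

print("final weights (should be close to the all-ones generating vector):", np.round(w, 2))
```

The noiseless CL path and the simple averaging rule are simplifications; the paper's closed-form analysis additionally models learning rates and noisy fading channels, which is where the lower error floor of SemiFL is established.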
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ni, Wanli | - |
dc.contributor.author | Liu, Yuanwei | - |
dc.contributor.author | Tian, Hui | - |
dc.contributor.author | Eldar, Yonina C. | - |
dc.contributor.author | Huang, Kaibin | - |
dc.date.accessioned | 2024-10-17T07:00:48Z | - |
dc.date.available | 2024-10-17T07:00:48Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | IEEE International Conference on Communications, 2022, v. 2022-May, p. 5104-5109 | - |
dc.identifier.issn | 1550-3607 | - |
dc.identifier.uri | http://hdl.handle.net/10722/349786 | - |
dc.description.abstract | This paper proposes a novel semi-federated learning (SemiFL) paradigm, which integrates centralized learning (CL) and over-the-air federated learning (AirFL) into a unified framework with the aid of a simultaneously transmitting and reflecting reconfigurable intelligent surface (STAR-RIS). In particular, the SemiFL framework allows computing-scarce users to participate in the learning process by using non-orthogonal multiple access (NOMA) to transmit their local datasets to the base station, which performs model computation on their behalf. During uplink communication, scarce spectrum resources are shared among AirFL users and NOMA-based CL users, with a STAR-RIS employed for interference management and coverage enhancement. To analyze the learning behavior of SemiFL, closed-form expressions are derived to quantify the impact of learning rates and noisy fading channels. Our analysis shows that SemiFL can achieve a lower error floor than CL or AirFL schemes with partial users. Simulation results show that SemiFL significantly reduces communication overhead and latency compared to CL, while achieving better learning performance than AirFL. | -
dc.language | eng | - |
dc.relation.ispartof | IEEE International Conference on Communications | - |
dc.title | SemiFL: Semi-Federated Learning Empowered by Simultaneously Transmitting and Reflecting Reconfigurable Intelligent Surface | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/ICC45855.2022.9838989 | - |
dc.identifier.scopus | eid_2-s2.0-85137270089 | - |
dc.identifier.volume | 2022-May | - |
dc.identifier.spage | 5104 | - |
dc.identifier.epage | 5109 | - |