Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/TSP.2020.2983166
- Scopus: eid_2-s2.0-85084391890
- WOS: WOS:000531386200001
Article: High-Dimensional Stochastic Gradient Quantization for Communication-Efficient Edge Learning
Title | High-Dimensional Stochastic Gradient Quantization for Communication-Efficient Edge Learning |
---|---|
Authors | Du, Y; Yang, S; Huang, K |
Keywords | Quantization (signal); Stochastic processes; Convergence; Distortion; Fasteners |
Issue Date | 2020 |
Publisher | IEEE. The Journal's web site is located at https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=78 |
Citation | IEEE Transactions on Signal Processing, 2020, v. 68, p. 2128-2142 |
Abstract | Edge machine learning involves the deployment of learning algorithms at the wireless network edge so as to leverage massive mobile data for enabling intelligent applications. The mainstream edge learning approach, federated learning, has been developed based on distributed gradient descent. In this approach, stochastic gradients are computed at edge devices and then transmitted to an edge server for updating a global AI model. Since each stochastic gradient is typically high-dimensional, communication overhead becomes a bottleneck for edge learning. To address this issue, we propose a novel framework of hierarchical gradient quantization and study its effect on the learning performance. First, the framework features a practical hierarchical architecture for decomposing the stochastic gradient into its norm and normalized block gradients, and efficiently quantizes them using a uniform quantizer and a low-dimensional Grassmannian codebook, respectively. Subsequently, the quantized normalized block gradients are scaled and cascaded to yield the quantized normalized stochastic gradient using a so-called hinge vector, which is compressed using another low-dimensional Grassmannian quantizer designed under the criterion of minimum distortion. The other feature of the framework is a bit-allocation scheme for reducing the distortion, which divides the total quantization bits to determine the resolutions of the low-dimensional quantizers. The framework is proved to guarantee model convergence by analyzing the convergence rate as a function of quantization bits. Furthermore, by simulation, our design is shown to substantially reduce the communication overhead compared with the state-of-the-art signSGD scheme, while achieving similar learning accuracies. |
Persistent Identifier | http://hdl.handle.net/10722/290905 |
ISSN | 1053-587X; 2021 Impact Factor: 4.875; 2020 SCImago Journal Rankings: 1.638 |
ISI Accession Number ID | WOS:000531386200001 |
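The abstract above outlines a hierarchical scheme: split the stochastic gradient into low-dimensional blocks, quantize block directions with a Grassmannian codebook, and quantize scalar magnitudes with a uniform quantizer. The following is a minimal illustrative sketch of that idea, not the paper's exact construction: the hinge vector, the distortion-optimal codebook design, and the bit-allocation scheme are omitted, and a random unit-norm codebook merely stands in for a true Grassmannian codebook; all function and parameter names here are hypothetical.

```python
import numpy as np

def hierarchical_quantize(grad, block_dim=4, codebook_size=16, norm_bits=8, rng=None):
    """Sketch of hierarchical gradient quantization: partition the gradient
    into low-dimensional blocks, quantize each block's direction against a
    shared unit-norm codebook, and quantize the per-block norms uniformly."""
    rng = np.random.default_rng(0) if rng is None else rng
    g = np.asarray(grad, dtype=float)
    assert g.size % block_dim == 0, "gradient length must divide into blocks"

    # Random unit-norm codebook standing in for a Grassmannian codebook.
    codebook = rng.standard_normal((codebook_size, block_dim))
    codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

    blocks = g.reshape(-1, block_dim)
    block_norms = np.linalg.norm(blocks, axis=1)

    # Normalize each block and pick the nearest codeword (max inner product).
    dirs = blocks / block_norms[:, None]
    idx = np.argmax(dirs @ codebook.T, axis=1)

    # Uniform scalar quantizer for the per-block norms.
    levels = 2 ** norm_bits
    max_norm = block_norms.max()
    q_norms = np.round(block_norms / max_norm * (levels - 1)) / (levels - 1) * max_norm

    # Reconstruct the quantized gradient from indices and quantized norms.
    return (q_norms[:, None] * codebook[idx]).reshape(g.shape)
```

Transmitting only the codeword indices (log2 of the codebook size per block) plus the quantized norms is what yields the communication savings relative to sending full-precision gradients.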
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Du, Y | - |
dc.contributor.author | Yang, S | - |
dc.contributor.author | Huang, K | - |
dc.date.accessioned | 2020-11-02T05:48:45Z | - |
dc.date.available | 2020-11-02T05:48:45Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | IEEE Transactions on Signal Processing, 2020, v. 68, p. 2128-2142 | -
dc.identifier.issn | 1053-587X | - |
dc.identifier.uri | http://hdl.handle.net/10722/290905 | - |
dc.description.abstract | Edge machine learning involves the deployment of learning algorithms at the wireless network edge so as to leverage massive mobile data for enabling intelligent applications. The mainstream edge learning approach, federated learning, has been developed based on distributed gradient descent. In this approach, stochastic gradients are computed at edge devices and then transmitted to an edge server for updating a global AI model. Since each stochastic gradient is typically high-dimensional, communication overhead becomes a bottleneck for edge learning. To address this issue, we propose a novel framework of hierarchical gradient quantization and study its effect on the learning performance. First, the framework features a practical hierarchical architecture for decomposing the stochastic gradient into its norm and normalized block gradients, and efficiently quantizes them using a uniform quantizer and a low-dimensional Grassmannian codebook, respectively. Subsequently, the quantized normalized block gradients are scaled and cascaded to yield the quantized normalized stochastic gradient using a so-called hinge vector, which is compressed using another low-dimensional Grassmannian quantizer designed under the criterion of minimum distortion. The other feature of the framework is a bit-allocation scheme for reducing the distortion, which divides the total quantization bits to determine the resolutions of the low-dimensional quantizers. The framework is proved to guarantee model convergence by analyzing the convergence rate as a function of quantization bits. Furthermore, by simulation, our design is shown to substantially reduce the communication overhead compared with the state-of-the-art signSGD scheme, while achieving similar learning accuracies. | -
dc.language | eng | - |
dc.publisher | IEEE. The Journal's web site is located at https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=78 | - |
dc.relation.ispartof | IEEE Transactions on Signal Processing | -
dc.rights | IEEE Transactions on Signal Processing. Copyright © IEEE. | -
dc.rights | ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | Quantization (signal) | - |
dc.subject | Stochastic processes | - |
dc.subject | Convergence | - |
dc.subject | Distortion | - |
dc.subject | Fasteners | - |
dc.title | High-Dimensional Stochastic Gradient Quantization for Communication-Efficient Edge Learning | - |
dc.type | Article | - |
dc.identifier.email | Huang, K: huangkb@eee.hku.hk | - |
dc.identifier.authority | Huang, K=rp01875 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TSP.2020.2983166 | - |
dc.identifier.scopus | eid_2-s2.0-85084391890 | - |
dc.identifier.hkuros | 318038 | - |
dc.identifier.volume | 68 | - |
dc.identifier.spage | 2128 | - |
dc.identifier.epage | 2142 | - |
dc.identifier.isi | WOS:000531386200001 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 1053-587X | - |