Conference Paper: Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes
| Title | Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes |
|---|---|
| Authors | Ozfatura, Emre; Shao, Yulin; Ghazanfari, Amin; Perotti, Alberto; Popovic, Branislav; Gunduz, Deniz |
| Keywords | active feedback; channel coding; deep learning; feedback; self-attention; transformer |
| Issue Date | 2023 |
| Citation | IEEE International Conference on Communications, 2023, v. 2023-May, p. 6652-6657 |
| Abstract | Deep neural network (DNN)-assisted channel coding designs, such as low-complexity neural decoders for existing codes or end-to-end neural-network-based auto-encoder designs, have recently gained interest due to their improved performance and flexibility, particularly for communication scenarios in which high-performing structured code designs do not exist. Communication in the presence of feedback is one such scenario, and practical code design for feedback channels has remained an open challenge in coding theory for many decades. Recently, DNN-based designs have shown impressive results in exploiting feedback. In particular, generalized block attention feedback (GBAF) codes, which utilize the popular transformer architecture, have achieved significant improvements in block error rate (BLER) performance. However, previous works have focused mainly on passive feedback, where the transmitter observes a noisy version of the signal at the receiver. In this work, we show that GBAF codes can also be used for channels with active feedback. We implement a pair of transformer architectures, at the transmitter and the receiver, which interact with each other sequentially and achieve a new state-of-the-art BLER performance, especially in the low-SNR regime. |
| Persistent Identifier | http://hdl.handle.net/10722/363579 |
| ISSN | 1550-3607 |
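
As a rough, hypothetical illustration of the interaction loop described in the abstract above, the Python sketch below pairs two small transformer encoders, one at the transmitter and one at the receiver, that exchange symbols over noisy forward and feedback AWGN channels across several rounds. The module names, dimensions, round count, and SNR values are illustrative assumptions only; this is not the authors' actual GBAF architecture, hyperparameters, or training setup.

```python
# Hypothetical sketch only: toy dimensions and module names are assumptions,
# not the authors' actual GBAF design or hyperparameters.
import torch
import torch.nn as nn


class AttentionAgent(nn.Module):
    """Small transformer encoder mapping an observation history to one symbol per block."""

    def __init__(self, input_dim: int, d_model: int = 32, nhead: int = 4):
        super().__init__()
        self.embed = nn.Linear(input_dim, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=64, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)  # one real-valued channel symbol per block

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, blocks, input_dim) -> symbols: (batch, blocks, 1)
        return self.head(self.encoder(self.embed(history)))


def awgn(x: torch.Tensor, snr_db: float) -> torch.Tensor:
    """Add white Gaussian noise at the given SNR, assuming a unit-power signal."""
    noise_std = 10 ** (-snr_db / 20)
    return x + noise_std * torch.randn_like(x)


# Toy setup: 8 blocks of 4 message bits each, 3 interaction rounds.
batch, blocks, bits_per_block, rounds = 2, 8, 4, 3
forward_snr_db, feedback_snr_db = 1.0, 20.0

bits = torch.randint(0, 2, (batch, blocks, bits_per_block)).float()
transmitter = AttentionAgent(input_dim=bits_per_block + rounds)  # sees bits + past feedback
receiver = AttentionAgent(input_dim=rounds)                      # sees past forward symbols

fb_history = torch.zeros(batch, blocks, rounds)  # active feedback observed by the transmitter
rx_history = torch.zeros(batch, blocks, rounds)  # forward symbols observed by the receiver

with torch.no_grad():  # forward pass only; training is out of scope for this sketch
    for r in range(rounds):
        # Transmitter encodes the message bits together with all feedback seen so far.
        tx_symbol = transmitter(torch.cat([bits, fb_history], dim=-1))
        rx_history[..., r] = awgn(tx_symbol, forward_snr_db).squeeze(-1)

        # Active feedback: the receiver encodes its own observations into a feedback
        # symbol, rather than echoing the raw received signal (passive feedback).
        fb_symbol = receiver(rx_history)
        fb_history[..., r] = awgn(fb_symbol, feedback_snr_db).squeeze(-1)

print(rx_history.shape, fb_history.shape)  # torch.Size([2, 8, 3]) torch.Size([2, 8, 3])
```

The point the sketch tries to capture is the distinction drawn in the abstract: under active feedback the receiver-side module computes its own feedback symbol from its observation history, instead of simply returning a noisy copy of the received signal to the transmitter.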
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ozfatura, Emre | - |
| dc.contributor.author | Shao, Yulin | - |
| dc.contributor.author | Ghazanfari, Amin | - |
| dc.contributor.author | Perotti, Alberto | - |
| dc.contributor.author | Popovic, Branislav | - |
| dc.contributor.author | Gunduz, Deniz | - |
| dc.date.accessioned | 2025-10-10T07:47:57Z | - |
| dc.date.available | 2025-10-10T07:47:57Z | - |
| dc.date.issued | 2023 | - |
| dc.identifier.citation | IEEE International Conference on Communications, 2023, v. 2023-May, p. 6652-6657 | - |
| dc.identifier.issn | 1550-3607 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/363579 | - |
| dc.description.abstract | Deep neural network (DNN)-assisted channel coding designs, such as low-complexity neural decoders for existing codes or end-to-end neural-network-based auto-encoder designs, have recently gained interest due to their improved performance and flexibility, particularly for communication scenarios in which high-performing structured code designs do not exist. Communication in the presence of feedback is one such scenario, and practical code design for feedback channels has remained an open challenge in coding theory for many decades. Recently, DNN-based designs have shown impressive results in exploiting feedback. In particular, generalized block attention feedback (GBAF) codes, which utilize the popular transformer architecture, have achieved significant improvements in block error rate (BLER) performance. However, previous works have focused mainly on passive feedback, where the transmitter observes a noisy version of the signal at the receiver. In this work, we show that GBAF codes can also be used for channels with active feedback. We implement a pair of transformer architectures, at the transmitter and the receiver, which interact with each other sequentially and achieve a new state-of-the-art BLER performance, especially in the low-SNR regime. | - |
| dc.language | eng | - |
| dc.relation.ispartof | IEEE International Conference on Communications | - |
| dc.subject | active feedback | - |
| dc.subject | channel coding | - |
| dc.subject | deep learning | - |
| dc.subject | feedback | - |
| dc.subject | self-attention | - |
| dc.subject | transformer | - |
| dc.title | Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1109/ICC45041.2023.10278839 | - |
| dc.identifier.scopus | eid_2-s2.0-85177669862 | - |
| dc.identifier.volume | 2023-May | - |
| dc.identifier.spage | 6652 | - |
| dc.identifier.epage | 6657 | - |
