Conference Paper: Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes

Title: Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes
Authors: Ozfatura, Emre; Shao, Yulin; Ghazanfari, Amin; Perotti, Alberto; Popovic, Branislav; Gunduz, Deniz
Keywords: active feedback; channel coding; deep learning; feedback; self-attention; transformer
Issue Date: 2023
Citation: IEEE International Conference on Communications, 2023, v. 2023-May, p. 6652-6657
Abstract: Deep neural network (DNN)-assisted channel coding designs, such as low-complexity neural decoders for existing codes or end-to-end neural-network-based auto-encoder designs, have recently gained interest due to their improved performance and flexibility, particularly for communication scenarios in which high-performing structured code designs do not exist. Communication in the presence of feedback is one such scenario, and practical code design for feedback channels has remained an open challenge in coding theory for many decades. Recently, DNN-based designs have shown impressive results in exploiting feedback. In particular, generalized block attention feedback (GBAF) codes, which utilize the popular transformer architecture, have achieved significant improvements in block error rate (BLER) performance. However, previous works have focused mainly on passive feedback, where the transmitter observes a noisy version of the signal at the receiver. In this work, we show that GBAF codes can also be used for channels with active feedback. We implement a pair of transformer architectures, at the transmitter and the receiver, which interact with each other sequentially, and achieve new state-of-the-art BLER performance, especially in the low SNR regime.
Persistent Identifier: http://hdl.handle.net/10722/363579
ISSN: 1550-3607
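
The abstract above describes the core mechanism at a high level: rather than passively echoing its noisy observations back to the transmitter, a transformer at the receiver actively encodes what it feeds back, and a transformer at the transmitter conditions its next coded symbols on that feedback. Below is a minimal, hypothetical PyTorch sketch of this interaction loop; the Agent class, the awgn helper, the module sizes, and the SNR values are illustrative assumptions, not the GBAF architecture or training setup from the paper.

--- begin sketch (Python / PyTorch) ---
# Conceptual sketch: two transformers interacting over noisy forward and
# feedback channels. Hypothetical simplification, not the paper's GBAF code.
import torch
import torch.nn as nn

class Agent(nn.Module):
    """Tiny transformer mapping each block's observation history to one symbol."""
    def __init__(self, in_dim, d_model=32, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, dim_feedforward=64,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)        # one real channel symbol per block

    def forward(self, history):                  # history: (batch, blocks, in_dim)
        return self.head(self.encoder(self.embed(history)))   # (batch, blocks, 1)

def awgn(x, snr_db):
    """Additive white Gaussian noise at an assumed SNR in dB."""
    return x + 10 ** (-snr_db / 20) * torch.randn_like(x)

batch, blocks, bits_per_block, rounds = 2, 8, 4, 3           # toy dimensions
bits = torch.randint(0, 2, (batch, blocks, bits_per_block)).float()

tx = Agent(in_dim=bits_per_block + rounds)   # transmitter sees bits + past feedback
rx = Agent(in_dim=rounds)                    # receiver sees past forward-channel outputs

fwd_hist = torch.zeros(batch, blocks, rounds)   # what the receiver has observed so far
fb_hist = torch.zeros(batch, blocks, rounds)    # what the transmitter has observed so far

with torch.no_grad():                           # interaction loop only; training omitted
    for t in range(rounds):
        # Transmitter encodes the message bits jointly with all feedback seen so far.
        c_t = tx(torch.cat([bits, fb_hist], dim=-1))
        fwd_hist[..., t] = awgn(c_t, snr_db=1.0).squeeze(-1)     # noisy forward channel
        # Active feedback: the receiver encodes what it sends back, instead of
        # echoing its raw observations (which would be passive feedback).
        f_t = rx(fwd_hist)
        fb_hist[..., t] = awgn(f_t, snr_db=20.0).squeeze(-1)     # noisy feedback channel

# A final decoder (omitted) would map fwd_hist to estimates of the message bits;
# in practice both agents and the decoder would be trained jointly end to end.
--- end sketch ---
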

 

DC Field                   Value
dc.contributor.author      Ozfatura, Emre
dc.contributor.author      Shao, Yulin
dc.contributor.author      Ghazanfari, Amin
dc.contributor.author      Perotti, Alberto
dc.contributor.author      Popovic, Branislav
dc.contributor.author      Gunduz, Deniz
dc.date.accessioned        2025-10-10T07:47:57Z
dc.date.available          2025-10-10T07:47:57Z
dc.date.issued             2023
dc.identifier.citation     IEEE International Conference on Communications, 2023, v. 2023-May, p. 6652-6657
dc.identifier.issn         1550-3607
dc.identifier.uri          http://hdl.handle.net/10722/363579
dc.description.abstract    Deep neural network (DNN)-assisted channel coding designs, such as low-complexity neural decoders for existing codes or end-to-end neural-network-based auto-encoder designs, have recently gained interest due to their improved performance and flexibility, particularly for communication scenarios in which high-performing structured code designs do not exist. Communication in the presence of feedback is one such scenario, and practical code design for feedback channels has remained an open challenge in coding theory for many decades. Recently, DNN-based designs have shown impressive results in exploiting feedback. In particular, generalized block attention feedback (GBAF) codes, which utilize the popular transformer architecture, have achieved significant improvements in block error rate (BLER) performance. However, previous works have focused mainly on passive feedback, where the transmitter observes a noisy version of the signal at the receiver. In this work, we show that GBAF codes can also be used for channels with active feedback. We implement a pair of transformer architectures, at the transmitter and the receiver, which interact with each other sequentially, and achieve new state-of-the-art BLER performance, especially in the low SNR regime.
dc.language                eng
dc.relation.ispartof       IEEE International Conference on Communications
dc.subject                 active feedback
dc.subject                 channel coding
dc.subject                 deep learning
dc.subject                 feedback
dc.subject                 self-attention
dc.subject                 transformer
dc.title                   Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes
dc.type                    Conference_Paper
dc.description.nature      link_to_subscribed_fulltext
dc.identifier.doi          10.1109/ICC45041.2023.10278839
dc.identifier.scopus       eid_2-s2.0-85177669862
dc.identifier.volume       2023-May
dc.identifier.spage        6652
dc.identifier.epage        6657
