Attention Turbo-Autoencoder for Improved Channel Coding and Reconstruction

Kayode A. Olaniyi, R. Heymann, Theo G. Swart

Research output: Contribution to journal › Article › peer-review

Abstract

Channel coding and signal reconstruction are critical tasks in communication systems to ensure reliable and efficient data transmission. Traditional approaches, such as turbo codes and Low-Density Parity-Check (LDPC) codes, have been widely used for these tasks. However, these methods may suffer from limitations in accurately capturing the complex channel characteristics and effectively handling noisy environments. To address these challenges, this research proposes deep learning-based channel codes that integrate an attention mechanism into turbo code autoencoders, referred to as ATT-TurboAE, to enhance channel coding and reconstruction performance. The attention mechanism selectively focuses on informative features while suppressing noise and interference, improving the accuracy and robustness of the system. The proposed approach is evaluated using simulated datasets and compared with traditional turbo autoencoders. The results demonstrate that the attention mechanism significantly improves the performance of channel coding and signal reconstruction, achieving higher accuracy and better noise resilience. This research contributes to the advancement of communication systems by introducing a novel technique that enhances turbo autoencoders through the incorporation of attention mechanisms, leading to improved channel coding and reconstruction performance.
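The abstract does not give implementation details, so the following is only a minimal sketch of the general idea: a self-attention block placed inside a convolutional, TurboAE-style decoder stage so that the network can reweight positions in the received code-bit sequence. All module names, layer sizes, and the placement of the attention block are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): self-attention inside a
# TurboAE-style convolutional decoder stage. Sizes and placement are assumed.
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Self-attention over the code-bit sequence with a residual connection."""
    def __init__(self, feature_dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(feature_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(feature_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, block_length, feature_dim); attention reweights positions,
        # emphasising informative features and suppressing noisy ones.
        attn_out, _ = self.attn(x, x, x)
        return self.norm(x + attn_out)

class ToyAttentiveDecoderStage(nn.Module):
    """One decoder stage: 1-D convolutions followed by an attention block."""
    def __init__(self, in_channels: int = 3, feature_dim: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, feature_dim, kernel_size=5, padding=2),
            nn.ELU(),
            nn.Conv1d(feature_dim, feature_dim, kernel_size=5, padding=2),
            nn.ELU(),
        )
        self.attention = AttentionBlock(feature_dim)
        self.out = nn.Linear(feature_dim, 1)  # soft estimate per message bit

    def forward(self, received: torch.Tensor) -> torch.Tensor:
        # received: (batch, block_length, in_channels) noisy channel observations
        h = self.conv(received.transpose(1, 2)).transpose(1, 2)
        h = self.attention(h)
        return self.out(h).squeeze(-1)

if __name__ == "__main__":
    noisy = torch.randn(8, 100, 3)   # e.g. a rate-1/3 code with block length 100
    stage = ToyAttentiveDecoderStage()
    print(stage(noisy).shape)        # torch.Size([8, 100])
```

In this toy setup the attention block sits between the convolutional feature extractor and the output layer, which is one plausible way to let the decoder focus on informative positions before producing bit estimates; the paper's actual ATT-TurboAE architecture may differ.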

Original language: English
Pages (from-to): 229-241
Number of pages: 13
Journal: Journal of Communications
Volume: 19
Issue number: 5
DOIs
Publication status: Published - 2024

Keywords

  • TurboAE
  • attention mechanism
  • autoencoder
  • channel coding
  • channel matrix
  • encoder-decoder

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
