Title: Turbo code performance and design trade-offs
Abstract: Turbo codes were first proposed by Berrou and Glavieux (1993) and shown to have near-Shannon-limit error correction capability. Since then, turbo codes have become a major focus of research and study in the coding community. Turbo codes are particularly attractive for higher-data-rate applications, where the additional coding gain is necessary to maintain link performance with limited power. For instance, the Advanced EHF satellite system is a candidate for implementing turbo codes, which offer superior performance compared to the convolutional codes currently used in the Milstar system. This paper presents a survey of turbo coding designs based on existing research journals and publications. It investigates the key design parameters for each coding scheme, such as the choice of component codes, memory size, interleaver size, and the number of decoding iterations. In addition, it examines the trade-offs between improvements in code performance and the overall delay and computational complexity of the coding algorithm. This paper also presents bit error rate (BER) performance comparisons between different turbo code designs in both additive white Gaussian noise (AWGN) and Rayleigh fading environments.
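As a hedged illustration of the design parameters the abstract mentions (component codes, memory size, and interleaver size), the sketch below shows a minimal rate-1/3 parallel-concatenated turbo encoder in Python. The (7,5) octal RSC generators, memory of 2, block length, and random interleaver are assumptions chosen for illustration, not the specific designs surveyed in the paper; trellis termination and the iterative decoder are omitted.

```python
import random

def rsc_encode(bits):
    """Recursive systematic convolutional (RSC) encoder, generators (7,5) octal:
    feedback 1 + D + D^2, feedforward 1 + D^2 (memory = 2). Returns parity bits only."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # feedback bit (1 + D + D^2)
        parity.append(a ^ s2)    # parity bit   (1 + D^2)
        s2, s1 = s1, a           # shift-register update
    return parity

def turbo_encode(bits, interleaver):
    """Rate-1/3 parallel concatenation: systematic bits, parity from encoder 1,
    and parity from encoder 2 operating on the interleaved input."""
    parity1 = rsc_encode(bits)
    parity2 = rsc_encode([bits[i] for i in interleaver])
    return bits, parity1, parity2

# Example: a random interleaver over a short block (block length 8 is illustrative only)
block = [1, 0, 1, 1, 0, 0, 1, 0]
interleaver = list(range(len(block)))
random.Random(0).shuffle(interleaver)
systematic, p1, p2 = turbo_encode(block, interleaver)
print(systematic, p1, p2)
```

In this sketch, larger interleaver sizes and more component-encoder memory generally improve BER performance, at the cost of the decoding delay and computational complexity that the paper's trade-off analysis addresses.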
Publication Year: 2002
Publication Date: 2002-11-11
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 7