Title: Efficient concatenated coding schemes for error floor reduction of LDPC and turbo product codes
Abstract: This work introduces a novel serial code concatenation (SCC) scheme to combat the error floor problem experienced in iterated sparse graph-based error correcting codes such as turbo product (TP) codes and low density parity check (LDPC) codes. SCC has been widely used in the past to reduce the error floor in iterative decoders. However, the main stumbling block for its practical application in high speed communication systems has been the need for long and complex outer codes. The use of short outer block codes with interleaving has been shown to provide a good tradeoff between complexity and performance. Nevertheless, its application to next-generation ultra high-speed communication systems is still a major challenge because of the long, complex interleavers that must be carefully designed to meet the requirements of these applications (e.g., a net coding gain >10 dB at a bit error rate of 10⁻¹⁵ with an overhead of ~20% for 100 Gb/s optical transport networks [1]). In this paper we present a new SCC scheme built from short outer block codes. Unlike previous proposals, the long interleaver is replaced by a simple block code combined with a novel encoding/decoding strategy. Based on this finding, we show that complexity and latency can be drastically reduced with negligible penalty. The SCC technique introduced here provides a new general framework for solving the error floor problem induced by low-weight error patterns of any coding scheme.
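Illustration (not from the article): the abstract's core idea is serial concatenation, where an outer code wraps the inner iterative code so the outer decoder can clean the rare, low-weight residual errors that cause the error floor. The sketch below is a minimal, generic Python example of that encode/decode ordering; the (7,4) Hamming outer code and the 3x repetition "inner code" are placeholder assumptions standing in for the paper's short outer block codes and the LDPC/TP inner codes, and none of it reflects the authors' specific construction.

    # Generic serial code concatenation sketch (illustrative only, not the paper's scheme):
    # encode outer-then-inner, decode inner-then-outer so the outer short block code
    # corrects residual low-weight errors left by the inner decoder.
    import random

    # Outer code: systematic (7,4) Hamming, corrects one residual bit error per block.
    G = [  # generator rows, codeword = msg * G (mod 2)
        [1, 0, 0, 0, 1, 1, 0],
        [0, 1, 0, 0, 1, 0, 1],
        [0, 0, 1, 0, 0, 1, 1],
        [0, 0, 0, 1, 1, 1, 1],
    ]
    H = [  # parity-check rows
        [1, 1, 0, 1, 1, 0, 0],
        [1, 0, 1, 1, 0, 1, 0],
        [0, 1, 1, 1, 0, 0, 1],
    ]

    def hamming_encode(msg4):
        return [sum(m * g for m, g in zip(msg4, col)) % 2 for col in zip(*G)]

    def hamming_decode(word7):
        syndrome = [sum(h * c for h, c in zip(row, word7)) % 2 for row in H]
        if any(syndrome):
            # the syndrome equals the H column of the erroneous position; flip that bit
            for j in range(7):
                if [H[i][j] for i in range(3)] == syndrome:
                    word7[j] ^= 1
                    break
        return word7[:4]  # systematic code: first 4 bits carry the message

    # Inner code: 3x repetition with majority-vote decoding, a stand-in for an
    # iterative LDPC/TP decoder that occasionally leaves residual errors.
    def inner_encode(bits):
        return [b for b in bits for _ in range(3)]

    def inner_decode(bits):
        return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

    # End-to-end demo: outer encode -> inner encode -> channel -> inner decode -> outer decode.
    random.seed(0)
    msg = [random.randint(0, 1) for _ in range(4)]
    tx = inner_encode(hamming_encode(msg))
    rx = tx[:]
    rx[0] ^= 1
    rx[1] ^= 1  # two flips in one repetition group: the inner decoder fails on that bit
    after_inner = inner_decode(rx)   # leaves a single residual error ("error floor" event)
    decoded = hamming_decode(after_inner)  # outer code cleans it up
    print("message:", msg, "decoded:", decoded, "ok:", decoded == msg)

In this toy setup the inner decoder alone would output a wrong bit, but the outer short block code corrects it, which is the cleanup role the abstract assigns to the outer code in the proposed SCC framework.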
Publication Year: 2012
Publication Date: 2012-12-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 4