
Let $\bar{x}$ represent a BPSK modulated codeword, $\bar{x} = m(\bar{v})$, where $\bar{v} \in C$ and $x_i = (-1)^{v_i}$, for $1 \le i \le N$. See also (7.2). Let $S_e = \{i : \mathrm{sgn}(x_i) \ne \mathrm{sgn}(r_i)\}$ be the set of error positions, $U = \{I_j, j = 1, 2, \ldots, d\}$ the set of least reliable positions, and $T = \{i : \mathrm{sgn}(x_i) = \mathrm{sgn}(r_i), i \in U\}$. Then the extended distance or correlation discrepancy between a codeword $\bar{v}$ and a received word $\bar{r}$ is defined as [TP]

$$d_e(\bar{v}, \bar{r}) = \sum_{i \in S_e} |r_i|. \qquad (7.9)$$
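As a quick illustration of (7.9), the following sketch computes the set $S_e$ and the correlation discrepancy for a candidate codeword. The function name and the toy numbers are assumptions made for this example, not code from the book.

```python
import numpy as np

def correlation_discrepancy(v, r):
    """d_e(v, r) of (7.9): the sum of |r_i| over the error positions
    S_e = {i : sgn(x_i) != sgn(r_i)}, with x = m(v) the BPSK codeword."""
    x = 1 - 2 * np.asarray(v)            # BPSK mapping: x_i = (-1)^{v_i}
    r = np.asarray(r, dtype=float)
    S_e = np.sign(x) != np.sign(r)       # error positions
    return np.abs(r[S_e]).sum()

# The all-zero codeword of the (3,1) repetition code against a noisy word:
# only position 1 disagrees in sign, so d_e = |-0.2| = 0.2.
print(correlation_discrepancy([0, 0, 0], [0.9, -0.2, 1.1]))
```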
Improved criteria for finding an optimum codeword are based on upper bounds on (7.9) and on increasing the cardinality of the sets of positions tested. Two improvements to Forney's conditions are:
• Taipale-Pursley [TP]: There exists an optimal codeword $\hat{x}_{\mathrm{opt}}$ such that

$$d_e(\hat{x}_{\mathrm{opt}}, \bar{r}) < \sum_{i \in T} |r_i|. \qquad (7.10)$$
• Kasami et al. [KKTFL]: There exists an optimal codeword $\hat{x}_{\mathrm{opt}}$ such that

$$d_e(\hat{x}_{\mathrm{opt}}, \bar{r}) < \sum_{i \notin T_K} |r_i|, \qquad (7.11)$$

where $T_K = \{i : \mathrm{sgn}(x_i) \ne \mathrm{sgn}(r_i), i \notin U\}$. A small numerical sketch of both tests is given below.
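Under the reconstruction of (7.10) and (7.11) above, both tests can be evaluated directly for a candidate codeword. The sketch below is illustrative only; the helper name, the choice of $|U| = d$ least reliable positions, and the toy numbers are assumptions.

```python
import numpy as np

def optimality_bounds(v, r, d):
    """Evaluate the Taipale-Pursley (7.10) and Kasami et al. (7.11) bounds
    for a candidate codeword v against the received word r, with |U| = d."""
    x = 1 - 2 * np.asarray(v)                  # BPSK codeword x_i = (-1)^{v_i}
    r = np.asarray(r, dtype=float)
    n = len(r)
    agree = np.sign(x) == np.sign(r)           # complement of S_e
    d_e = np.abs(r[~agree]).sum()              # correlation discrepancy (7.9)
    U = set(np.argsort(np.abs(r))[:d])         # d least reliable positions
    T = [i for i in range(n) if agree[i] and i in U]
    T_K = [i for i in range(n) if not agree[i] and i not in U]
    tp_bound = np.abs(r[T]).sum()                          # sum over i in T
    kasami_bound = np.abs(r).sum() - np.abs(r[T_K]).sum()  # sum over i not in T_K
    return d_e, d_e < tp_bound, d_e < kasami_bound

# All-zero candidate over a length-7 word; only position 3 disagrees in sign.
print(optimality_bounds([0] * 7, [0.8, 0.1, 0.9, -0.05, 0.7, 0.2, 1.0], 3))
```

Note that, since $T$ and $T_K$ are disjoint, the bound in (7.11) is never smaller than the bound in (7.10), so the Kasami test is satisfied at least as often as the Taipale-Pursley test.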
Good references to GMD decoding, its extensions, and combinations with Chase algorithms are [KNIH], [Kam], [TKK], [FL4] and [TLFL].
7.7 List decoding
List decoding was introduced by Elias and Wozencraft (see [Eli3]). More recently, list decoding of polynomial codes has received considerable attention, mainly due to the papers by Sudan and colleagues [Sud, Gur] on decoding RS codes beyond their error correcting capabilities. The techniques used, referred to as Sudan algorithms, apply interpolation and factorization of bivariate polynomials over extension fields. Sudan algorithms can be considered extensions of the Welch-Berlekamp algorithm [Ber2]. These techniques have been applied to soft-decision decoding of RS codes in [KV].
7.8 Soft-output algorithms
The previous sections of this chapter have been devoted to decoding algorithms that output the most likely coded sequence or codeword (or list of codewords). However, since the appearance of the revolutionary paper on turbo codes in 1993 [BGT], there is a need for decoding algorithms that output not only the most likely codeword (or list of codewords), but also an estimate of the bit reliabilities for further processing. In the field of error correcting codes, soft-output algorithms were introduced as early as 1962, when Gallager [Gal] published his work on low-density parity-check (LDPC) codes (LDPC codes are covered in Chapter 8), and later by Bahl et al. [BCJR]. In both cases, the algorithms perform a forward-backward recursion to compute the reliabilities of the code symbols. In the next section, basic soft-output decoding algorithms are described. Programs to simulate these decoding algorithms can be found on the ECC web site.
In the following sections, and for simplicity of exposition, it is assumed that a linear block code, constructed by terminating a binary memory-$m$, rate-$1/n$ convolutional code, is employed for binary transmission over an AWGN channel. It is also assumed that the convolutional encoder starts in the all-zero state $S_0^{(0)}$ and, after $N$ trellis stages, ends in the all-zero state $S_0^{(N)}$.
7.8.1 Soft-output Viterbi algorithm
In 1989, the Viterbi algorithm was modified to output bit reliability information [HH]. The soft-output Viterbi algorithm (SOVA) computes the reliability, or soft output, of the information bits as a log-likelihood ratio (LLR),

$$\Lambda(u_i) = \log \frac{\Pr\{u_i = 1 \mid \bar{r}\}}{\Pr\{u_i = 0 \mid \bar{r}\}}, \qquad (7.12)$$
where $\bar{r}$ denotes the received sequence.
The operation of a SOVA decoder can be divided into two parts. In the first part, decoding proceeds as with the conventional VA, selecting the most likely coded sequence $\hat{v}$ in correspondence with the path in the trellis with the maximum (correlation) metric at stage $n$. (See Section 5.4.) In addition, the path metrics need to be stored at each decoding stage, and for each state. These metrics are needed in the second part of the algorithm, to compute the soft outputs. In the second part of SOVA decoding, the Viterbi algorithm traverses the trellis backwards and computes metrics and paths, starting at $i = N$ and ending at $i = 0$. It should be noted that, in this stage of the algorithm, there is no need to store the surviving paths, but only the metrics for each trellis state. Finally, for each trellis stage $i$, $1 \le i \le N$, the soft outputs are computed.
Let $M_{max}$ denote the (correlation) metric of the most likely sequence $\hat{v}$ found by the Viterbi algorithm. The probability of the associated information sequence $\bar{u}$ given the received sequence, or a-posteriori probability (APP), is proportional to $e^{M_{max}}$, since
$$\Pr\{\bar{u} \mid \bar{r}\} = \Pr\{\bar{v} \mid \bar{r}\} \sim e^{M_{max}}. \qquad (7.13)$$
Without loss of generality, the APP of information bit $u_i$ can be written as
$$\Pr\{u_i = 1 \mid \bar{r}\} \sim e^{M_i(1)},$$

where $M_i(1) = M_{max}$. Let $M_i(0)$ denote the maximum metric of paths associated with the complement of information symbol $u_i$. Then it is easy to show that
$$\Lambda(u_i) \approx M_i(1) - M_i(0). \qquad (7.14)$$
Therefore, at time $i$, the soft output can be obtained from the difference between the maximum metric of paths in the trellis with $u_i = 1$ and the maximum metric of paths with $u_i = 0$.
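To make the metric-difference rule concrete, the following sketch carries out the forward and backward passes described above on a small terminated trellis and computes $\Lambda(u_i) \approx M_i(1) - M_i(0)$ at every stage. It is a max-log style illustration under stated assumptions: the memory-2, rate-1/2 code with generators $(7,5)_8$, the function names, and the toy input are all choices made for this example (the actual SOVA updates reliabilities along the survivor path, and the book's own programs are on the ECC web site).

```python
import numpy as np

NEG = -1e9  # stands in for "minus infinity" (unreachable state or path)

def trellis_75():
    """Branch table of a memory-2, rate-1/2 encoder with generators (7,5)_8:
    (state, input) -> (next_state, BPSK outputs). The code choice is only an
    illustration; the method applies to any terminated trellis."""
    branches = {}
    for s in range(4):
        s1, s2 = (s >> 1) & 1, s & 1            # previous two input bits
        for u in (0, 1):
            c1 = u ^ s1 ^ s2                    # generator 7 = 111
            c2 = u ^ s2                         # generator 5 = 101
            ns = (u << 1) | s1                  # shift in the new bit
            branches[(s, u)] = (ns, (1 - 2 * c1, 1 - 2 * c2))
    return branches

def soft_outputs(r):
    """Max-log soft outputs (7.14) for a received array r of shape (N, 2),
    assuming the encoder starts and ends in the all-zero state."""
    br = trellis_75()
    N = len(r)
    def gamma(i, x):                            # branch correlation metric
        return r[i, 0] * x[0] + r[i, 1] * x[1]
    # Forward pass: best metric of any path from state 0 into state s at stage i.
    alpha = np.full((N + 1, 4), NEG)
    alpha[0, 0] = 0.0
    for i in range(N):
        for (s, u), (ns, x) in br.items():
            alpha[i + 1, ns] = max(alpha[i + 1, ns], alpha[i, s] + gamma(i, x))
    # Backward pass: best metric from state s at stage i back to state 0 at N.
    # As noted above, only the per-state metrics are stored, not the paths.
    beta = np.full((N + 1, 4), NEG)
    beta[N, 0] = 0.0
    for i in range(N - 1, -1, -1):
        for (s, u), (ns, x) in br.items():
            beta[i, s] = max(beta[i, s], gamma(i, x) + beta[i + 1, ns])
    # Soft outputs: M_i(1) and M_i(0) are the best total metrics over all
    # paths whose branch at stage i carries u_i = 1 and u_i = 0, respectively.
    llr = np.zeros(N)
    for i in range(N):
        M = {0: NEG, 1: NEG}
        for (s, u), (ns, x) in br.items():
            M[u] = max(M[u], alpha[i, s] + gamma(i, x) + beta[i + 1, ns])
        llr[i] = M[1] - M[0]                    # eq. (7.14)
    return llr

# Noiseless all-zero transmission: every LLR is negative (bits decided as 0).
r = np.tile([1.0, 1.0], (6, 1))
print(np.sign(soft_outputs(r)))
```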