# The Art of Error Correcting Coding - Morelos-Zaragoza R.H.

ISBN 0471-49581-6


In the soft-output stage of the SOVA algorithm, at stage $i$, the most likely information symbol $\hat{u}_i = a$, $a \in \{0,1\}$, is determined and the corresponding maximum metric (found in the forward pass of the VA) is set equal to $M_i(\hat{u}_i)$. The path metric of the best competitor, $M_i(\hat{u}_i \oplus 1)$, can be computed as [Vue],

$$ M_i(\hat{u}_i \oplus 1) = \min_{k_1,k_2} \left\{ M_{i-1}\!\left(S^{(k_1)}\right) + BM_i^{(k_1,k_2)}(\hat{u}_i \oplus 1) + M_b\!\left(S_i^{(k_2)}\right) \right\}, \qquad (7.15) $$

where $k_1, k_2 \in \{0, 1, 2, \ldots, 2^m - 1\}$,

- $M_{i-1}\!\left(S^{(k_1)}\right)$ is the path metric of the forward survivor at time $i-1$ and state $S^{(k_1)}$,

- $BM_i^{(k_1,k_2)}(\hat{u}_i \oplus 1)$ is the branch metric at time $i$ for the complement information bit, associated with a transition from state $S^{(k_1)}$ to $S^{(k_2)}$, and

- $M_b\!\left(S_i^{(k_2)}\right)$ is the backward survivor path metric at time $i$ and state $S^{(k_2)}$.

Finally, the soft output is computed as

$$ \Lambda(u_i) = M_i(1) - M_i(0). \qquad (7.16) $$
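As an illustration only (not the book's implementation), the competitor-metric search of (7.15) and the soft output of (7.16) can be sketched as follows. The tables `M_fwd` and `M_bwd`, the branch-metric function `bm` and the branch list `transitions` are hypothetical names for quantities the forward and backward Viterbi passes are assumed to have produced.

```python
def sova_soft_output(i, M_fwd, M_bwd, bm, transitions):
    """Soft output Lambda(u_i) = M_i(1) - M_i(0), as in (7.16).

    transitions: list of (s1, s2, u) branches of trellis section i.
    M_fwd[i][s]: forward survivor metric at time i, state s.
    M_bwd[i][s]: backward survivor metric at time i, state s.
    """
    best = {0: float("inf"), 1: float("inf")}
    # Per (7.15): minimize M_{i-1}(s1) + BM_i(u) + M_b(s2) over all
    # branches s1 -> s2 carrying input bit u.
    for s1, s2, u in transitions:
        metric = M_fwd[i - 1][s1] + bm(i, s1, s2, u) + M_bwd[i][s2]
        best[u] = min(best[u], metric)
    return best[1] - best[0]
```

For each trellis section, the minimum total metric is tracked separately for branches carrying $u_i = 0$ and $u_i = 1$; the bit whose minimum is smaller gives the hard decision, and the difference of the two minima is the reliability.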

Example 88 Let $C$ be a zero-tail $(12,4)$ code obtained from a memory-2, rate-1/2 convolutional code with generators $(7,5)$. The basic structure of the trellis diagram of this code is the same as in Example 86. Suppose that the information sequence (including the tail bits) is $\bar{u} = (110100)$, and that the received sequence, after binary transmission over an AWGN channel, is

$$ \bar{r} = (-4, -1, -1, -3, +2, -3, +3, +3, -3, +3, -3, +1). $$

Figure 77 Trellis diagram used in SOVA decoding for Example 88. (Figure not reproduced; it shows the transmitted symbols $-1,-1$; $+1,-1$; $+1,-1$; $+1,+1$; $-1,+1$; $-1,-1$ and the received values above each trellis section.)

Figure 77 shows the trellis diagram for this code. For $i = 0, 1, \ldots, 6$, each state has a label on top of it giving its forward and backward survivor metrics. The branches are labeled with the branch metrics $BM_i$. The soft outputs $\Lambda(u_i)$, for $i = 1, 2, \ldots, 6$, are $-34, -16, +22, -16, +24, +24$, respectively.

Implementation issues

In a SOVA decoder, the Viterbi algorithm needs to be executed twice. The forward processing is just as in a conventional Viterbi decoder, with the exception that the path metrics at each decoding stage need to be stored. The backward processing also uses the VA, but does not need to store the surviving paths, only the path metrics at each decoding stage. Note that the backward processing and the soft-output computation can be performed simultaneously, once the forward recursion has finished. Particular attention should be paid to the normalization of metrics at each decoding stage, in both directions. Other implementation issues are the same as in a Viterbi decoder, discussed in Sections 5.4.3 and 5.5.1.
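The forward pass with per-stage metric storage and normalization described above can be sketched as follows. This is a minimal illustration with hypothetical names, assuming a time-invariant trellis given as a list of `(from_state, to_state, input_bit)` branches; the backward pass is the same recursion run on the time-reversed trellis.

```python
def forward_metrics(n_stages, n_states, bm, transitions):
    """Forward Viterbi recursion; returns the full per-stage metric table.

    bm(i, s1, s2, u): branch metric of the branch s1 -> s2 with input u
    at decoding stage i (a hypothetical interface).
    """
    INF = float("inf")
    M = [[INF] * n_states for _ in range(n_stages + 1)]
    M[0][0] = 0.0  # trellis starts in the all-zero state
    for i in range(1, n_stages + 1):
        # add-compare-select: keep the best metric entering each state
        for s1, s2, u in transitions:
            cand = M[i - 1][s1] + bm(i, s1, s2, u)
            M[i][s2] = min(M[i][s2], cand)
        # normalization: subtract the stage minimum so metrics stay bounded
        m_min = min(M[i])
        M[i] = [x - m_min for x in M[i]]
    return M
```

Every stage of the table `M` is kept (rather than only the latest stage, as in a conventional Viterbi decoder) because the soft-output stage needs it.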

The SOVA decoder can also be implemented as a sliding window decoder, like the conventional Viterbi decoder. At the cost of increased computation time, the decoder then operates continuously, not on a block-by-block basis, without forcing the state of the encoder to return to the all-zero state periodically. The idea is the same as that used in the VD with traceback memory, discussed in Section 5.4.3, where forward recursion, traceback and backward recursion, and soft-output computations are implemented in several memory blocks (see also [Vit3]).

7.8.2 Maximum-a-posteriori (MAP) algorithm

The BCJR algorithm [BCJR] is an optimal symbol-by-symbol MAP decoding method for linear block codes that minimizes the probability of a symbol error. The goal of this MAP decoder is to examine the received sequence $\bar{r}$ and to compute the a-posteriori probabilities of the input information bits, as in (7.12). The MAP algorithm is described next, following closely the arguments in [BCJR].

The state transitions (or branches) in the trellis have probabilities

$$ p_i(m|m') = \Pr\left\{ S_i^{(m)} \,\middle|\, S_{i-1}^{(m')} \right\}, \qquad (7.17) $$

and for the output symbols $\bar{v}_i$,

$$ q_i(x|m',m) = \Pr\left\{ x_i = x \,\middle|\, S_{i-1}^{(m')}, S_i^{(m)} \right\}, \qquad (7.18) $$

where $x = \pm 1$, and $x_i = m(v_i) = (-1)^{v_i}$, $0 \le i < N$. The sequence $\bar{x}$ is transmitted over an AWGN channel and received as a sequence $\bar{r}$, with transition probabilities

$$ \Pr\{\bar{r}\,|\,\bar{x}\} = \prod_{i=1}^{N} \prod_{j=0}^{n-1} p(r_{ij}|x_{ij}), \qquad (7.19) $$

where $p(r_{ij}|x_{ij})$ is given by (7.1).
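Assuming the per-symbol density in (7.1) is the usual AWGN form $p(r|x) = (1/\sqrt{\pi N_0})\,e^{-(r-x)^2/N_0}$, the sequence likelihood (7.19) is just a product of per-symbol densities. A small sketch with illustrative names:

```python
import math

def p_symbol(r, x, N0):
    # Per-symbol AWGN density, the form assumed here for (7.1).
    return math.exp(-(r - x) ** 2 / N0) / math.sqrt(math.pi * N0)

def likelihood(r_seq, x_seq, N0):
    """Pr{r|x} per (7.19); r_seq and x_seq are the flattened N*n symbols."""
    out = 1.0
    for r, x in zip(r_seq, x_seq):
        out *= p_symbol(r, x, N0)
    return out
```

In a practical decoder this product underflows quickly, which is why implementations work with logarithms or normalize the forward and backward metrics at each stage.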

Let $B_i^j$ be the set of branches connecting state $S_{i-1}^{(m')}$ to state $S_i^{(m)}$ such that the associated information bit $u_i = j$, with $j \in \{0,1\}$. Then

$$ \Pr\{u_i = j, \bar{r}\} = \sum_{(m',m) \in B_i^j} \sigma_i(m',m). \qquad (7.20) $$

The value of $\sigma_i(m',m)$ in (7.20) is equal to

$$ \sigma_i(m',m) = \alpha_{i-1}(m') \cdot \gamma_i^{(j)}(m',m) \cdot \beta_i(m), \qquad (7.21) $$

where the joint probability $\alpha_i(m) = \Pr\left\{S_i^{(m)}, \bar{r}_1^i\right\}$ is given recursively by

$$ \alpha_i(m) = \sum_{m'} \sum_{j=0}^{1} \gamma_i^{(j)}(m',m)\, \alpha_{i-1}(m'), \qquad (7.22) $$

and is referred to as the forward metric.

The conditional probability $\gamma_i^{(j)}(m',m) = \Pr\left\{S_i^{(m)}, \bar{r}_i \,\middle|\, S_{i-1}^{(m')}\right\}$ is given by

$$ \gamma_i^{(j)}(m',m) = \sum_{x} p_i(m|m') \Pr\left\{x_i = x \,\middle|\, S_{i-1}^{(m')}, S_i^{(m)}\right\} \Pr\{\bar{r}_i\,|\,x\}, \qquad (7.23) $$

where $p_i(m|m') = \Pr\left\{S_i^{(m)} \,\middle|\, S_{i-1}^{(m')}\right\}$, which for the AWGN channel can be put in the form

$$ \gamma_i^{(j)}(m',m) = \Pr\{u_i = j\} \cdot \delta_{ij}(m',m) \cdot \left(\frac{1}{\sqrt{\pi N_0}}\right)^{\!n} \exp\left( -\frac{1}{N_0} \sum_{q=1}^{n} \left(r_{iq} - x_{iq}\right)^2 \right), \qquad (7.24) $$

where $\delta_{ij}(m',m) = 1$ if $(m',m) \in B_i^j$ and $\delta_{ij}(m',m) = 0$ otherwise. $\gamma_i^{(j)}(m',m)$ is referred to as the branch metric.

The conditional probability $\beta_i(m) = \Pr\left\{\bar{r}_{i+1}^N \,\middle|\, S_i^{(m)}\right\}$ is given by

$$ \beta_i(m) = \sum_{m'} \sum_{j=0}^{1} \gamma_{i+1}^{(j)}(m,m')\, \beta_{i+1}(m'), \qquad (7.25) $$

and is referred to as the backward metric.

Combining (7.25), (7.24), (7.22), (7.21) and (7.12), the soft output (LLR) of information bit $u_i$ is given by

$$ \Lambda(u_i) = \log \frac{\Pr\{u_i = 1, \bar{r}\}}{\Pr\{u_i = 0, \bar{r}\}} = \log \frac{\displaystyle\sum_{m} \sum_{m'} \alpha_{i-1}(m')\, \gamma_i^{(1)}(m',m)\, \beta_i(m)}{\displaystyle\sum_{m} \sum_{m'} \alpha_{i-1}(m')\, \gamma_i^{(0)}(m',m)\, \beta_i(m)}, \qquad (7.26) $$

with the hard-decision output given by $\hat{u}_i = \operatorname{sgn}(\Lambda(u_i))$ and the reliability of the bit $u_i$ given by $|\Lambda(u_i)|$. The above equations can be interpreted as follows. A bidirectional Viterbi-like algorithm can be applied, just as in SOVA decoding (previous section). In the forward recursion, given the probability of a state transition at time $i$, the joint probability of the received sequence up to time $i$ and the state at time $i$ is evaluated. In the backward recursion, the probability of the received sequence from time $i+1$ to time $N$, given the state at time $i$, is computed. The soft output then depends on the joint probability of the state transition and the received symbols at time $i$.
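A minimal sketch of the recursions (7.22) and (7.25) and the soft output (7.26), assuming equally likely information bits and a zero-tailed, time-invariant trellis described by a list of `(from_state, to_state, input_bit, branch_symbols)` entries. All names are hypothetical, and the Gaussian constants of (7.24) that are common to all branches are dropped, which leaves the LLR unchanged.

```python
import math

def bcjr_llr(r, transitions, n_states, n, N0):
    """LLRs Lambda(u_i) per (7.26) for a received sequence r of N*n values."""
    N = len(r) // n

    def gamma(i, x):
        # branch metric (7.24) with Pr{u_i = j} = 1/2; expanding the
        # Gaussian exponent and dropping terms common to all branches
        # leaves the correlation term exp(2 * sum(r*x) / N0)
        corr = sum(r[i * n + q] * x[q] for q in range(n))
        return 0.5 * math.exp(2.0 * corr / N0)

    # forward recursion (7.22): alpha[i][m] ~ Pr{S_i^(m), r_1^i}
    alpha = [[0.0] * n_states for _ in range(N + 1)]
    alpha[0][0] = 1.0  # trellis starts in the all-zero state
    for i in range(N):
        for m1, m2, j, x in transitions:
            alpha[i + 1][m2] += alpha[i][m1] * gamma(i, x)

    # backward recursion (7.25): beta[i][m] ~ Pr{r_{i+1}^N | S_i^(m)}
    beta = [[0.0] * n_states for _ in range(N + 1)]
    beta[N][0] = 1.0  # zero-tailed trellis ends in the all-zero state
    for i in range(N - 1, -1, -1):
        for m1, m2, j, x in transitions:
            beta[i][m1] += gamma(i, x) * beta[i + 1][m2]

    # soft outputs (7.26): sum sigma_i over branches with u_i = 1 and u_i = 0
    llrs = []
    for i in range(N):
        num = den = 0.0
        for m1, m2, j, x in transitions:
            s = alpha[i][m1] * gamma(i, x) * beta[i + 1][m2]
            if j == 1:
                num += s
            else:
                den += s
        llrs.append(math.log(num / den))
    return llrs
```

The sketch ignores two practical points discussed above: a real zero-tail decoder restricts the last $m$ trellis sections to input 0, and the $\alpha$ and $\beta$ tables must be normalized at each stage to avoid numerical underflow.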
