
# Detection, Estimation, and Modulation Theory, Part I - Van Trees H. L.

Van Trees H. L. Detection, Estimation, and Modulation Theory, Part I. - Wiley & Sons, 2001. - 710 p.
ISBN 0-471-09517-6

$$
r(t) = \sqrt{E}\, s(t) + w(t), \qquad 0 \le t \le T : H_1, \tag{4}
$$
$$
r(t) = w(t), \qquad 0 \le t \le T : H_0. \tag{5}
$$

For convenience we assume that $s(t)$ has unit energy, $\int_0^T s^2(t)\,dt = 1$.

$$
r(t) = \operatorname*{l.i.m.}_{K \to \infty} \sum_{i=1}^{K} r_i \phi_i(t), \qquad 0 \le t \le T. \tag{6}
$$
When K = K', there are K' coefficients in the series, r_1, ..., r_{K'}, which we could denote by the vector r_{K'}. In our subsequent discussion we suppress the K' subscript and denote the coefficients by r.
248 4.2 Detection and Estimation in White Gaussian Noise
Fig. 4.10 Generation of sufficient statistics.
3. In Chapter 2 we saw that if we transformed r into two independent vectors, l (the sufficient statistic) and y, as shown in Fig. 4.10, our decision could be based only on l, because the values of y did not depend on the hypothesis. The advantage of this technique was that it reduced the dimension of the decision space to that of l. Because this is a binary problem we know that l will be one-dimensional.
Here the method is straightforward. If we choose the first orthonormal function to be s(t), the first coefficient in the decomposition is the Gaussian random variable,
$$
r_1 = \int_0^T s(t)\, r(t)\, dt =
\begin{cases}
\displaystyle\int_0^T s(t)\left[\sqrt{E}\, s(t) + w(t)\right] dt = \sqrt{E} + w_1 : H_1, \\[6pt]
\displaystyle\int_0^T s(t)\, w(t)\, dt \triangleq w_1 : H_0.
\end{cases} \tag{7}
$$
The remaining r_i (i > 1) are Gaussian random variables that can be generated by using some arbitrary orthonormal set whose members are orthogonal to s(t):
$$
r_i = \int_0^T \phi_i(t)\, r(t)\, dt =
\begin{cases}
\displaystyle\int_0^T \phi_i(t)\left[\sqrt{E}\, s(t) + w(t)\right] dt = w_i : H_1, \\[6pt]
\displaystyle\int_0^T \phi_i(t)\, w(t)\, dt \triangleq w_i : H_0.
\end{cases} \tag{8}
$$
From Chapter 3 (44) we know that
$$
E(w_i w_j) = 0, \qquad i \neq j.
$$
Because w_i and w_j are jointly Gaussian, they are statistically independent (see Property 3 on p. 184).
We see that only r_1 depends on which hypothesis is true. Further, all r_i (i > 1) are statistically independent of r_1. Thus r_1 is a sufficient statistic (r_1 = l). The other r_i correspond to y. Because they will not affect the decision, there is no need to compute them.
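As a quick numerical illustration (a sketch, not from the text; the grid size, the value of N0, and the particular sine basis are illustrative assumptions), one can verify that only the first coefficient depends on the hypothesis:

```python
import numpy as np

# Sketch: simulate r(t) under H1 and H0 on a discrete grid and project
# onto an orthonormal set whose first member is s(t). All parameter
# values here (T, n_grid, N0, E) are illustrative, not from the book.
rng = np.random.default_rng(0)
T, n_grid, N0, E = 1.0, 1000, 0.1, 1.0
dt = T / n_grid
t = np.arange(n_grid) * dt

# Unit-energy signal: the integral of s^2 over [0, T] equals 1.
s = np.sqrt(2.0 / T) * np.sin(2 * np.pi * t / T)

# A second orthonormal function, orthogonal to s(t) on [0, T].
phi2 = np.sqrt(2.0 / T) * np.sin(4 * np.pi * t / T)

def coefficients(hypothesis_h1, trials=5000):
    """Return (r1, ri) samples: projections of r(t) onto s and phi2."""
    mean = np.sqrt(E) * s if hypothesis_h1 else np.zeros_like(s)
    # White noise of spectral height N0/2 -> variance (N0/2)/dt per sample.
    w = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=(trials, n_grid))
    r = mean + w
    r1 = r @ s * dt      # correlate with s(t)
    ri = r @ phi2 * dt   # correlate with the orthogonal function
    return r1, ri

r1_h1, ri_h1 = coefficients(True)
r1_h0, ri_h0 = coefficients(False)
print(r1_h1.mean(), r1_h0.mean())  # r1: mean ~ sqrt(E) under H1, ~ 0 under H0
print(ri_h1.mean(), ri_h0.mean())  # ri: ~ 0 under both hypotheses
```

The sample means show r_1 shifting by sqrt(E) between hypotheses, while the coefficient along the orthogonal function is hypothesis-independent, matching (7) and (8).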
Detection of Signals in Additive White Gaussian Noise: Simple Binary 249
[Fig. 4.11 Correlation receiver: the input r(t) is correlated with s(t) and the result compared with a threshold.]
Several equivalent receiver structures follow immediately. The structure in Fig. 4.11 is called a correlation receiver. It correlates the input r(t) with a stored replica of the signal s(t). The output is r_1, which is a sufficient statistic (r_1 = l) and is a Gaussian random variable. Once we have obtained r_1, the decision problem is identical to the classical problem in Chapter 2 (specifically, Example 1 on pp. 27-28). We compare l with a threshold in order to make a decision.
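A minimal discrete-time sketch of such a correlation receiver (the function name, grid parameters, and threshold value below are illustrative assumptions, not from the text):

```python
import numpy as np

def correlation_receiver(r, s, dt, gamma):
    """Correlate r(t) with a stored replica of s(t) over [0, T] and
    compare the output l = r1 with a threshold gamma."""
    l = np.sum(r * s) * dt   # sufficient statistic r1 = l
    return ("H1" if l > gamma else "H0"), l

# Example: unit-energy signal with sqrt(E) = 1 (illustrative parameters).
n_grid, T = 1000, 1.0
dt = T / n_grid
t = np.arange(n_grid) * dt
s = np.sqrt(2.0 / T) * np.sin(2 * np.pi * t / T)

rng = np.random.default_rng(1)
N0 = 0.05
noise = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), n_grid)
decision, l = correlation_receiver(s + noise, s, dt, gamma=0.5)
print(decision, l)
```

Under H1 the statistic l is Gaussian with mean sqrt(E) and variance N0/2, so with these parameters the receiver decides H1 with high probability.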
An equivalent realization is shown in Fig. 4.12. The impulse response of the linear system is simply the signal reversed in time and shifted,

$$
h(\tau) =
\begin{cases}
s(T - \tau), & 0 \le \tau \le T, \\
0, & \text{elsewhere}.
\end{cases} \tag{9}
$$

The output at time T is the desired statistic l. This receiver is called a matched filter receiver. (It was first derived by North [20].) The two structures are mathematically identical; the choice of which structure to use depends solely on ease of realization.

Just as in Example 1 of Chapter 2, the sufficient statistic l is Gaussian under either hypothesis. Its mean and variance follow easily:

$$
E(l \mid H_1) = E(r_1 \mid H_1) = \sqrt{E}, \qquad E(l \mid H_0) = E(r_1 \mid H_0) = 0, \tag{10}
$$
$$
\operatorname{Var}(l \mid H_0) = \operatorname{Var}(l \mid H_1) = \frac{N_0}{2}.
$$

Thus we can use the results of Chapter 2, (64)-(68), with

$$
d = \frac{E(l \mid H_1) - E(l \mid H_0)}{\sqrt{\operatorname{Var}(l \mid H_0)}} = \sqrt{\frac{2E}{N_0}}. \tag{11}
$$
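The equivalence of the correlator and matched filter structures is easy to check numerically. A sketch (the grid parameters and signal choice are illustrative assumptions, not from the text):

```python
import numpy as np

# Sketch: the matched filter h(tau) = s(T - tau), with its output sampled
# at t = T, yields the same statistic as the correlator.
n_grid, T = 1000, 1.0
dt = T / n_grid
t = np.arange(n_grid) * dt
s = np.sqrt(2.0 / T) * np.sin(2 * np.pi * t / T)  # unit-energy signal

rng = np.random.default_rng(2)
r = s + rng.normal(0.0, 1.0, n_grid)   # received waveform under H1

l_corr = np.sum(r * s) * dt                 # correlation receiver
h = s[::-1]                                 # h(tau) = s(T - tau)
l_mf = np.convolve(r, h)[n_grid - 1] * dt   # filter output sampled at t = T
print(np.isclose(l_corr, l_mf))             # True: the structures agree
```

Sampling the convolution at t = T reverses the time reversal in h, so the filter computes exactly the correlation of r(t) with s(t); the two outputs differ only by floating-point roundoff.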
[Fig. 4.12 Matched filter receiver: the input r(t) is passed through a linear system with impulse response h(τ) = s(T - τ), 0 ≤ τ ≤ T (zero elsewhere); the output is observed at time t = T and compared with a threshold.]