
# Detection, Estimation, and Modulation Theory, Part I - Van Trees H. L.

Van Trees H. L. Detection, Estimation, and Modulation Theory, Part I - Wiley & Sons, 2001. - 710 p.
ISBN 0-471-09517-6

$$\lambda_i^c \phi_i(t) = \int_{T_i}^{T_f} K_c(t, u)\,\phi_i(u)\,du, \qquad T_i \le t \le T_f. \tag{172}$$

Observe that the $\lambda_i^c$ are the eigenvalues of the colored noise process only. (If $K_c(t, u)$ is not positive-definite, we augment the set to make it complete.) Then we expand $r(t)$ in this coordinate system:

$$r(t) = \operatorname*{l.i.m.}_{K\to\infty} \sum_{i=1}^{K} r_i\,\phi_i(t) = \operatorname*{l.i.m.}_{K\to\infty} \sum_{i=1}^{K} s_i\,\phi_i(t) + \operatorname*{l.i.m.}_{K\to\infty} \sum_{i=1}^{K} n_i\,\phi_i(t), \qquad T_i \le t \le T_f, \tag{173}$$

where

$$r_i = \int_{T_i}^{T_f} r(t)\,\phi_i(t)\,dt, \tag{174}$$

$$s_i = \int_{T_i}^{T_f} \sqrt{E}\,s(t)\,\phi_i(t)\,dt, \tag{175}$$

and

$$n_i = \int_{T_i}^{T_f} n(t)\,\phi_i(t)\,dt. \tag{176}$$

From (3.42) we know

$$E(n_i) = 0, \qquad E\{n_i n_j\} = \lambda_i\,\delta_{ij}, \tag{177}$$

where

$$\lambda_i \triangleq \lambda_i^c + \frac{N_0}{2}. \tag{178}$$
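The coordinate system of (172)-(178) can be illustrated numerically by discretizing the observation interval, which turns the eigenfunction integral equation into an ordinary symmetric matrix eigenproblem. The sketch below is not from the text: the exponential kernel, the value of $N_0$, and all grid parameters are illustrative assumptions.

```python
import numpy as np

# Discretize [Ti, Tf]; all values below are illustrative assumptions.
Ti, Tf, N = 0.0, 1.0, 200
t = np.linspace(Ti, Tf, N)
dt = (Tf - Ti) / (N - 1)

# Assumed colored-noise kernel K_c(t, u); exponential covariance for concreteness.
Kc = np.exp(-np.abs(t[:, None] - t[None, :]))
N0 = 0.5  # assumed white-noise spectral height

# Eigenvalues/eigenfunctions of the colored kernel alone, as in (172):
# the trapezoidal discretization of the integral operator is Kc * dt.
lam_c, phi = np.linalg.eigh(Kc * dt)
phi = phi / np.sqrt(dt)  # rescale so that  integral phi_i(t)^2 dt = 1

# Total eigenvalues, as in (178): lambda_i = lambda_i^c + N0/2.
lam = lam_c + N0 / 2

# Expansion coefficients, as in (174): r_i = integral r(t) phi_i(t) dt.
rng = np.random.default_rng(0)
r = rng.standard_normal(N)  # stand-in observed waveform
ri = phi.T @ r * dt
```

The rescaling of `phi` makes the discrete eigenvectors orthonormal in the $\int \phi_i \phi_j\,dt = \delta_{ij}$ sense, so the coefficients `ri` play the role of the $r_i$ in (174).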
Just as on p. 252 (20), we consider the first $K$ coordinates. The likelihood ratio is

$$\Lambda[r(t)] = \frac{\displaystyle\prod_{i=1}^{K} (2\pi\lambda_i)^{-1/2} \exp\!\left[-\frac{(r_i - s_i)^2}{2\lambda_i}\right]}{\displaystyle\prod_{i=1}^{K} (2\pi\lambda_i)^{-1/2} \exp\!\left[-\frac{r_i^2}{2\lambda_i}\right]}. \tag{179}$$
Canceling common terms, letting $K \to \infty$, and taking the logarithm, we obtain

$$\ln \Lambda[r(t)] = \sum_{i=1}^{\infty} \frac{r_i\,s_i}{\lambda_i} - \frac{1}{2} \sum_{i=1}^{\infty} \frac{s_i^2}{\lambda_i}. \tag{180}$$
Using (174) and (175), we have

$$\ln \Lambda[r(t)] = \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, r(t) \left[\sum_{i=1}^{\infty} \frac{\phi_i(t)\,\phi_i(u)}{\lambda_i}\right] \sqrt{E}\,s(u) - \frac{E}{2} \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, s(t) \left[\sum_{i=1}^{\infty} \frac{\phi_i(t)\,\phi_i(u)}{\lambda_i}\right] s(u). \tag{181}$$
From (166) we recognize the sum as Qn(t, u). Thus
$$\ln \Lambda[r(t)] = \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, r(t)\, Q_n(t, u)\, \sqrt{E}\,s(u) - \frac{E}{2} \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, s(t)\, Q_n(t, u)\, s(u). \tag{182}\text{†}$$
This expression is identical to (153).
Observe that if we had not gone through the whitening approach we would have simply defined Qn(t, u) to fit our needs when we arrived at this point in the derivation. When we consider more general detection problems later in the text (specifically Chapter II.3), the direct derivation can easily be extended.
4.3.3 A Direct Derivation with a Sufficient Statistic†
For convenience we rewrite the detection problem of interest (140):
$$r(t) = \sqrt{E}\,s(t) + n(t), \qquad T_i \le t \le T_f : H_1,$$
$$r(t) = n(t), \qquad T_i \le t \le T_f : H_0. \tag{183}$$
In this section we will not require that the noise contain a white component.
From our work in Chapter 2 and Section 4.2 we know that if we can write
$$r(t) = r_1\,s(t) + y(t), \qquad T_i \le t \le T_f, \tag{184}$$
† To proceed rigorously from (181) to (182) we require $\sum_{i=1}^{\infty} (s_i^2/\lambda_i^2) < \infty$ (Grenander [30]; Kelly, Reed, and Root [31]). This is always true when white noise is present. Later, when we look at the effect of removing the white noise assumption, we shall see that the divergence of this series leads to an unstable test.

† This particular approach to the colored noise problem seems to have been developed independently by several people (Kailath [32]; Yudkin [39]). Although the two derivations are essentially the same, we follow the second.
where $r_1$ is a random variable obtained by operating on $r(t)$, and demonstrate that:

(a) $r_1$ and $y(t)$ are statistically independent on both hypotheses,

(b) the statistics of $y(t)$ do not depend on which hypothesis is true,

then $r_1$ is a sufficient statistic. We can then base our decision solely on $r_1$ and disregard $y(t)$. [Note that conditions (a) and (b) are sufficient, but not necessary, for $r_1$ to be a sufficient statistic (see pp. 35-36).]
To do this we hypothesize that $r_1$ can be obtained by the operation

$$r_1 = \int_{T_i}^{T_f} r(u)\,g(u)\,du \tag{185}$$
and try to find a g(u) that will lead to the desired properties. Using (185), we can rewrite (184) as
r(0 = Ñ?! + èÎ s(t) + y(t) :H1
= n1s(t)+y(t) :H0. (186)
where
$$\gamma_1 \triangleq \int_{T_i}^{T_f} \sqrt{E}\,s(u)\,g(u)\,du \tag{187}$$
and
$$n_1 \triangleq \int_{T_i}^{T_f} n(u)\,g(u)\,du. \tag{188}$$
Because a sufficient statistic can be multiplied by any nonzero constant and remain a sufficient statistic, we can introduce a constraint,

$$\int_{T_i}^{T_f} s(u)\,g(u)\,du = 1. \tag{189a}$$
Using (189a) in (187), we have
$$\gamma_1 = \sqrt{E}. \tag{189b}$$
Clearly, $n_1$ is a zero-mean random variable and

$$n(t) = n_1\,s(t) + y(t), \qquad T_i \le t \le T_f. \tag{190}$$
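Conditions (a) and (b) can be verified exactly in the simplest special case, white noise, where the choice $g(u) = s(u)$ satisfies the constraint (189a) and makes the cross-covariance between $n_1$ and the remainder $y(t) = n(t) - n_1 s(t)$ of (190) vanish identically. For colored noise the appropriate $g$ is the subject of the derivation that follows; the sketch below (all parameter values assumed) covers only the white case, computing the cross-covariance exactly from the discretized covariance matrix rather than by simulation.

```python
import numpy as np

# Illustrative white-noise check of conditions (a)-(b); values are assumptions.
N = 100
dt = 0.01
t = np.arange(N) * dt
s = np.sin(2 * np.pi * t)
s = s / np.sqrt(np.sum(s**2) * dt)  # unit energy: integral s(u)^2 du = 1
g = s.copy()                         # for white noise, g = s satisfies (189a)

sigma2 = 0.5
K = sigma2 / dt * np.eye(N)          # discretized white-noise covariance

# n_1 = integral n(u) g(u) du -> g^T n dt;  y = n - n_1 s = (I - s g^T dt) n.
# Cross-covariance E[n_1 y(t)] = (I - s g^T dt) K g dt, computed exactly:
cross = (np.eye(N) - np.outer(s, g) * dt) @ (K @ g * dt)
```

The vector `cross` is zero to machine precision, so $n_1$ and $y(t)$ are uncorrelated and, being jointly Gaussian, independent.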