
\lambda_i^c \phi_i(t) = \int_{T_i}^{T_f} K_c(t, u)\, \phi_i(u)\, du, \qquad T_i \le t \le T_f.    (172)
Observe that the \lambda_i^c are the eigenvalues of the colored noise process only. (If K_c(t, u) is not positive-definite, we augment the set to make it complete.) Then we expand r(t) in this coordinate system:
r(t) = \text{l.i.m.}_{K\to\infty} \sum_{i=1}^{K} r_i\, \phi_i(t) = \text{l.i.m.}_{K\to\infty} \sum_{i=1}^{K} s_i\, \phi_i(t) + \text{l.i.m.}_{K\to\infty} \sum_{i=1}^{K} n_i\, \phi_i(t), \qquad T_i \le t \le T_f,    (173)
where

r_i = \int_{T_i}^{T_f} r(t)\, \phi_i(t)\, dt,    (174)

s_i = \int_{T_i}^{T_f} \sqrt{E}\, s(t)\, \phi_i(t)\, dt,    (175)

and

n_i = \int_{T_i}^{T_f} n(t)\, \phi_i(t)\, dt.    (176)

From (3.42) we know that

E(n_i) = 0,    (177)

E(n_i n_j) = \delta_{ij}\left(\lambda_i^c + \frac{N_0}{2}\right).    (178)
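As a numerical illustration of (172) and (174)-(178), the eigenvalue problem can be discretized and the coefficient statistics checked by Monte Carlo. This is a minimal sketch, not from the text: the exponential covariance chosen for K_c(t, u), the grid, N_0, and the trial count are all arbitrary assumptions.

# Minimal numerical sketch (assumed covariance and parameters, not from the text).
# Discretize (172) for a colored-noise covariance K_c(t, u), build the expansion
# coefficients n_i of white-plus-colored noise as in (176), and check (177)-(178):
# E(n_i) = 0 and Var(n_i) = N_0/2 + lambda_i^c.
import numpy as np

rng = np.random.default_rng(0)
Ti, Tf, M = 0.0, 1.0, 200                  # observation interval and grid size
t = np.linspace(Ti, Tf, M)
dt = t[1] - t[0]
N0 = 2.0                                   # white-noise level: N_0/2 = 1

# Assumed colored component: exponential (Ornstein-Uhlenbeck) covariance.
Kc = np.exp(-3.0 * np.abs(t[:, None] - t[None, :]))

# Discretized form of (172): (K_c dt) v_i = lambda_i^c v_i.
lam_c, V = np.linalg.eigh(Kc * dt)
idx = np.argsort(lam_c)[::-1]              # sort eigenvalues in decreasing order
lam_c, V = lam_c[idx], V[:, idx]
phi = V / np.sqrt(dt)                      # orthonormal eigenfunctions on the grid

# Monte Carlo: n(t) = w(t) + n_c(t); then n_i = integral of n(t) phi_i(t) dt, as in (176).
trials = 20000
L = np.linalg.cholesky(Kc + 1e-10 * np.eye(M))
nc = rng.standard_normal((trials, M)) @ L.T                     # colored component
w = rng.standard_normal((trials, M)) * np.sqrt((N0 / 2) / dt)   # white component
n_i = (nc + w) @ phi * dt                                       # coefficients, one row per trial

print("empirical variances :", np.round(n_i[:, :5].var(axis=0), 2))
print("N_0/2 + lambda_i^c  :", np.round(N0 / 2 + lam_c[:5], 2))

The printed empirical variances agree with N_0/2 + \lambda_i^c to within Monte Carlo error, as (178) predicts.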
Just as on p. 252 (20), we consider the first K coordinates. The likelihood ratio is

\Lambda_K = \prod_{i=1}^{K} \frac{\left[2\pi\left(\frac{N_0}{2} + \lambda_i^c\right)\right]^{-1/2} \exp\left[-\dfrac{(r_i - s_i)^2}{2\left(\frac{N_0}{2} + \lambda_i^c\right)}\right]}{\left[2\pi\left(\frac{N_0}{2} + \lambda_i^c\right)\right]^{-1/2} \exp\left[-\dfrac{r_i^2}{2\left(\frac{N_0}{2} + \lambda_i^c\right)}\right]}.    (179)
Canceling common terms, letting K \to \infty, and taking the logarithm, we obtain

\ln \Lambda[r(t)] = \sum_{i=1}^{\infty} \frac{r_i s_i}{N_0/2 + \lambda_i^c} - \frac{1}{2} \sum_{i=1}^{\infty} \frac{s_i^2}{N_0/2 + \lambda_i^c}.    (180)
Using (174) and (175), we have
\ln \Lambda[r(t)] = \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, r(t) \left[ \sum_{i=1}^{\infty} \frac{\phi_i(t)\, \phi_i(u)}{N_0/2 + \lambda_i^c} \right] \sqrt{E}\, s(u)
    - \frac{E}{2} \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, s(t) \left[ \sum_{i=1}^{\infty} \frac{\phi_i(t)\, \phi_i(u)}{N_0/2 + \lambda_i^c} \right] s(u).    (181)
From (166) we recognize the sum as Qn(t, u). Thus
\ln \Lambda[r(t)] = \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, r(t)\, Q_n(t, u)\, \sqrt{E}\, s(u)
    - \frac{E}{2} \int_{T_i}^{T_f} dt \int_{T_i}^{T_f} du\, s(t)\, Q_n(t, u)\, s(u).    (182)†
This expression is identical to (153).
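The equivalence of the coordinate form (180) and the kernel form (182) can also be seen numerically. The sketch below uses the same kind of arbitrary assumptions as before (an exponential K_c, a sine-pulse s(t), and specific values of N_0 and E, none of which come from the text) and evaluates both expressions on a discrete grid for one realization of r(t); they agree to rounding error.

# Minimal sketch (assumed covariance, signal, and parameters, not from the text).
# Compare the series form (180) with the kernel form (182) on a discrete grid,
# with Q_n(t, u) = sum_i phi_i(t) phi_i(u) / (N_0/2 + lambda_i^c) as in (166).
import numpy as np

rng = np.random.default_rng(1)
M = 200
t = np.linspace(0.0, 1.0, M)
dt = t[1] - t[0]
N0, E = 2.0, 4.0

Kc = np.exp(-3.0 * np.abs(t[:, None] - t[None, :]))   # assumed colored covariance
lam_c, V = np.linalg.eigh(Kc * dt)                     # discretized (172)
phi = V / np.sqrt(dt)

s = np.sqrt(2.0) * np.sin(np.pi * t)                   # unit-energy signal on [0, 1]
n = rng.standard_normal(M) * np.sqrt((N0 / 2) / dt) \
    + np.linalg.cholesky(Kc + 1e-10 * np.eye(M)) @ rng.standard_normal(M)
r = np.sqrt(E) * s + n                                 # one realization under H_1

# Series form (180), using the coefficients defined in (174)-(175).
r_i = phi.T @ r * dt
s_i = phi.T @ (np.sqrt(E) * s) * dt
lnL_series = np.sum(r_i * s_i / (N0 / 2 + lam_c)) - 0.5 * np.sum(s_i**2 / (N0 / 2 + lam_c))

# Kernel form (182), building Q_n(t, u) explicitly from the eigenfunctions.
Qn = phi @ np.diag(1.0 / (N0 / 2 + lam_c)) @ phi.T
lnL_kernel = np.sqrt(E) * (r @ Qn @ s) * dt**2 - 0.5 * E * (s @ Qn @ s) * dt**2

print(lnL_series, lnL_kernel)                          # identical up to rounding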
Observe that if we had not gone through the whitening approach we would have simply defined Qn(t, u) to fit our needs when we arrived at this point in the derivation. When we consider more general detection problems later in the text (specifically Chapter II.3), the direct derivation can easily be extended.
4.3.3 A Direct Derivation with a Sufficient Statistic‡
For convenience we rewrite the detection problem of interest (140):
r(t) = \sqrt{E}\, s(t) + n(t), \qquad T_i \le t \le T_f : H_1
     = n(t), \qquad\qquad\quad\;\; T_i \le t \le T_f : H_0.    (183)
In this section we will not require that the noise contain a white component.
From our work in Chapter 2 and Section 4.2 we know that if we can write
r(t) = r_1\, s(t) + y(t), \qquad T_i \le t \le T_f,    (184)
† To proceed rigorously from (181) to (182) we require \sum_{i=1}^{\infty} s_i^2 / (N_0/2 + \lambda_i^c)^2 < \infty (Grenander [30]; Kelly, Reed, and Root [31]). This is always true when white noise is present. Later, when we look at the effect of removing the white noise assumption, we shall see that the divergence of this series leads to an unstable test.
‡ This particular approach to the colored noise problem seems to have been developed independently by several people (Kailath [32]; Yudkin [39]). Although the two derivations are essentially the same, we follow the second.
where r_1 is a random variable obtained by operating on r(t), and demonstrate that:
(a) r_1 and y(t) are statistically independent on both hypotheses,
(b) the statistics of y(t) do not depend on which hypothesis is true,
then r_1 is a sufficient statistic. We can then base our decision solely on r_1 and disregard y(t). [Note that conditions (a) and (b) are sufficient, but not necessary, for r_1 to be a sufficient statistic (see pp. 35-36).]
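To spell out the reasoning behind this claim (a short sketch of the standard factorization argument, not a quotation from the text): if (a) holds, the joint density of r_1 and y(t) factors, and if (b) holds, the y(t) factor is the same under both hypotheses and cancels in the likelihood ratio,

\Lambda = \frac{p_{r_1, y(t)|H_1}}{p_{r_1, y(t)|H_0}} = \frac{p_{r_1|H_1}(R_1)\, p_{y(t)}}{p_{r_1|H_0}(R_1)\, p_{y(t)}} = \frac{p_{r_1|H_1}(R_1)}{p_{r_1|H_0}(R_1)},

which depends on the observation only through r_1.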
To do this we hypothesize that r_1 can be obtained by the operation

r_1 = \int_{T_i}^{T_f} r(u)\, g(u)\, du    (185)
and try to find a g(u) that will lead to the desired properties. Using (185), we can rewrite (184) as
r(t) = [s_1 + n_1]\, s(t) + y(t) \; : H_1
     = n_1\, s(t) + y(t) \; : H_0,    (186)

where

s_1 \triangleq \int_{T_i}^{T_f} \sqrt{E}\, s(u)\, g(u)\, du    (187)

and

n_1 \triangleq \int_{T_i}^{T_f} n(u)\, g(u)\, du.    (188)
Because a sufficient statistic can be multiplied by any nonzero constant and remain a sufficient statistic, we can introduce a constraint,

\int_{T_i}^{T_f} s(u)\, g(u)\, du = 1.    (189a)
Using (189a) in (187), we have
s_1 = \sqrt{E}.    (189b)
Clearly, n_1 is a zero-mean random variable and

n(t) = n_1\, s(t) + y(t), \qquad T_i \le t \le T_f.    (190)
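As a consistency check (a step the excerpt leaves implicit), substituting (190) into the H_1 hypothesis of (183) and using (189b) gives

r(t) = \sqrt{E}\, s(t) + n(t) = \sqrt{E}\, s(t) + n_1\, s(t) + y(t) = [s_1 + n_1]\, s(t) + y(t), \qquad T_i \le t \le T_f : H_1,

which is exactly the first line of (186).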