
# Detection, Estimation, and Modulation Theory, Part I - Van Trees H. L.

Van Trees H. L. Detection, Estimation, and Modulation Theory, Part I - Wiley & Sons, 2001. - 710 p.
ISBN 0-471-09517-6

2. Compute Pr(ε).
Problem 4.4.41. A new engineering graduate is told to design an optimum detection system for the following problem:
H1: r(t) = s(t) + w(t),    Ti ≤ t ≤ Tf,
H0: r(t) = w(t),           Ti ≤ t ≤ Tf.
The signal s(t) is known. To find a suitable covariance function Kn(t, u) for the noise, he asks several engineers for an opinion.
Engineer A says
Kn(t, u) = (N0/2) δ(t - u).
Engineer B says
Kn(t, u) = (N0/2) δ(t - u) + Kc(t, u),
where Kc(t, u) is a known, square-integrable, positive-definite function.
He must now reconcile these different opinions in order to design a signal detection system.
1. He decides to combine their opinions probabilistically. Specifically,
Pr (Engineer A is correct) = PA,
Pr (Engineer B is correct) = PB,
where PA + PB = 1.
(a) Construct an optimum Bayes test (threshold η) to decide whether H1 or H0 is true.
(b) Draw a block diagram of the receiver.
(c) Check your answer for PA = 0 and PB = 0.
2. Discuss some other possible ways you might reconcile these different opinions.
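As a concrete sanity check on the part-1 strategy, here is a hypothetical discrete-time sketch (the signal, covariances, and probabilities below are illustrative, not from the text): after sampling, each engineer's model makes r a Gaussian vector, so the Bayes test compares the mixture likelihood ratio, with the two conditional densities weighted by PA and PB, against the threshold η.

```python
import numpy as np

def gauss_logpdf(x, mean, K):
    """Log-density of a Gaussian N(mean, K) evaluated at x."""
    d = x - mean
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (len(x) * np.log(2 * np.pi) + logdet + d @ np.linalg.solve(K, d))

n, N0 = 16, 2.0                                      # samples, white spectral height
t = np.arange(n)
s = np.sin(2 * np.pi * t / n)                        # known signal (illustrative)
Kc = 0.5 * np.exp(-np.abs(t[:, None] - t[None, :]))  # illustrative colored term Kc(t, u)
KA = (N0 / 2) * np.eye(n)                            # Engineer A: white noise only
KB = KA + Kc                                         # Engineer B: white plus colored
PA, PB = 0.7, 0.3                                    # PA + PB = 1

def mixture_log_lr(r):
    """ln of [PA pA(r|H1) + PB pB(r|H1)] / [PA pA(r|H0) + PB pB(r|H0)]."""
    z = np.zeros(n)
    num = np.logaddexp(np.log(PA) + gauss_logpdf(r, s, KA),
                       np.log(PB) + gauss_logpdf(r, s, KB))
    den = np.logaddexp(np.log(PA) + gauss_logpdf(r, z, KA),
                       np.log(PB) + gauss_logpdf(r, z, KB))
    return num - den   # compare with ln η to decide H1 vs H0
```

Setting PA = 0 or PB = 0 collapses the mixture to the single-model receiver of the other engineer, which is the sanity check asked for in part 1(c).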
4.8 Problems
Problem 4.4.42. Resolution. The following detection problem is a crude model of a simple radar resolution problem:
H1: r(t) = bd sd(t) + bI sI(t) + w(t),    Ti ≤ t ≤ Tf,
H0: r(t) = bI sI(t) + w(t),               Ti ≤ t ≤ Tf.
1. ∫_{Ti}^{Tf} sd(t) sI(t) dt = ρ.
2. sd(t) and sI(t) are normalized to unit energy.
3. The multipliers bd and bI are independent zero-mean Gaussian variables with variances σd² and σI², respectively.
4. The noise w(t) is white Gaussian with spectral height N0/2 and is independent of the multipliers.
Find an explicit solution for the optimum likelihood ratio receiver. You do not need to specify the threshold.
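Because bd and bI are zero-mean Gaussian, r(t) is zero-mean Gaussian under both hypotheses and only its covariance differs, so the optimum receiver is a quadratic form. A discretized numerical sketch (all signals and variances below are illustrative, not from the text):

```python
import numpy as np

n, N0 = 32, 2.0
t = np.linspace(0, 1, n)
sd = np.sin(2 * np.pi * t); sd /= np.linalg.norm(sd)   # unit-energy desired signal
si = np.sin(4 * np.pi * t); si /= np.linalg.norm(si)   # unit-energy interfering signal
sig_d2, sig_i2 = 4.0, 4.0                              # illustrative variances of bd, bI

# Under both hypotheses r is zero-mean Gaussian; only the covariance differs.
K0 = sig_i2 * np.outer(si, si) + (N0 / 2) * np.eye(n)
K1 = K0 + sig_d2 * np.outer(sd, sd)

def log_lr(r):
    """Quadratic-form log-likelihood ratio ln[p1(r)/p0(r)]."""
    q = r @ np.linalg.solve(K0, r) - r @ np.linalg.solve(K1, r)
    (_, ld1), (_, ld0) = np.linalg.slogdet(K1), np.linalg.slogdet(K0)
    return 0.5 * q - 0.5 * (ld1 - ld0)
```

With zero data the statistic is negative (H0 is favored, since det K1 > det K0), while received energy along sd drives it positive.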
Section P.4.5. Multiple Channels.
Mathematical Derivations

Problem 4.5.1. The definition of a matrix inverse kernel given in (4.434) is
∫_{Ti}^{Tf} Kn(t, u) Qn(u, z) du = I δ(t - z).
1. Assume that
Kn(t, u) = (N0/2) I δ(t - u) + Kc(t, u).
Show that we can write
Qn(t, u) = (2/N0)[I δ(t - u) - h0(t, u)],
where h0(t, u) is a square-integrable function. Find the matrix integral equation that h0(t, u) must satisfy.
2. Consider the problem of a matrix linear filter operating on n(t):
d(t) = ∫_{Ti}^{Tf} h(t, u) n(u) du,
where
n(t) = nc(t) + w(t)
has the covariance function given in part 1. We want to choose h(t, u) so that
ξ ≜ E ∫_{Ti}^{Tf} [nc(t) - d(t)]^T [nc(t) - d(t)] dt
is minimized. Show that the linear matrix filter that does this is the h0(t, u) found in part 1.
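A finite-dimensional sanity check (a hypothetical discretization, single channel for simplicity, not from the text): replacing kernels by matrices, the inverse-kernel definition becomes Kn Q = I, and the claimed form Q = (2/N0)(I - h0), with h0 solving the discrete analogue (N0/2) h0 + h0 Kc = Kc, can be verified directly.

```python
import numpy as np

n, N0 = 24, 2.0
t = np.linspace(0, 1, n)
Kc = np.exp(-3 * np.abs(t[:, None] - t[None, :]))   # illustrative positive-definite Kc
Kn = (N0 / 2) * np.eye(n) + Kc                      # discretized Kn = (N0/2) I + Kc

H0 = Kc @ np.linalg.inv(Kn)      # solves (N0/2) h0 + h0 Kc = Kc (discrete analogue)
Q = (2 / N0) * (np.eye(n) - H0)  # claimed form of the inverse kernel

# Kn Q = I is the discrete version of the defining equation (4.434); the same H0
# is the MMSE filter for estimating nc from n = nc + w, which is what part 2 asks
# to show: the normal equation H (Kc + (N0/2) I) = Kc is the same equation.
```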
Problem 4.5.2 (continuation). In this problem we extend the derivation in Section 4.5 to include the case in which
Kn(t, u) = N δ(t - u) + Kc(t, u),    Ti ≤ t, u ≤ Tf,
where N is a positive-definite matrix of numbers. We denote the eigenvalues of N as λ1, λ2, ..., λM and define a diagonal matrix

Λ^{1/2} ≜ diag(λ1^{1/2}, λ2^{1/2}, ..., λM^{1/2}).
To find the LRT we first perform two preliminary transformations on r as shown in Fig. P4.10.
Fig. P4.10
The matrix W is an orthogonal matrix defined in (2.369) and has the properties
W^T = W^{-1},
N = W^{-1} Λ W.
1. Verify that r″(t) has a covariance function matrix which satisfies (4.428).
2. Express l in terms of r″(t), Qn(t, u), and s″(t).
3. Prove that
l = ∫∫_{Ti}^{Tf} r^T(t) Qn(t, u) s(u) dt du,
where
Qn(t, u) ≜ N^{-1}[I δ(t - u) - h0(t, u)]
and h0(t, u) satisfies the equation
Kc(t, u) = h0(t, u) N + ∫_{Ti}^{Tf} h0(t, z) Kc(z, u) dz,    Ti ≤ t, u ≤ Tf.
4. Repeat part (2) of Problem 4.5.1.
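The two preliminary transformations can be checked numerically: with W the orthogonal eigenvector matrix of N and Λ^{-1/2} the inverse of the diagonal matrix defined above, the combined map Λ^{-1/2} W turns the channel-noise matrix N into the identity, so the transformed white component has unit spectral height in every channel. A small sketch (the matrix N here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
M = 3
A = rng.standard_normal((M, M))
N = A @ A.T + M * np.eye(M)      # illustrative positive-definite noise matrix N

lam, V = np.linalg.eigh(N)       # eigenvalues λ1..λM and orthonormal eigenvectors
W = V.T                          # orthogonal matrix: W^T = W^{-1}
T = np.diag(lam ** -0.5) @ W     # combined transformation Λ^{-1/2} W

# N = W^{-1} Λ W, and the transformed covariance T N T^T is the identity.
```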
Problem 4.5.3. Consider the vector detection problem defined in (4.423). Assume that Kc(t, u) = 0 and that N is not positive-definite. Find a signal vector s(t) with total energy E and a receiver that leads to perfect detectability.
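The idea behind this problem, sketched numerically (the matrix below is illustrative, not from the text): a singular N has an eigenvector v with zero eigenvalue, and the noise component along v has variance v^T N v = 0, so placing all the signal energy along v yields a noise-free observable v^T r(t).

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((3, 2))
N = B @ B.T                       # rank 2: singular, hence not positive-definite
lam, V = np.linalg.eigh(N)
v = V[:, np.argmin(np.abs(lam))]  # unit eigenvector of the (near-)zero eigenvalue

# Noise projected onto v has zero variance, so v^T r is observed noise-free
# and any nonzero signal energy along v is detected perfectly.
var_along_v = v @ N @ v
```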
Problem 4.5.4. Let
r(t) = s(t, A) + n(t),    Ti ≤ t ≤ Tf,
where the covariance of n(t) is given by (4.425) to (4.428) and A is a nonrandom parameter.
1. Find the equation the maximum-likelihood estimate of A must satisfy.
2. Find the Cramér-Rao inequality for an unbiased estimate â.
3. Now assume that a is Gaussian, N(0, σa). Find the MAP equation and the lower bound on the mean-square error.
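For the special linear case s(t, A) = A s(t) (an assumption made here for illustration; the problem allows a general s(t, A)), the ML equation has a closed-form root and the estimate meets the Cramér-Rao bound with equality. A discretized Monte Carlo sketch with an illustrative noise covariance:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
t = np.linspace(0, 1, n)
s = np.cos(2 * np.pi * t)                        # illustrative known signal shape
Kn = 0.5 * np.eye(n) + 0.3 * np.exp(-5 * np.abs(t[:, None] - t[None, :]))
Kinv = np.linalg.inv(Kn)

def a_ml(r):
    """ML estimate when s(t, A) = A s(t): root of d/dA ln p(r|A) = 0."""
    return (s @ Kinv @ r) / (s @ Kinv @ s)

crb = 1.0 / (s @ Kinv @ s)                       # Cramér-Rao bound on var(â)

A_true = 1.5
L = np.linalg.cholesky(Kn)                       # to draw noise with covariance Kn
ests = np.array([a_ml(A_true * s + L @ rng.standard_normal(n))
                 for _ in range(4000)])
```

The Monte Carlo variance of the estimates matches crb closely, illustrating that the linear-case ML estimate is efficient.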
Problem 4.5.5 (continuation). Let L denote a nonsingular linear transformation on a, where a is a zero-mean Gaussian random variable.