Van Trees, H. L. Detection, Estimation, and Modulation Theory, Part I. New York: Wiley & Sons, 2001. 710 p. ISBN 0-471-09517-6.

1. Draw a block diagram of the minimum Pr(ε) receiver.
2. Find Pr(ε).
Problem 4.4.27. Binary Orthogonal Signals: Square-Law Receiver [18]. Consider the problem of transmitting two equally likely bandpass orthogonal signals with energy Et over the Rician channel defined in (416). Instead of using the optimum receiver
shown in Fig. 4.74, we use the receiver for the Rayleigh channel (i.e., let a = 0 in Fig. 4.74). Show that
Problem 4.4.28. Repeat Problem 4.4.27 for the case of M orthogonal signals.
Composite Signal Hypotheses
Problem 4.4.29. Detecting One of M Orthogonal Signals. Consider the following binary hypothesis testing problem. Under H₁ the signal is one of M orthogonal signals √E₁ s₁(t), √E₂ s₂(t), ..., √E_M s_M(t):

∫₀ᵀ sᵢ(t) sⱼ(t) dt = δᵢⱼ,   i, j = 1, 2, ..., M.

Under H₁ the ith signal occurs with probability pᵢ (∑ᵢ pᵢ = 1). Under H₀ there is no signal component. Under both hypotheses there is additive white Gaussian noise with spectral height N₀/2:

r(t) = √Eᵢ sᵢ(t) + w(t),   0 ≤ t ≤ T, with probability pᵢ : H₁,
r(t) = w(t),   0 ≤ t ≤ T : H₀.
1. Find the likelihood ratio test.
2. Draw a block diagram of the optimum receiver.
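For concreteness, a minimal discrete-time sketch of one way the receiver in Part 2 can be realized is given below, assuming the likelihood ratio for this model reduces to Λ(r) = ∑ᵢ pᵢ exp[(2√Eᵢ/N₀) ∫₀ᵀ r(t) sᵢ(t) dt − Eᵢ/N₀] ≷ η. The sampling grid, waveforms, energies, priors, and threshold are illustrative placeholders, not values from the text.

import numpy as np

# Sketch of the Problem 4.4.29 receiver: a bank of M correlators, each
# exponentiated and weighted by its prior p_i, summed, and compared with
# a threshold eta. All numbers below are illustrative, not from the text.
rng = np.random.default_rng(0)
M, Nsamp, T = 4, 1000, 1.0
dt = T / Nsamp
N0 = 2.0                                     # white noise spectral height N0/2 = 1

# Orthonormal signal set: disjoint time slots (any orthonormal set works).
s = np.zeros((M, Nsamp))
for i in range(M):
    sl = slice(i * (Nsamp // M), (i + 1) * (Nsamp // M))
    s[i, sl] = 1.0 / np.sqrt((Nsamp // M) * dt)   # unit-energy pulses

E = np.full(M, 4.0)                          # signal energies E_i
p = np.full(M, 1.0 / M)                      # prior probabilities p_i
eta = 1.0                                    # likelihood-ratio threshold

def receiver(r):
    """Return (Lambda, decision) for one received waveform r[n]."""
    corr = s @ r * dt                        # correlator outputs: integral of r(t) s_i(t) dt
    l = (2.0 * np.sqrt(E) / N0) * corr - E / N0
    Lam = np.sum(p * np.exp(l))
    return Lam, Lam > eta                    # True means "decide H1"

# Example use: signal 2 present in white Gaussian noise of height N0/2.
w = rng.normal(0.0, np.sqrt(N0 / (2.0 * dt)), Nsamp)
print(receiver(np.sqrt(E[2]) * s[2] + w))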
Problem 4.4.30 (continuation). Now assume that

pᵢ = 1/M,   i = 1, 2, ..., M,

and

Eᵢ = E.
One method of approximating the performance of the receiver was developed in Problem 2.2.14. Recall that we computed the variance of Λ (not ln Λ) on H₀ and used the equation

d² = ln (1 + Var [Λ | H₀]).   (P.1)

We then used these values of d on the ROC of the known signal problem to find P_F and P_D.
1. Find Var [Λ | H₀].
2. Using (P.1), verify that

d² = ln (1 − M⁻¹ + M⁻¹ exp(2E/N₀)).   (P.2)
3. For 2E/N₀ ≫ 3, verify that we may approximate (P.2) by

d² ≅ −ln M + ln (exp(2E/N₀) − 1).   (P.3)
The significance of (P.3) is that if we have a certain performance level (P_F, P_D) for a single known signal, then to maintain that performance level when the signal is equally likely to be any one of M orthogonal signals requires an increase in the energy-to-noise ratio of ln M. This can be considered the cost of signal uncertainty.
4. Now remove the equal probability restriction. Show that (P.3) becomes

d² ≅ −ln (1/∑ᵢ pᵢ²) + ln (exp(2E/N₀) − 1).

What probability assignment maximizes the first term? Is this result intuitively logical?
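A quick Monte Carlo sanity check of (P.1)-(P.2) is sketched below: under H₀ the M correlator outputs are simulated as independent N(0, N₀/2) variables, Λ is formed with equal priors, and ln(1 + Var[Λ|H₀]) is compared with the closed form in (P.2). The choices of M, E/N₀, and the number of trials are arbitrary.

import numpy as np

# Monte Carlo check of (P.1)-(P.2): estimate Var[Lambda | H0] directly
# and compare ln(1 + Var) with ln(1 - 1/M + exp(2E/N0)/M).
rng = np.random.default_rng(1)
M, EN0, trials = 8, 1.0, 500_000             # EN0 denotes E / N0
N0 = 2.0
E = EN0 * N0

# Under H0 the M correlator outputs are independent N(0, N0/2).
x = rng.normal(0.0, np.sqrt(N0 / 2.0), size=(trials, M))
L = np.exp((2.0 * np.sqrt(E) / N0) * x - E / N0)   # per-signal likelihood ratios
Lam = L.mean(axis=1)                               # equal priors p_i = 1/M

d2_mc = np.log(1.0 + Lam.var())
d2_closed = np.log(1.0 - 1.0 / M + np.exp(2.0 * EN0) / M)
print(d2_mc, d2_closed)      # the two values should agree to Monte Carlo accuracy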
Problem 4.4.31 (alternate continuation). Consider the special case of Problem 4.4.29 in which M = 2, E₁ = E₂ = E, and p₁ = p₂ = ½. Define
lᵢ[r(t)] = (2√E/N₀) ∫₀ᵀ r(t) sᵢ(t) dt − E/N₀,   i = 1, 2.   (P.4)
1. Sketch the optimum decision boundary in the l₁, l₂-plane for various values of η.
2. Verify that the decision boundary approaches the asymptotes l₁ = ln 2η and l₂ = ln 2η.
3. Under what conditions would the following test be close to optimum?
Test. If either l₁ or l₂ > ln 2η, say H₁ is true. Otherwise say H₀ is true.
4. Find P_D and P_F for the suboptimum test in Part 3.
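With (P.4) as reconstructed above, the optimum test is ½(exp(l₁) + exp(l₂)) ≷ η, so the boundary is l₂ = ln(2η − exp(l₁)). The short sketch below tabulates this curve and shows it flattening onto the asymptote l₂ = ln 2η as l₁ decreases; the value of η is arbitrary.

import numpy as np

# Tabulate the Problem 4.4.31 decision boundary (1/2)(e^l1 + e^l2) = eta,
# i.e. l2 = ln(2*eta - e^l1), and compare it with the asymptote ln(2*eta).
eta = 1.5
asymptote = np.log(2.0 * eta)
for l1 in (-8.0, -4.0, -2.0, 0.0, asymptote - 1e-3):
    l2 = np.log(2.0 * eta - np.exp(l1))
    print(f"l1 = {l1:8.3f}   boundary l2 = {l2:8.3f}   asymptote = {asymptote:.3f}")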
Problem 4.4.32 (continuation). Consider the special case of Problem 4.4.29 in which Eᵢ = E, i = 1, 2, ..., M, and pᵢ = 1/M, i = 1, 2, ..., M. Extending the definition of lᵢ[r(t)] in (P.4) to i = 1, 2, ..., M, we consider the suboptimum test.
Test. If one or more lᵢ > ln Mη, say H₁. Otherwise say H₀.
1. Define
α = Pr [l₁ > ln Mη | s₁(t) is not present],
β = Pr [l₁ < ln Mη | s₁(t) is present].
Show that
P_F = 1 − (1 − α)^M
and
P_D = 1 − β(1 − α)^(M−1).
2. Verify that
P_F ≤ Mα
and
P_D ≥ 1 − β.
When are these bounds most accurate?
3. Find α and β.
4. Assume that M = 1 and a given E/N₀ yields a certain (P_F, P_D) performance. How must E/N₀ increase to maintain the same performance as M increases? (Assume that the relations in Part 2 are exact.) Compare these results with those in Problem 4.4.30.
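A numerical cross-check of Parts 1 and 2 is sketched below, again using the reconstructed statistic lᵢ and the fact that the lᵢ are independent and Gaussian under H₀; it estimates P_F by simulation and compares it with 1 − (1 − α)^M and with the bound Mα. The values of M, E/N₀, and η are illustrative.

import math
import numpy as np

# Check P_F = 1 - (1 - alpha)^M and the bound P_F <= M*alpha for the
# suboptimum test "decide H1 if any l_i > ln(M*eta)". Under H0 each
# l_i = (2*sqrt(E)/N0)*x_i - E/N0 with x_i independent N(0, N0/2).
rng = np.random.default_rng(2)
M, EN0, eta, trials = 4, 2.0, 3.0, 400_000
N0 = 2.0
E = EN0 * N0
gamma = math.log(M * eta)                      # test threshold ln(M*eta)

# Exact alpha from the Gaussian statistics of l_1 when s_1(t) is absent.
sigma_l = (2.0 * math.sqrt(E) / N0) * math.sqrt(N0 / 2.0)
alpha = 0.5 * math.erfc((gamma + E / N0) / (sigma_l * math.sqrt(2.0)))

# Monte Carlo estimate of P_F versus the closed form and the bound.
x = rng.normal(0.0, np.sqrt(N0 / 2.0), size=(trials, M))
l = (2.0 * np.sqrt(E) / N0) * x - E / N0
PF_mc = np.mean((l > gamma).any(axis=1))
print(PF_mc, 1.0 - (1.0 - alpha) ** M, M * alpha)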
Problem 4.4.33. A similar problem is encountered when each of the M orthogonal signals has a random phase.
Under H₁:
r(t) = √(2E) f(t) cos [ω_c t + φᵢ(t) + θᵢ] + w(t),   0 ≤ t ≤ T   (with probability pᵢ).
Under H₀:
r(t) = w(t),   0 ≤ t ≤ T.
The signal components are orthogonal. The white noise has spectral height N₀/2. The probabilities pᵢ equal 1/M, i = 1, 2, ..., M. The phase term θᵢ in each signal is an independent, uniformly distributed random variable on (0, 2π).
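For intuition, a discrete-time sketch of the receiver structure this model suggests is given below, assuming the standard random-phase result that averaging the likelihood over a uniform θᵢ turns each coherent branch into an envelope detector followed by the weighting exp(−E/N₀) I₀(2√(2E) qᵢ/N₀), where qᵢ is the envelope of the cosine and sine correlations. The carrier, phase functions, and all parameter values are invented for illustration and are not taken from the text.

import numpy as np

# Sketch of a quadrature (envelope) receiver for M orthogonal bandpass
# signals with independent uniform random phases. Every waveform and
# parameter value below is an illustrative placeholder.
rng = np.random.default_rng(3)
M, Nsamp, T = 4, 4000, 1.0
dt = T / Nsamp
t = np.arange(Nsamp) * dt
N0, E, eta = 2.0, 16.0, 1.0
wc = 2.0 * np.pi * 40.0                          # carrier frequency (illustrative)
phi = [2.0 * np.pi * k * t / T for k in range(M)]    # distinct phase functions
f = np.ones(Nsamp) / np.sqrt(T)                  # unit-energy lowpass envelope

def statistic(r):
    """Equal-prior average of the per-signal envelope statistics."""
    lam = np.empty(M)
    for i in range(M):
        Lc = np.sum(r * f * np.cos(wc * t + phi[i])) * dt
        Ls = np.sum(r * f * np.sin(wc * t + phi[i])) * dt
        q = np.hypot(Lc, Ls)                     # envelope of the two correlators
        lam[i] = np.exp(-E / N0) * np.i0(2.0 * np.sqrt(2.0 * E) / N0 * q)
    return lam.mean()                            # compare with eta to decide H1/H0

# Example use: signal 1 present with a random phase, in white Gaussian noise.
theta = rng.uniform(0.0, 2.0 * np.pi)
w = rng.normal(0.0, np.sqrt(N0 / (2.0 * dt)), Nsamp)
r = np.sqrt(2.0 * E) * f * np.cos(wc * t + phi[1] + theta) + w
print(statistic(r) > eta)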