Detection, Estimation, and Modulation Theory, Part I - Van Trees H. L.

Van Trees H. L. Detection, Estimation, and Modulation Theory, Part I. - Wiley & Sons, 2001. - 710 p.
ISBN 0-471-09517-6

H_1: Pr(n events) = (m_1^n / n!) e^{-m_1},    n = 0, 1, 2, ...,    (35)

H_0: Pr(n events) = (m_0^n / n!) e^{-m_0},    n = 0, 1, 2, ....    (36)

Then the likelihood ratio test is

Λ(n) = [(m_1^n / n!) e^{-m_1}] / [(m_0^n / n!) e^{-m_0}]  ≷_{H_0}^{H_1}  η,    (37)

or, equivalently,

n  ≷_{H_0}^{H_1}  (ln η + m_1 - m_0) / (ln m_1 - ln m_0)    if m_1 > m_0,    (38)

n  ≶_{H_0}^{H_1}  (ln η + m_1 - m_0) / (ln m_1 - ln m_0)    if m_0 > m_1.

There are several special kinds of Bayes tests which are frequently used and which should be mentioned explicitly.

If we assume that C_00 and C_11 are zero and C_01 = C_10 = 1, the expression for the risk in (8) reduces to

ℜ = P_0 ∫_{Z_1} p_{r|H_0}(R|H_0) dR + P_1 ∫_{Z_0} p_{r|H_1}(R|H_1) dR.    (39)

We see that (39) is just the total probability of making an error. Therefore for this cost assignment the Bayes test is minimizing the total probability of error. The test is

ln Λ(R)  ≷_{H_0}^{H_1}  ln η = ln P_0 - ln (1 - P_0).    (40)

When the two hypotheses are equally likely, the threshold is zero. This assumption is normally true in digital communication systems. These processors are commonly referred to as minimum probability of error receivers.

A second special case of interest arises when the a priori probabilities are unknown. To investigate this case we look at (8) again. We observe that once the decision regions Z_0 and Z_1 are chosen, the values of the integrals are determined. We denote these values in the following manner:
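The count-threshold form of the Poisson test in (38) can be checked numerically. The sketch below uses illustrative rates m_0 = 2, m_1 = 5 and threshold η = 1 (my own choices, not from the text) and confirms that comparing n to the threshold in (38) agrees with comparing the likelihood ratio Λ(n) of (37) to η directly.

```python
import math

def poisson_lrt(n, m0, m1, eta=1.0):
    """Decide H1 vs H0 for an observed count n, Poisson rates m1 > m0.

    Implements the threshold form of (38): compare n against
    (ln eta + m1 - m0) / (ln m1 - ln m0).
    """
    assert m1 > m0 > 0
    gamma = (math.log(eta) + m1 - m0) / (math.log(m1) - math.log(m0))
    return "H1" if n > gamma else "H0"

def likelihood_ratio(n, m0, m1):
    # Lambda(n) = (m1/m0)^n * exp(m0 - m1); the n! factors cancel as in (37).
    return (m1 / m0) ** n * math.exp(m0 - m1)

# The count test must agree with comparing Lambda(n) to eta directly.
m0, m1, eta = 2.0, 5.0, 1.0
for n in range(15):
    direct = "H1" if likelihood_ratio(n, m0, m1) > eta else "H0"
    assert poisson_lrt(n, m0, m1, eta) == direct
```

With these rates the threshold is (0 + 3)/ln 2.5 ≈ 3.27, so counts of 4 or more decide H_1.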
P_F = ∫_{Z_1} p_{r|H_0}(R|H_0) dR,

P_D = ∫_{Z_1} p_{r|H_1}(R|H_1) dR,    (41)

P_M = ∫_{Z_0} p_{r|H_1}(R|H_1) dR = 1 - P_D.
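As a concrete instance of (41), one can take a hypothetical scalar Gaussian problem (my own illustrative choice, not from this passage): r ~ N(0, 1) under H_0 and r ~ N(m, 1) under H_1, with decision region Z_1 = {r > γ}. The three integrals then reduce to Gaussian tail probabilities.

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Hypothetical observation model: r = w under H0, r = m + w under H1,
# with w ~ N(0,1); decide H1 when r > gamma. Values of m, gamma are
# illustrative only.
m, gamma = 2.0, 1.0
P_F = q_function(gamma)        # integral of p(R|H0) over Z1 = (gamma, inf)
P_D = q_function(gamma - m)    # integral of p(R|H1) over Z1
P_M = 1.0 - P_D                # integral of p(R|H1) over Z0

assert abs(P_M + P_D - 1.0) < 1e-12
```

Here P_F ≈ 0.159 and P_D ≈ 0.841; raising γ trades a smaller false-alarm probability for a larger miss probability, exactly the trade-off the decision regions control.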
We see that these quantities are conditional probabilities. The subscripts are mnemonic and chosen from the radar problem in which hypothesis H_1 corresponds to the presence of a target and hypothesis H_0 corresponds to its absence. P_F is the probability of a false alarm (i.e., we say the target is present when it is not); P_D is the probability of detection (i.e., we say the target is present when it is); P_M is the probability of a miss (we say the target is absent when it is present). Although we are interested in a much larger class of problems than this notation implies, we shall use it for convenience.
For any choice of decision regions the risk expression in (8) can be written in the notation of (41):
ℜ = P_0 C_10 + P_1 C_11 + P_1 (C_01 - C_11) P_M - P_0 (C_10 - C_00)(1 - P_F).    (42)

Because

P_0 = 1 - P_1,    (43)

(42) becomes

ℜ(P_1) = C_00 (1 - P_F) + C_10 P_F + P_1 [(C_11 - C_00) + (C_01 - C_11) P_M - (C_10 - C_00) P_F].    (44)
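The algebra from (42) to (44) can be sanity-checked by comparing (44) against the four-term risk expression in (8) rewritten with (41) and (43); the numerical check below is mine, not from the text, and uses random costs, priors, and error probabilities.

```python
import random

def risk_direct(P1, C00, C01, C10, C11, PF, PM):
    # Risk from (8) written with the notation of (41), using P0 = 1 - P1.
    P0 = 1.0 - P1
    return (P0 * C00 * (1 - PF) + P0 * C10 * PF
            + P1 * C11 * (1 - PM) + P1 * C01 * PM)

def risk_eq44(P1, C00, C01, C10, C11, PF, PM):
    # The regrouped form (44): a constant part plus a term linear in P1.
    return (C00 * (1 - PF) + C10 * PF
            + P1 * ((C11 - C00) + (C01 - C11) * PM - (C10 - C00) * PF))

random.seed(0)
for _ in range(1000):
    args = [random.random() for _ in range(7)]
    assert abs(risk_direct(*args) - risk_eq44(*args)) < 1e-12
```

The grouping in (44) makes explicit that, once P_F and P_M are fixed, the risk is a straight line in P_1, which is the key fact used below.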
Now, if all the costs and a priori probabilities are known, we can find a Bayes test. In Fig. 2.7a we plot the Bayes risk, ℜ_B(P_1), as a function of P_1. Observe that as P_1 changes the decision regions for the Bayes test change and therefore P_F and P_M change.

Now consider the situation in which a certain P_1 (say P_1 = P_1*) is assumed and the corresponding Bayes test designed. We now fix the threshold and assume that P_1 is allowed to change. We denote the risk for this fixed threshold test as ℜ_F(P_1*, P_1). Because the threshold is fixed, P_F and P_M are fixed, and (44) is just a straight line. Because it is a Bayes test for P_1 = P_1*, it touches the ℜ_B(P_1) curve at that point. Looking at (14), we see that the threshold changes continuously with P_1. Therefore, whenever P_1 ≠ P_1*, the threshold in the Bayes test will be different. Because the Bayes test minimizes the risk,

ℜ_F(P_1*, P_1) ≥ ℜ_B(P_1).    (45)
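The inequality (45) can be illustrated numerically for a hypothetical Gaussian problem (r ~ N(0,1) under H_0, r ~ N(2,1) under H_1, minimum-probability-of-error costs C_00 = C_11 = 0, C_01 = C_10 = 1; all numerical values are my own choices): the fixed-threshold risk line stays on or above the Bayes risk curve and touches it at P_1 = P_1*.

```python
import math

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

m = 2.0  # hypothetical mean shift under H1

def bayes_threshold(P1):
    # With min-Pe costs, eta = P0/P1, and the Gaussian LRT reduces to
    # deciding H1 when r > gamma with gamma = ln(eta)/m + m/2.
    eta = (1.0 - P1) / P1
    return math.log(eta) / m + m / 2.0

def risk(P1, gamma):
    # (44) with these costs: R = (1 - P1) * PF + P1 * PM.
    return (1.0 - P1) * q(gamma) + P1 * (1.0 - q(gamma - m))

P1_star = 0.3
gamma_star = bayes_threshold(P1_star)  # threshold designed for P1 = P1*
for k in range(1, 100):
    P1 = k / 100.0
    r_fixed = risk(P1, gamma_star)            # straight line in P1
    r_bayes = risk(P1, bayes_threshold(P1))   # Bayes risk curve
    assert r_fixed >= r_bayes - 1e-12         # inequality (45)
```

Because ℜ_F(P_1*, P_1) is linear in P_1 while ℜ_B(P_1) is concave, the line is tangent to the curve at P_1*, as the assertions confirm at every grid point.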
[Fig. 2.7 Risk curves: (a) fixed risk versus typical Bayes risk; (b) maximum value of ℜ_B(P_1) at P_1 = 0.]
If Λ is a continuous random variable with a probability distribution function that is strictly monotonic, then changing η always changes the risk. ℜ_B(P_1) is strictly concave downward and the inequality in (45) is strict. This case, which is one of particular interest to us, is illustrated in Fig. 2.7a. We see that ℜ_F(P_1*, P_1) is tangent to ℜ_B(P_1) at P_1 = P_1*. These curves demonstrate the effect of incorrect knowledge of the a priori probabilities.

An interesting problem is encountered if we assume that the a priori probabilities are chosen to make our performance as bad as possible. In other words, P_1 is chosen to maximize our risk ℜ_F(P_1*, P_1). Three possible examples are given in Figs. 2.7b, c, and d. In Fig. 2.7b the maximum of ℜ_B(P_1) occurs at P_1 = 0. To minimize the maximum risk we use a Bayes test designed assuming P_1 = 0. In Fig. 2.7c the maximum of ℜ_B(P_1) occurs at P_1 = 1. To minimize the maximum risk we use a Bayes test designed assuming P_1 = 1. In Fig. 2.7d the maximum occurs inside the interval