Introduction to Bayesian statistics - Bolstad M.

Bolstad M. Introduction to Bayesian statistics - Wiley Publishing, 2004. - 361 p.
ISBN 0-471-27020-2

The expected value of a function of the joint random variables is given by

E[h(X, Y)] = \sum_i \sum_j h(x_i, y_j) \times f(x_i, y_j).

Often we wish to find the expected value of a sum of random variables. In that case

E(X + Y) = \sum_i \sum_j (x_i + y_j) \times f(x_i, y_j)
         = \sum_i \sum_j x_i \times f(x_i, y_j) + \sum_i \sum_j y_j \times f(x_i, y_j)
         = \sum_i x_i \sum_j f(x_i, y_j) + \sum_j y_j \sum_i f(x_i, y_j)
         = \sum_i x_i \times f(x_i) + \sum_j y_j \times f(y_j).

We see the mean of the sum of two random variables is the sum of the means:

E(X + Y) = E(X) + E(Y).     (5.10)

This equation always holds.
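Equation 5.10 is easy to verify numerically. The following is a minimal sketch in plain Python (not from the text; the small joint table is made up for illustration), computing E(X + Y) directly from the joint distribution and comparing it with E(X) + E(Y) from the marginals:

    # A small hypothetical joint probability table f(x, y), chosen only for
    # illustration; the probabilities sum to 1.
    joint = {
        (1, 1): 0.10, (1, 2): 0.20,
        (2, 1): 0.30, (2, 2): 0.40,
    }

    # E(X + Y) computed directly from the joint distribution,
    # i.e. E(h(X, Y)) with h(x, y) = x + y.
    e_sum = sum((x + y) * p for (x, y), p in joint.items())

    # Marginals f(x) and f(y), found by summing over the other variable.
    fx, fy = {}, {}
    for (x, y), p in joint.items():
        fx[x] = fx.get(x, 0.0) + p
        fy[y] = fy.get(y, 0.0) + p

    e_x = sum(x * p for x, p in fx.items())
    e_y = sum(y * p for y, p in fy.items())

    print(e_sum, e_x + e_y)   # both print 3.3 (up to rounding): Eq. 5.10 holds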
Independent Random Variables
Two (discrete) random variables X and Y are independent of each other if and only if every element in the joint distribution table equals the product of the corresponding marginal distributions. In other words,

f(x_i, y_j) = f(x_i) \times f(y_j)

for all possible x_i and y_j.
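As a sketch of how this criterion can be checked mechanically (plain Python; the function name is_independent and its interface are my own, not from the text):

    import math

    def is_independent(joint, tol=1e-12):
        """Return True if f(x, y) == f(x) * f(y) for every cell of `joint`.

        `joint` maps (x, y) pairs to probabilities and must list every cell,
        including any zero cells. Name and interface are illustrative only.
        """
        fx, fy = {}, {}
        for (x, y), p in joint.items():
            fx[x] = fx.get(x, 0.0) + p
            fy[y] = fy.get(y, 0.0) + p
        return all(math.isclose(p, fx[x] * fy[y], abs_tol=tol)
                   for (x, y), p in joint.items())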
The variance of a sum of random variables is given by

Var(X + Y) = E[(X + Y - E(X + Y))^2]
           = \sum_i \sum_j (x_i + y_j - (E(X) + E(Y)))^2 \times f(x_i, y_j)
           = \sum_i \sum_j [(x_i - E(X)) + (y_j - E(Y))]^2 \times f(x_i, y_j).
Multiplying this out and breaking it into three separate sums gives

Var(X + Y) = \sum_i \sum_j (x_i - E(X))^2 \times f(x_i, y_j)
           + 2 \sum_i \sum_j (x_i - E(X))(y_j - E(Y)) \times f(x_i, y_j)
           + \sum_i \sum_j (y_j - E(Y))^2 \times f(x_i, y_j).
The middle term is 2 times the covariance of the random variables. For independent random variables the covariance is

Cov(X, Y) = \sum_i \sum_j (x_i - E(X)) \times (y_j - E(Y)) \times f(x_i, y_j)
          = \sum_i (x_i - E(X)) f(x_i) \times \sum_j (y_j - E(Y)) f(y_j),

where the second line uses the independence condition f(x_i, y_j) = f(x_i) \times f(y_j). This is clearly equal to 0, since each factor has the form \sum_i (x_i - E(X)) f(x_i) = E(X) - E(X) = 0. Hence for independent random variables

Var(X + Y) = \sum_i (x_i - E(X))^2 \times f(x_i) + \sum_j (y_j - E(Y))^2 \times f(y_j).
We see the variance of the sum of two independent random variables is the sum of the variances.
Var(X + Y) = Var(X) + Var(Y). (5.11)
This equation only holds for independent random variables! In general, the variance of a sum of two random variables is given by Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y); the covariance term vanishes when X and Y are independent.
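Both the independent-case result (Equation 5.11) and the general formula with the covariance term can be checked numerically. Below is a minimal sketch in plain Python (not from the text; the joint table is made up for illustration) that computes Var(X + Y) directly from a joint distribution and compares it with Var(X) + 2 Cov(X, Y) + Var(Y):

    # Hypothetical joint table (not from the text); probabilities sum to 1.
    joint = {
        (0, 0): 0.15, (0, 1): 0.25,
        (1, 0): 0.35, (1, 1): 0.25,
    }

    fx, fy = {}, {}
    for (x, y), p in joint.items():
        fx[x] = fx.get(x, 0.0) + p
        fy[y] = fy.get(y, 0.0) + p

    mean_x = sum(x * p for x, p in fx.items())
    mean_y = sum(y * p for y, p in fy.items())

    # Var(X), Var(Y) from the marginals, Cov(X, Y) from the joint table.
    var_x = sum((x - mean_x) ** 2 * p for x, p in fx.items())
    var_y = sum((y - mean_y) ** 2 * p for y, p in fy.items())
    cov_xy = sum((x - mean_x) * (y - mean_y) * p for (x, y), p in joint.items())

    # Var(X + Y) computed directly from the joint distribution.
    mean_sum = mean_x + mean_y
    var_sum = sum((x + y - mean_sum) ** 2 * p for (x, y), p in joint.items())

    print(var_sum, var_x + 2 * cov_xy + var_y)   # both 0.39 here: equal in general
    # For an independent table, f(x, y) = f(x) * f(y), so cov_xy would be 0
    # and the two variances alone would add (Eq. 5.11).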
Example 7 Let X and Y be jointly distributed discrete random variables. Their joint probability distribution is given in the following table:
                Y
           1      2      3      4     f(x)
      1   .02    .04    .06    .08
 X    2   .03    .01    .09    .17
      3   .05    .15    .15    .15
    f(y)
We find the marginal distributions of X and Y by summing across the rows and summing down the columns respectively. That gives the table
                Y
           1      2      3      4     f(x)
      1   .02    .04    .06    .08     .2
 X    2   .03    .01    .09    .17     .3
      3   .05    .15    .15    .15     .5
    f(y)  .1     .2     .3     .4
We see that the joint probability f(x_i, y_j) is not always equal to the product of the marginal probabilities f(x_i) \times f(y_j); for instance, f(2, 2) = .01, while f(2) \times f(2) = .3 \times .2 = .06. Therefore the two random variables X and Y are not independent.
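The marginal sums and the independence check in Example 7 can be reproduced with a short sketch (plain Python; variable names are illustrative):

    # Example 7's joint table: x = 1, 2, 3 down the rows, y = 1, 2, 3, 4 across.
    joint = {
        (1, 1): .02, (1, 2): .04, (1, 3): .06, (1, 4): .08,
        (2, 1): .03, (2, 2): .01, (2, 3): .09, (2, 4): .17,
        (3, 1): .05, (3, 2): .15, (3, 3): .15, (3, 4): .15,
    }

    # Marginals: sum across each row for f(x), down each column for f(y).
    fx, fy = {}, {}
    for (x, y), p in joint.items():
        fx[x] = round(fx.get(x, 0.0) + p, 10)
        fy[y] = round(fy.get(y, 0.0) + p, 10)

    print(fx)   # {1: 0.2, 2: 0.3, 3: 0.5}
    print(fy)   # {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

    # A single cell is enough to rule out independence:
    print(joint[(2, 2)], fx[2] * fy[2])   # .01 vs .06 (up to rounding): not independent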
Mean and variance of a difference between two independent random variables. When we combine the results of Equations 5.10 and 5.11 with the results of Equations 5.4 and 5.5, we find that the mean of a difference between random variables is
E(X - Y) = E(X) - E(Y). (5.12)
If the two random variables are independent, we find that the variance of their difference is
Var(X - Y) = Var(X) + Var(Y). (5.13)
Variability always adds for independent random variables, regardless of whether we are taking the sum or taking the difference.
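Equations 5.12 and 5.13 can be checked the same way. In the sketch below (plain Python; the marginal distributions are made up, and the joint table is built as the product f(x) \times f(y) so that X and Y are independent by construction), the variance of the difference equals the sum of the variances:

    # Hypothetical marginal distributions (not from the text).
    fx = {0: 0.3, 1: 0.7}
    fy = {1: 0.4, 2: 0.6}

    # Independent joint table by construction: f(x, y) = f(x) * f(y).
    joint = {(x, y): px * py for x, px in fx.items() for y, py in fy.items()}

    mean_x = sum(x * p for x, p in fx.items())
    mean_y = sum(y * p for y, p in fy.items())
    var_x = sum((x - mean_x) ** 2 * p for x, p in fx.items())
    var_y = sum((y - mean_y) ** 2 * p for y, p in fy.items())

    # Mean and variance of the difference, computed directly from the joint table.
    mean_diff = sum((x - y) * p for (x, y), p in joint.items())
    var_diff = sum((x - y - mean_diff) ** 2 * p for (x, y), p in joint.items())

    print(mean_diff, mean_x - mean_y)   # Eq. 5.12: equal
    print(var_diff, var_x + var_y)      # Eq. 5.13: variances add, even for a difference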
5.6 CONDITIONAL PROBABILITY FOR JOINT RANDOM VARIABLES
If we are given Y = y_j, the reduced universe is the set of ordered pairs where the second element is y_j. This is shown in Table 5.4. It is the only part of the universe that remains, given Y = y_j. The only part of the event X = x_i that remains is the part in the reduced universe, namely the intersection of the events X = x_i and Y = y_j. Table 5.5 shows the original joint probability function in the reduced universe, together with the marginal probability. We see that this is not a probability distribution: the probabilities in the reduced universe sum to the marginal probability, not to one!
The conditional probability that X = x_i, given Y = y_j, is the probability of the intersection of the events X = x_i and Y = y_j divided by the probability that Y = y_j, from Equation 4.1. Dividing the joint probability by the marginal probability scales it up so that the probability of the reduced universe equals 1. The conditional probability is given by

f(x_i | y_j) = P(X = x_i | Y = y_j) = \frac{P(X = x_i, Y = y_j)}{P(Y = y_j)}.     (5.14)

When we put this in terms of the joint and marginal probability functions, we get

f(x_i | y_j) = \frac{f(x_i, y_j)}{f(y_j)}.     (5.15)
Table 5.4 Reduced universe given Y = y_j

    (x_1, y_j)
    (x_2, y_j)
       ...
    (x_i, y_j)
       ...

Table 5.5 Joint probability function values in the reduced universe Y = y_j. The marginal probability is found by summing down the column.

    f(x_1, y_j)
    f(x_2, y_j)
       ...
    f(x_i, y_j)
       ...
    ___________
      f(y_j)
The conditional probability distribution. Letting x_i vary across all possible values of X gives us the conditional probability distribution of X | Y = y_j. The conditional probability distribution is defined on the reduced universe given Y = y_j, and is shown in Table 5.6. Each entry is found by dividing the (i, j) entry of the joint probability table by the jth element of the marginal probability. The marginal probability f(y_j) = \sum_i f(x_i, y_j) is found by summing down the jth column of the joint probability table. So the conditional probability of x_i given y_j is the jth column of the joint probability table divided by the sum of the joint probabilities in that column.
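As a final sketch (plain Python, reusing Example 7's table), Equation 5.15 gives the conditional distribution of X given, say, Y = 2 by dividing the second column of the joint table by the column sum f(2) = .2:

    joint = {
        (1, 1): .02, (1, 2): .04, (1, 3): .06, (1, 4): .08,
        (2, 1): .03, (2, 2): .01, (2, 3): .09, (2, 4): .17,
        (3, 1): .05, (3, 2): .15, (3, 3): .15, (3, 4): .15,
    }

    y_j = 2
    # Marginal f(y_j): sum down the jth column of the joint table.
    f_yj = sum(p for (x, y), p in joint.items() if y == y_j)

    # Conditional distribution f(x | y_j) = f(x, y_j) / f(y_j)   (Eq. 5.15).
    cond = {x: p / f_yj for (x, y), p in joint.items() if y == y_j}

    print(f_yj)   # 0.2
    print(cond)   # {1: 0.2, 2: 0.05, 3: 0.75} (up to rounding); these sum to 1,
                  # unlike the raw column, which sums only to the marginal 0.2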