Introduction to Bayesian Statistics
Bolstad, W. M. Wiley Publishing, 2004. 361 p.
ISBN 0-471-27020-2

4.9 Suppose there is a medical diagnostic test for a disease. The sensitivity of the test is .95. This means that if a person has the disease, the probability that the test gives a positive response is .95. The specificity of the test is .90. This means that if a person does not have the disease, the probability that the test gives a negative response is .90; equivalently, the false positive rate of the test is .10. In the population, 1% of the people have the disease. What is the probability that a tested person has the disease, given that the test result is positive? Let D be the event "the person has the disease" and let T be the event "the test gives a positive result."
4.10 Suppose there is a medical screening procedure for a specific cancer that has sensitivity = .90, and specificity = .95. Suppose the underlying rate of the cancer in the population is .001. Let B be the event "the person has that specific cancer" and A be the event "the screening procedure gives a positive result."
(a) What is the probability that a person has this cancer, given that the screening result is positive?
(b) Does this show that screening is effective in detecting this cancer?
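Both exercises reduce to the same Bayes' theorem calculation: the posterior probability of disease given a positive result is P(D | T) = P(T | D)P(D) / [P(T | D)P(D) + P(T | not D)P(not D)], with B and A playing the roles of D and T in Exercise 4.10. The following is a minimal Python sketch of that calculation, not from the text; the function name is illustrative, and the inputs are the numbers stated in the exercises.

def posterior_prob_disease(sensitivity, specificity, prevalence):
    """P(disease | positive test) by Bayes' theorem."""
    false_positive_rate = 1 - specificity                    # P(positive | no disease)
    p_positive = (sensitivity * prevalence
                  + false_positive_rate * (1 - prevalence))  # law of total probability
    return sensitivity * prevalence / p_positive

# Exercise 4.9: sensitivity .95, specificity .90, prevalence .01
print(posterior_prob_disease(0.95, 0.90, 0.01))    # about 0.0876

# Exercise 4.10(a): sensitivity .90, specificity .95, prevalence .001
print(posterior_prob_disease(0.90, 0.95, 0.001))   # about 0.0177

Even with a positive result, the posterior probability remains small because the condition is rare in the population; that observation is the point of part (b) of Exercise 4.10.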
5 Discrete Random Variables
In the previous chapter, we looked at random experiments in terms of events. We also introduced probability defined on events as a tool for understanding random experiments. We showed how conditional probability is the logical way to change our belief about an unobserved event, given we observed another related event. In this chapter, we introduce discrete random variables and probability distributions.
A random variable describes the outcome of the experiment in terms of a number. If the only possible outcomes of the experiment are distinct numbers separated from each other (e.g., counts), we say that the random variable is discrete. There are good reasons why we introduce random variables and their notation:
• It is quicker to describe an outcome as a random variable having a particular value than to describe that outcome in words. Any event can be formed from outcomes described by the random variable using union, intersection, and complements.
• The probability distribution of a discrete random variable is a numerical function. It is easier to work with a numerical function than with a probability defined as a function on sets (events). The probability of any possible event can be found from the probability distribution of the random variable using the rules of probability, so instead of having to know the probability of every possible event, we only need to know the probability distribution of the random variable (see the sketch following Table 5.1).
• It becomes much easier to deal with compound events made up from repetitions of the experiment.

Table 5.1 Typical results of rolling a fair die. Entries are the proportion of rolls showing each value after 10, 100, 1,000, and 10,000 rolls; the last column is the true probability for a fair die.

Value    10 rolls    100 rolls    1,000 rolls    10,000 rolls    ...    Probability
1        .1          .15          .198           .1704           ...    .1666
2        .3          .08          .163           .1661           ...    .1666
3        .1          .20          .156           .1670           ...    .1666
4        .2          .16          .153           .1698           ...    .1666
5        .0          .22          .165           .1583           ...    .1666
6        .3          .19          .165           .1684           ...    .1666
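As noted in the list above, once the probability distribution is written down as a numerical function, the probability of any event is obtained by summing it over the outcomes in that event. Here is a minimal Python sketch using the fair-die probabilities of Table 5.1; the variable and function names are illustrative only, not from the text.

from fractions import Fraction

# Probability distribution of Y = number on the top face of a fair die,
# stored as a numerical function: value -> probability.
fair_die = {y: Fraction(1, 6) for y in range(1, 7)}

def prob_event(distribution, event):
    """P(Y in event): sum the distribution over the outcomes making up the event."""
    return sum(p for y, p in distribution.items() if y in event)

print(prob_event(fair_die, {2, 4, 6}))   # P(Y is even) = 1/2
print(prob_event(fair_die, {5, 6}))      # P(Y > 4) = 1/3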
5.1 DISCRETE RANDOM VARIABLES
A number that is determined by the outcome of a random experiment is called a random variable. Random variables are denoted by uppercase letters, e.g., Y. The value the random variable takes is denoted by a lowercase letter, e.g., y. A discrete random variable, Y, can only take on separated values y_k. There can be a finite number of possible values; for example, the random variable defined as "number of heads in n tosses of a coin" has possible values 0, 1, ..., n. Or there can be a countably infinite number of possible values; for example, the random variable defined as "number of tosses until the first head" has possible values 1, 2, 3, .... The key thing for discrete random variables is that the possible values are separated by gaps.
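For instance, the second random variable above, "number of tosses until the first head," can be simulated directly. The sketch below is purely illustrative (the function name and the fair-coin assumption are mine, not from the text); it shows that the recorded values are always positive integers 1, 2, 3, ... with no upper bound.

import random

def tosses_until_first_head(p_head=0.5):
    """Toss a coin repeatedly; return the number of tosses needed to see the first head."""
    tosses = 0
    while True:
        tosses += 1
        if random.random() < p_head:
            return tosses

# Ten simulated values of the random variable; each one is a positive integer.
print([tosses_until_first_head() for _ in range(10)])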
Thought Experiment 1: Roll of a die
Suppose we have a fair six-sided die. Our random experiment is to roll it, and we let the random variable Y be the number on the top face. There are six possible values, 1, 2, ..., 6, and since the die is fair, those six values are equally likely. Now suppose we make independent repetitions of the experiment and record the value of Y each time. Table 5.1 shows the proportion of times each face has occurred in a typical sequence of rolls of the die, after 10, 100, 1,000, and 10,000 rolls. The last column shows the true probabilities for a fair die.
We note that the proportion of rolls taking any particular value gets closer and closer to the true probability of that value as the number of rolls n increases. We could draw graphs of the proportions having each value; these are shown in Figure 5.1. The graphs are zero at any other y value and have a spike at each possible value, where the spike height equals the proportion of times that value occurred. The spike heights sum to one.
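A table like Table 5.1 is easy to reproduce by simulation. The following sketch is illustrative only (the seed and variable names are arbitrary); it tallies the proportion of each face after increasing numbers of rolls, so the approach of the proportions toward 1/6 can be seen directly.

import random
from collections import Counter

random.seed(1)  # arbitrary seed, so the run is reproducible

rolls = [random.randint(1, 6) for _ in range(10_000)]

for n in (10, 100, 1_000, 10_000):
    counts = Counter(rolls[:n])
    proportions = {face: counts.get(face, 0) / n for face in range(1, 7)}
    print(n, {face: round(p, 4) for face, p in proportions.items()})

# As n grows, every proportion settles near the true probability 1/6.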