Smith A.F.M. Statistical analysis of finite mixture distributions. Wiley, 1985. 130 p.
ISBN 0-470-90763-4

procedures; see, for instance, Oja (1981, 1983), Hall and Welsh (1983), and Spiegelhalter (1983).
Example 5.3.2 Mixture of two known densities
Let

p(x) = π₁f₁(x) + (1 − π₁)f₂(x)

and suppose that t(x) is some statistic whose means and variances are (μ_t1, μ_t2) and (σ²_t1, σ²_t2), corresponding to the two components. Then, for the mixture,

E t(X) = π₁μ_t1 + (1 − π₁)μ_t2 = μ_t(π₁),
var t(X) = π₁σ²_t1 + (1 − π₁)σ²_t2 + π₁(1 − π₁)(μ_t1 − μ_t2)² = σ²_t(π₁).
If m_t denotes the sample moment corresponding to t then, asymptotically, m_t is normally distributed, with mean and variance as follows:
(a) if π₁ = 1, m_t ~ N(μ_t1, n⁻¹σ²_t1);
(b) if π₁ = 0, m_t ~ N(μ_t2, n⁻¹σ²_t2);
(c) if 0 < π₁ < 1, m_t ~ N(μ_t(π₁), n⁻¹σ²_t(π₁)).
Using these asymptotic results, we may easily construct approximate tests of H₀: π₁ = 0 against H₁: π₁ > 0, or of H₀: π₁ = 1 against H₁: π₁ < 1, and evaluate their power (see Tiago de Oliveira, 1965).
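As a concrete illustration of case (a), here is a minimal sketch (not from the book): it takes t(x) = x, simulates data from a hypothetical mixture of N(0, 1) and N(3, 1) components, and computes the standardized sample moment for testing H₀: π₁ = 1 against H₁: π₁ < 1. All function names and parameter values are assumptions made for illustration only.

```python
import math
import random

def moment_test_statistic(xs, mu_t1, sigma2_t1):
    """Standardized sample moment for testing H0: pi_1 = 1.
    Under H0 (case (a)), m_t ~ N(mu_t1, sigma2_t1 / n), so this
    statistic is approximately N(0, 1)."""
    n = len(xs)
    m_t = sum(xs) / n                     # sample moment for t(x) = x
    return (m_t - mu_t1) / math.sqrt(sigma2_t1 / n)

# Hypothetical components N(0, 1) and N(3, 1); the data are drawn with
# pi_1 = 0.5, so H0: pi_1 = 1 is false and the statistic should be far
# from zero.
random.seed(1)
xs = [random.gauss(0, 1) if random.random() < 0.5 else random.gauss(3, 1)
      for _ in range(500)]
z = moment_test_statistic(xs, mu_t1=0.0, sigma2_t1=1.0)
# since mu_t2 = 3 > mu_t1 = 0, a large positive z is evidence against H0
```

With these (assumed) parameter values the rejection region for H₀: π₁ = 1 lies in the upper tail, since the second component pulls the sample moment upwards.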
Example 5.3.3 Mixture of two known densities which are symmetric and have equal variances
In this case, Johnson (1973) unconventionally takes, as the null hypothesis,
H₀: mixture of the two known densities, which are symmetric and have equal variances.
Suppose the known means are μ₁ and μ₂ and that the sample mean is x̄. Let

π̂ = (x̄ − μ₂)/(μ₁ − μ₂),

the moment estimator of the first mixing weight π₁. Suppose, for chosen c,

P_j = ∫₋∞^c f_j(x) dx, j = 1, 2,

and that

y_i = 1, if x_i ≤ c,
    = 0, otherwise, i = 1, …, n.

Let π* = (ȳ − P₂)/(P₁ − P₂). Both π̂ and π* are unbiased estimators of π₁ and, if H₀ is true, π̂ and π* should be similar. A possible test statistic is thus the standardized version of

π̂ − π*.
A helpful feature is that var(π̂ − π*) is independent of π₁ if c = ½(μ₁ + μ₂). Johnson (1973) gives values for the power of 5 per cent size tests of this type, for the case of normal components, against the alternative

H₁: distribution is N(μ, σ²).

A special circumstance arises if μ = ½(μ₁ + μ₂), because then the power remains constant as the sample size increases. Some further tests are briefly described by Johnson (1973).
Two-sample versions of the test statistics π̂ and π* are used by Choi (1979) to test whether or not two mixtures of the same two known component densities are identical. Comparison is made with the Wilcoxon Mann-Whitney test and the two-sample Kolmogorov-Smirnov test. If the components are normal, the Kolmogorov-Smirnov test is less powerful than the other three, which are all comparable. With exponential components, the Kolmogorov-Smirnov test again does worst, but the test based on π̂ is also comparatively poor unless the exponential parameters differ appreciably. In view of its simplicity, the test based on π* seems the most attractive. (The recommendation is that c should be taken to be the sample median of the combined sample: a data-based choice!)
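Johnson's two estimators are straightforward to compute. The following sketch assumes hypothetical N(0, 1) and N(3, 1) components and takes c at the midpoint ½(μ₁ + μ₂), as suggested above; the function names and parameter values are illustrative assumptions, not Johnson's notation.

```python
import math
import random
import statistics

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def johnson_estimators(xs, mu1, mu2, c, F1, F2):
    """Moment estimator pi_hat = (xbar - mu2)/(mu1 - mu2) and counting
    estimator pi_star = (ybar - P2)/(P1 - P2), where P_j = F_j(c) is the
    known component CDF evaluated at the cut-off c."""
    xbar = statistics.fmean(xs)
    pi_hat = (xbar - mu2) / (mu1 - mu2)
    P1, P2 = F1(c), F2(c)
    ybar = sum(1 for x in xs if x <= c) / len(xs)
    pi_star = (ybar - P2) / (P1 - P2)
    return pi_hat, pi_star

# Hypothetical components N(0, 1) and N(3, 1), true pi_1 = 0.7,
# cut-off c = (mu1 + mu2)/2 = 1.5.
random.seed(2)
xs = [random.gauss(0, 1) if random.random() < 0.7 else random.gauss(3, 1)
      for _ in range(2000)]
pi_hat, pi_star = johnson_estimators(
    xs, mu1=0.0, mu2=3.0, c=1.5,
    F1=lambda t: normal_cdf(t, 0.0, 1.0),
    F2=lambda t: normal_cdf(t, 3.0, 1.0))
# both estimators should land near the true mixing weight 0.7
```

Under H₀ the two estimates should agree to within sampling error, so their standardized difference provides the test statistic described above.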
Example 5.3.4 Mixture of two univariate normals
H₀: x₁, …, x_n ~ N(μ, σ²), independently.
H₁: for some m, and some permutation τ of {1, …, n},
x_τ(1), …, x_τ(m) ~ N(μ₁, σ₁²), x_τ(m+1), …, x_τ(n) ~ N(μ₂, σ₂²).
The test statistic used is the maximum ratio of the between-groups sum of squares to the within-groups sum of squares,

F_max = max n₁n₂(x̄₁ − x̄₂)² / {n[(n₁ − 1)s₁² + (n₂ − 1)s₂²]},

where the maximum is over all partitionings of the data set into two groups and the other notation is as usual. Calculation of F_max in practice is eased by the fact that the optimal partitioning is given by a single cut-off in the order statistics; see Engelman and Hartigan (1969), who give critical values F_max(α) for tests of size (1 − α). Approximately, for large n,

log_e[F_max(α) + 1] = −log_e(1 − 2/π) + z_α/√(n − 2) + 2.4/(n − 2),

where z_α is the 100α-percentile of the standard normal distribution.
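The single cut-off property makes F_max cheap to compute from the sorted sample. Here is a minimal sketch, assuming the form of F_max displayed above; the function names and the simulation are illustrative.

```python
import math
import random

def f_max(xs):
    """Maximum between/within sum-of-squares ratio over two-group
    partitions.  The optimum is a single cut-off in the order
    statistics (Engelman and Hartigan, 1969), so only n - 1 splits
    of the sorted sample need to be scanned."""
    xs = sorted(xs)
    n = len(xs)
    total, total_sq = sum(xs), sum(x * x for x in xs)
    best, s1, ss1 = 0.0, 0.0, 0.0
    for m in range(1, n):            # group 1 = xs[:m], group 2 = xs[m:]
        s1 += xs[m - 1]
        ss1 += xs[m - 1] ** 2
        n1, n2 = m, n - m
        s2, ss2 = total - s1, total_sq - ss1
        within = (ss1 - s1 * s1 / n1) + (ss2 - s2 * s2 / n2)
        between = n1 * n2 * (s1 / n1 - s2 / n2) ** 2 / n
        if within > 0.0:
            best = max(best, between / within)
    return best

# Under H0 (a single normal sample), log(F_max + 1) should be close to
# -log(1 - 2/pi), about 1.01, in line with the large-n approximation.
random.seed(3)
stat = math.log(f_max([random.gauss(0, 1) for _ in range(400)]) + 1.0)
```

Scanning the sorted sample gives an O(n log n) computation overall, dominated by the sort, rather than an exhaustive search over all 2ⁿ partitions.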
Extension of this technique to higher dimensions and to more than two components is discussed by Hartigan (1977). The statistic in Example 5.3.4 is asymptotically normally distributed, but this may not hold in more than one dimension. Hartigan (1977) makes some conjectures about the asymptotics.
This example brings us close to cluster analysis. In Section 4.3 we encountered a clustering-based method for classification of observations and parameter estimation, originally proposed by Scott and Symons (1971). This relied on fixing k, but the whole procedure can be embedded in one of the admittedly ad hoc routines for choosing the number of clusters (see Everitt, 1980, Section 3.4).
Example 5.3.5 Mixture of two Poissons
Let

p(x) = π₁Po(x, θ₁) + (1 − π₁)Po(x, θ₂), x = 0, 1, …,

where Po(x, θ) denotes a Poisson probability mass function with mean θ. Then

E(X) = π₁θ₁ + (1 − π₁)θ₂,
var(X) = π₁θ₁ + (1 − π₁)θ₂ + π₁(1 − π₁)(θ₁ − θ₂)².

Thus, if 0 < π₁ < 1, var(X) − E(X) > 0, and a one-sided test based on s² − x̄ suggests itself, where x̄ is the sample mean and s² the sample variance. If there is a pure component, with mean θ,
√n(s² − x̄)/(x̄√2) ~ N(0, 1),
approximately (see Tiago de Oliveira, 1965).
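This dispersion test can be sketched as follows, using the normal approximation with x̄ substituted for the unknown mean θ. The Poisson sampler, parameter values, and function names are all illustrative assumptions.

```python
import math
import random
import statistics

def poisson_dispersion_z(xs):
    """One-sided dispersion statistic: under H0 (a single Poisson
    component) sqrt(n)*(s^2 - xbar)/(xbar*sqrt(2)) is approximately
    N(0, 1); a mixture of two Poissons inflates s^2 - xbar."""
    n = len(xs)
    xbar = statistics.fmean(xs)
    s2 = statistics.variance(xs)     # sample variance, divisor n - 1
    return math.sqrt(n) * (s2 - xbar) / (xbar * math.sqrt(2.0))

def sample_poisson(mean, rng):
    # Knuth's multiplication method; adequate for the small means here
    L = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(4)
null_data = [sample_poisson(3.0, rng) for _ in range(1000)]
mix_data = [sample_poisson(1.0 if rng.random() < 0.5 else 6.0, rng)
            for _ in range(1000)]
z0 = poisson_dispersion_z(null_data)  # should look like a N(0, 1) draw
z1 = poisson_dispersion_z(mix_data)   # should be large and positive
```

A large positive value of the statistic is evidence of the overdispersion that any genuine two-component Poisson mixture must exhibit.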