w_t ~ N(0, Q_t), (6.5.2)
with mean 0 and known covariance matrix Q_t. Equation (6.5.1) is referred to as the system equation.
However, we cannot observe the underlying system directly; we can merely observe some aspects of it, and our observations are further subject to measurement error. In general, therefore, we have, in addition to (6.5.1), a measurement equation, which we assume to be of the form
x_t = H_t θ_t + v_t, (6.5.3)
where, with subscripts again denoting time, x_t (r × 1), r ≤ p, is the observation, H_t (r × p) is a known matrix, and v_t (r × 1) is the measurement noise, with a normal distribution
v_t ~ N(0, R_t), (6.5.4)
with mean 0 and known covariance matrix R_t.
From (6.5.1) and (6.5.3) it is easily shown, using Bayes' theorem and induction, that the posterior distribution for θ_t, conditional on x_1, ..., x_t, is N(m_t, C_t), where
m_t = A_t m_{t−1} + C_t H_t^T R_t^{−1} (x_t − H_t A_t m_{t−1}), (6.5.5)
C_t^{−1} = H_t^T R_t^{−1} H_t + (A_t C_{t−1} A_t^T + Q_t)^{−1}, (6.5.6)
and it is assumed that, initially, θ_0 ~ N(m_0, C_0). The recursive equations (6.5.5) and (6.5.6) are known as the Kalman filter updating equations (see Jazwinski, 1970).
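One step of the recursion defined by (6.5.5) and (6.5.6) can be sketched numerically as follows. This is a minimal illustration, not the book's own code; the function and variable names are ours, and NumPy is assumed to be available.

```python
import numpy as np

def kalman_update(m_prev, C_prev, x, A, H, Q, R):
    """One step of the Kalman filter recursion (6.5.5)-(6.5.6).

    Given the posterior N(m_prev, C_prev) for theta_{t-1}, return the
    posterior mean m_t and covariance C_t for theta_t after observing x_t.
    """
    # Prior for theta_t implied by the system equation (6.5.1):
    # theta_t | x_1, ..., x_{t-1} ~ N(A m_prev, A C_prev A^T + Q)
    m_pred = A @ m_prev
    P = A @ C_prev @ A.T + Q
    # (6.5.6): C_t^{-1} = H^T R^{-1} H + P^{-1}
    C = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(P))
    # (6.5.5): m_t = A m_prev + C H^T R^{-1} (x - H A m_prev)
    m = m_pred + C @ H.T @ np.linalg.inv(R) @ (x - H @ m_pred)
    return m, C
```

In the scalar random-walk case A = H = Q = R = 1, starting from m_0 = 0, C_0 = 1, an observation x_1 = 3 gives prior variance P = 2 and hence C_1 = 2/3 and m_1 = 2, which can be checked directly against (6.5.5) and (6.5.6).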
We have already discussed, in Example 2.2.3, the use of Gaussian sums to
model non-normal noise distributions in DLMs. The following examples illustrate
other kinds of practical problems which, using variants of (6.5.1) and (6.5.3), lead to finite mixture forms.
Sequential problems and procedures
Example 6.5.1 Signal versus noise
With appropriate interpretations of θ_t and x_t as spatial coordinates and velocities, together with choices of A_t and H_t which reflect target and observer characteristics, (6.5.1) and (6.5.3) serve to model many target-tracking situations. See, for example, Singer, Sea, and Housewright (1974), Bar-Shalom and Tse (1973), Bar-Shalom (1978), and Gauvrit (1984).
In such situations, where the observations (x_t) represent radar or similar signals, there is always the possibility that a particular x_t is simply ‘noise’, which may, for example, be a signal from another, unrelated target whose track is of no interest. If ‘noise’ is assumed to generate an x_t having an N(0, S_t) distribution, with S_t known, then the original measurement equation defined by (6.5.3) and (6.5.4) should be replaced by
x_t ~ π N(H_t θ_t, R_t) + (1 − π) N(0, S_t),
where the mixing weight π denotes the proportion of observations which are true signals.
Once x_t has been observed, the posterior distribution for θ_t is easily seen to be a mixture of two normal distributions. The first corresponds to updating on the basis of an assumed signal and has mean and covariance matrix as given by (6.5.5) and (6.5.6); the second corresponds to no updating, on the assumption that the observation was pure noise, and has mean and covariance matrix m_{t−1} and C_{t−1}. The weight on the first posterior component is the posterior probability that x_t really was a signal.
By extension, given observations x_1, ..., x_n, the corresponding posterior distribution is a mixture of 2^n components, each corresponding to a particular assumed history of the process (i.e. a string of signal or noise identifications).
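The posterior signal probability that weights the first component can be sketched in the scalar case as follows. This is our own illustrative code, not taken from the references: under the ‘signal’ hypothesis the predictive distribution of x_t is N(h a m_{t−1}, h²(a² C_{t−1} + q) + r), and under ‘noise’ it is N(0, s); the weight is then obtained from Bayes’ theorem.

```python
import math

def norm_pdf(x, mean, var):
    """Density of a N(mean, var) distribution at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def signal_posterior_prob(x, pi, m_prev, C_prev, a, h, q, r, s):
    """Posterior probability that scalar observation x is a true signal.

    Scalar sketch of Example 6.5.1: pi is the prior signal weight, and
    a, h, q, r, s are the scalar versions of A_t, H_t, Q_t, R_t, S_t.
    """
    p = a * a * C_prev + q                       # prior variance of theta_t
    lik_signal = norm_pdf(x, h * a * m_prev, h * h * p + r)
    lik_noise = norm_pdf(x, 0.0, s)
    w = pi * lik_signal
    return w / (w + (1 - pi) * lik_noise)
```

As one would expect, an observation far from 0 (relative to s) pushes the posterior weight towards the signal hypothesis, while an observation near 0 favours whichever predictive density is more concentrated there.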
Example 6.5.2 Tracking a manoeuvring or wandering target
A useful model for the system states of a target which can ‘manoeuvre’ (i.e. can introduce sudden changes, for example in velocity or position, into its system evolution) or ‘wander’ (i.e. is subject to externally generated perturbations to its system evolution) is obtained by extending (6.5.1) to the more general form
θ_t = A_t θ_{t−1} + B_t u_t + w_t, (6.5.7)
where B_t is a known matrix and u_t represents the possible ‘jumps’ of the system.
In some cases, the limited manoeuvrability of the target implies that u_t is one of a finite set of options (including a ‘null’ option which corresponds to ‘no manoeuvre’). In other cases, ‘wandering’ may be caused by sudden external impulses, so that u_t may be modelled as a normally distributed shock to the system.
For any given version of (6.5.7), recursive learning is simply accomplished using the Kalman filter updating equations. However, any uncertainty about manoeuvres or impulses implies, in effect, that (6.5.7) should be replaced by a
finite mixture model, with weights reflecting the relative plausibility of the various manoeuvres, etc., that might obtain.
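For a finite set of manoeuvre options, the mixture weights can be computed from the predictive likelihood of each option, as the following scalar sketch shows. The function and its parameterization are our own illustration of (6.5.7), with a, b, h, q, r the scalar versions of A_t, B_t, H_t, Q_t, R_t: under option u, the observation has predictive mean h(a m_{t−1} + b u) and the same predictive variance for every option, so the normalizing constant of the normal density cancels.

```python
import math

def manoeuvre_weights(x, options, priors, m_prev, C_prev, a, b, h, q, r):
    """Posterior weights over a finite set of manoeuvre options (scalar case).

    Under option u, theta_t = a*theta_{t-1} + b*u + w_t as in (6.5.7);
    'options' lists the candidate jumps u and 'priors' their prior
    plausibilities.  Returns the normalized posterior weights.
    """
    var = h * h * (a * a * C_prev + q) + r       # common predictive variance
    # Predictive likelihood of x under each option (constant factor dropped)
    liks = [math.exp(-(x - h * (a * m_prev + b * u)) ** 2 / (2 * var))
            for u in options]
    ws = [p * l for p, l in zip(priors, liks)]
    total = sum(ws)
    return [w / total for w in ws]
```

An observation close to the prediction under the ‘null’ option then concentrates the posterior weight on ‘no manoeuvre’, while an observation displaced by roughly h·b·u favours the corresponding jump.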
Of course, yet more complications can be introduced! Firstly, we could combine Examples 6.5.1 and 6.5.2, so that in addition to the uncertain nature of the various possible signals that might be received, we acknowledge that we might be receiving pure noise. Secondly, knowledge of the target’s manoeuvring tactics might lead to an additional Markov chain structure on the mixing weights.
Example 6.5.3 Monitoring renal transplants
Smith et al. (1983) and Smith and West (1983) have shown that, in the context of renal transplant monitoring, a suitably transformed biochemical series can be well represented as a particular case of a DLM with possible jumps.