x' = Ax,  x(0) = x0,  (i)
where A is a constant matrix and x0 a prescribed vector.
(a) Assuming that a solution x = φ(t) exists, show that it must satisfy the integral equation
φ(t) = x0 + ∫₀ᵗ Aφ(s) ds.  (ii)
(b) Start with the initial approximation φ^(0)(t) = x0. Substitute this expression for φ(s) in the right side of Eq. (ii) and obtain a new approximation φ^(1)(t). Show that
φ^(1)(t) = (I + At)x0.  (iii)
(c) Repeat this process and thereby obtain a sequence of approximations φ^(0), φ^(1), φ^(2), ..., φ^(n), ... . Use an inductive argument to show that
φ^(n)(t) = (I + At + A^2 t^2/2! + ··· + A^n t^n/n!) x0.  (iv)
(d) Let n → ∞ and show that the solution of the initial value problem (i) is
φ(t) = exp(At)x0.  (v)
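Parts (b)–(d) can be checked numerically. The sketch below (Python with NumPy and SciPy; the function name picard_approx is ours, and the matrix and vector are arbitrary illustrations) builds the n-th partial sum of Eq. (iv) and compares it with exp(At)x0 computed by scipy.linalg.expm.

```python
import numpy as np
from scipy.linalg import expm

def picard_approx(A, x0, t, n):
    """n-th successive approximation (I + At + ... + A^n t^n / n!) x0 from Eq. (iv)."""
    term = np.eye(A.shape[0])       # current term A^k t^k / k!, starting at I
    total = term.copy()
    for k in range(1, n + 1):
        term = term @ (A * t) / k   # multiply the previous term by At/k
        total += term
    return total @ x0

A = np.array([[1.0, -1.0], [1.0, 3.0]])   # arbitrary example matrix
x0 = np.array([1.0, 0.0])
t = 0.5

approx = picard_approx(A, x0, t, 30)
exact = expm(A * t) @ x0                  # phi(t) = exp(At) x0, Eq. (v)
print(np.allclose(approx, exact))         # → True
```

Thirty terms of the series already agree with exp(At)x0 to machine precision here, reflecting the factorial convergence of the partial sums in Eq. (iv).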
7.8 Repeated Eigenvalues
We conclude our consideration of the linear homogeneous system with constant coefficients
x' = Ax (1)
with a discussion of the case in which the matrix A has a repeated eigenvalue. Recall that in Section 7.3 we noted that a repeated eigenvalue with multiplicity k may be associated with fewer than k linearly independent eigenvectors. The following example illustrates this possibility.
Find the eigenvalues and eigenvectors of the matrix
A = ( 1  -1 )
    ( 1   3 ).  (2)
The eigenvalues r and eigenvectors ξ satisfy the equation (A - rI)ξ = 0, or
( 1 - r    -1  ) ( ξ1 )   ( 0 )
(   1    3 - r ) ( ξ2 ) = ( 0 ).  (3)
The eigenvalues are the roots of the equation
det(A - rI) = (1 - r)(3 - r) + 1 = r^2 - 4r + 4 = 0.
Thus the two eigenvalues are r1 = 2 and r2 = 2; that is, the eigenvalue 2 has multiplicity 2.
To determine the eigenvectors we must return to Eq. (3) and use for r the value 2. This gives
( -1  -1 ) ( ξ1 )   ( 0 )
(  1   1 ) ( ξ2 ) = ( 0 ).
Hence we obtain the single condition ξ1 + ξ2 = 0, which determines ξ2 in terms of ξ1, or vice versa. Thus the eigenvector corresponding to the eigenvalue r = 2 is
ξ^(1) = (  1 )
        ( -1 ),
or any nonzero multiple of this vector. Observe that there is only one linearly independent eigenvector associated with the double eigenvalue.
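This conclusion is easy to confirm numerically. A minimal check in Python with NumPy (A is the matrix of Eq. (2)): both roots of the characteristic equation equal 2, the vector (1, -1) satisfies Aξ = 2ξ, and A - 2I has rank 1, so the eigenspace is only one-dimensional.

```python
import numpy as np

A = np.array([[1.0, -1.0], [1.0, 3.0]])   # the matrix of Eq. (2)

# Characteristic equation det(A - rI) = r^2 - 4r + 4 = (r - 2)^2 = 0
eigvals = np.linalg.eigvals(A)
print(eigvals)                            # both eigenvalues equal 2

# xi = (1, -1) is an eigenvector: A xi = 2 xi
xi = np.array([1.0, -1.0])
print(np.allclose(A @ xi, 2 * xi))        # → True

# A - 2I has rank 1, so its null space (the eigenspace) is one-dimensional:
# only one linearly independent eigenvector for the double eigenvalue.
print(np.linalg.matrix_rank(A - 2 * np.eye(2)))   # → 1
```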
Returning to the system (1), suppose that r = ρ is a k-fold root of the determinantal equation
det(A - rI) = 0.
Then ρ is an eigenvalue of multiplicity k of the matrix A. In this event, there are two possibilities: either there are k linearly independent eigenvectors corresponding to the eigenvalue ρ, or else there are fewer than k such eigenvectors.
In the first case, let ξ^(1), ..., ξ^(k) be k linearly independent eigenvectors associated with the eigenvalue ρ of multiplicity k. Then x^(1)(t) = ξ^(1) e^(ρt), ..., x^(k)(t) = ξ^(k) e^(ρt) are k linearly independent solutions of Eq. (1). Thus in this case it makes no difference that the eigenvalue r = ρ is repeated; there is still a fundamental set of solutions of Eq. (1) of the form ξe^(rt). This case always occurs if the coefficient matrix A is Hermitian.
However, if the coefficient matrix is not Hermitian, then there may be fewer than k independent eigenvectors corresponding to an eigenvalue ρ of multiplicity k, and if so, there will be fewer than k solutions of Eq. (1) of the form ξe^(ρt) associated with this eigenvalue. Therefore, to construct the general solution of Eq. (1) it is necessary to find other solutions of a different form. By analogy with previous results for linear equations of order n, it is natural to seek additional solutions involving products of polynomials and exponential functions. We first consider an example.
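The dichotomy can be made computational: the number of independent eigenvectors for an eigenvalue ρ is n - rank(A - ρI). A small sketch (Python/NumPy; the helper name eigenvector_count and the symmetric matrix H are our illustrations) contrasts a Hermitian matrix, which always gets a full set of eigenvectors, with the non-Hermitian matrix of Eq. (2), which does not.

```python
import numpy as np

def eigenvector_count(A, rho):
    """Dimension of the eigenspace of rho: n - rank(A - rho*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - rho * np.eye(n))

# Hermitian (here real symmetric) example: eigenvalue 2 of multiplicity 2
# has two linearly independent eigenvectors.
H = np.array([[2.0, 0.0], [0.0, 2.0]])
print(eigenvector_count(H, 2.0))   # → 2

# Non-Hermitian matrix of Eq. (2): eigenvalue 2 also has multiplicity 2,
# but only one independent eigenvector, so solutions of the form xi*e^(2t)
# cannot by themselves form a fundamental set.
A = np.array([[1.0, -1.0], [1.0, 3.0]])
print(eigenvector_count(A, 2.0))   # → 1
```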
Find a fundamental set of solutions of
x' = Ax = ( 1  -1 )
          ( 1   3 ) x  (8)
and draw a phase portrait for this system.
FIGURE 7.8.1 A direction field for the system (8).
A direction field for the system (8) is shown in Figure 7.8.1. From this figure it appears that all nonzero solutions depart from the origin.
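This behavior can be spot-checked, since the repeated eigenvalue r = 2 is positive. A short sketch (Python with NumPy/SciPy; it assumes the coefficient matrix of system (8) is the matrix of Eq. (2) analyzed above, and the starting point x0 is an arbitrary choice of ours) evaluates x(t) = exp(At)x0 at a few times and watches ‖x(t)‖ grow.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, -1.0], [1.0, 3.0]])   # matrix of Eq. (2), assumed for system (8)
x0 = np.array([0.3, -0.7])                # arbitrary nonzero initial point

# x(t) = exp(At) x0; with the double eigenvalue r = 2 > 0, every nonzero
# solution grows in norm, so trajectories move away from the origin.
norms = [float(np.linalg.norm(expm(A * t) @ x0)) for t in (0.0, 1.0, 2.0, 3.0)]
print(norms)   # strictly increasing
```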