
# Elementary Differential Equations, 7th edition - Boyce W.E.

Boyce W.E. Elementary Differential Equations, 7th edition. Wiley, 2001. 1310 p.
ISBN 0-471-31999-6

EXAMPLE 5

Find the eigenvalues and eigenvectors of the matrix

$$A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix}. \tag{39}$$

The eigenvalues λ and eigenvectors x satisfy the equation (A − λI)x = 0, or

$$\begin{pmatrix} -\lambda & 1 & 1 \\ 1 & -\lambda & 1 \\ 1 & 1 & -\lambda \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}. \tag{40}$$

The eigenvalues are the roots of the equation

$$\det(A - \lambda I) = \begin{vmatrix} -\lambda & 1 & 1 \\ 1 & -\lambda & 1 \\ 1 & 1 & -\lambda \end{vmatrix} = -\lambda^3 + 3\lambda + 2 = 0. \tag{41}$$
7.3 Systems of Linear Algebraic Equations; Linear Independence, Eigenvalues, Eigenvectors 365
The roots of Eq. (41) are found by factoring: −λ³ + 3λ + 2 = −(λ − 2)(λ + 1)² = 0, so λ1 = 2, λ2 = −1, and λ3 = −1. Thus 2 is a simple eigenvalue, and −1 is an eigenvalue of multiplicity 2.
To find the eigenvector x(1) corresponding to the eigenvalue λ1, we substitute λ = 2 in Eq. (40); this gives the system

$$\begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}. \tag{42}$$

We can reduce this to the equivalent system

$$\begin{pmatrix} 2 & -1 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \tag{43}$$

by elementary row operations. Solving this system we obtain the eigenvector

$$x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}. \tag{44}$$
For λ = −1, Eqs. (40) reduce immediately to the single equation

$$x_1 + x_2 + x_3 = 0. \tag{45}$$

Thus values for two of the quantities x1, x2, x3 can be chosen arbitrarily and the third is determined from Eq. (45). For example, if x1 = 1 and x2 = 0, then x3 = −1, and

$$x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} \tag{46}$$
is an eigenvector. Any nonzero multiple of x(2) is also an eigenvector, but a second independent eigenvector can be found by making another choice of x1 and x2; for instance, x1 = 0 and x2 = 1. Again x3 = −1, and

$$x^{(3)} = \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix} \tag{47}$$

is an eigenvector linearly independent of x(2). Therefore in this example two linearly independent eigenvectors are associated with the double eigenvalue.
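The computation in Example 5 is easy to check numerically. A minimal sketch with NumPy (the use of NumPy here is an illustrative aid, not part of the text):

```python
import numpy as np

# The matrix of Example 5, Eq. (39).
A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# numpy.linalg.eig returns the eigenvalues and unit-norm eigenvectors
# (stored as the columns of the second return value).
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))  # -1, -1, 2, up to rounding

# Verify A x = lambda x for each eigenvalue/eigenvector pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

Since A is real and symmetric, `numpy.linalg.eigh` could be used instead; it exploits the symmetry and returns the eigenvalues already sorted.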
An important special class of matrices, called self-adjoint or Hermitian matrices, consists of those for which A* = A; that is, ā_ji = a_ij. Hermitian matrices include as a subclass real symmetric matrices, that is, matrices that have real elements and for which A^T = A. The eigenvalues and eigenvectors of Hermitian matrices always have the following useful properties:
1. All eigenvalues are real.
2. There always exists a full set of n linearly independent eigenvectors, regardless of the multiplicities of the eigenvalues.
3. If x(1) and x(2) are eigenvectors that correspond to different eigenvalues, then (x(1), x(2)) = 0. Thus, if all eigenvalues are simple, then the associated eigenvectors form an orthogonal set of vectors.
4. Corresponding to an eigenvalue of multiplicity m, it is possible to choose m eigenvectors that are mutually orthogonal. Thus the full set of n eigenvectors can always be chosen to be orthogonal as well as linearly independent.
Example 5 above involves a real symmetric matrix and illustrates properties 1, 2, and 3, but the choice we have made for x(2) and x(3) does not illustrate property 4. However, it is always possible to choose an x(2) and x(3) so that (x(2), x(3)) = 0. For example, in Example 5 we could have chosen

$$x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \qquad x^{(3)} = \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}$$

as the eigenvectors associated with the eigenvalue λ = −1. These eigenvectors are orthogonal to each other as well as to the eigenvector x(1) corresponding to the eigenvalue λ = 2. The proofs of statements 1 and 3 above are outlined in Problems 32 and 33.
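Such an orthogonal pair can be produced from the non-orthogonal eigenvectors (46) and (47) by a single Gram-Schmidt step. A sketch in NumPy (an illustrative aid, not part of the text):

```python
import numpy as np

# Eigenvectors (46) and (47) for the double eigenvalue lambda = -1.
x2 = np.array([1.0, 0.0, -1.0])
x3 = np.array([0.0, 1.0, -1.0])

# One Gram-Schmidt step: replace x3 by its component orthogonal to x2.
# Any vector in the lambda = -1 eigenspace is still an eigenvector.
x3_orth = x3 - (x3 @ x2) / (x2 @ x2) * x2

assert np.isclose(x2 @ x3_orth, 0.0)       # the pair is now orthogonal
assert np.isclose(x2 @ np.ones(3), 0.0)    # both remain orthogonal to
assert np.isclose(x3_orth @ np.ones(3), 0.0)  # x(1) = (1, 1, 1)
print(x3_orth)  # a multiple of (1, -2, 1), namely (-0.5, 1.0, -0.5)
```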
PROBLEMS In each of Problems 1 through 5 either solve the given set of equations, or else show that there is no solution.
1. x1 − x3 = 0
   3x1 + x2 + x3 = 1
   −x1 + x2 + 2x3 = 2

2. x1 + 2x2 − x3 = 1
   2x1 + x2 + x3 = 1
   x1 − x2 + 2x3 = 1

3. x1 + 2x2 − x3 = 2
   2x1 + x2 + x3 = 1
   x1 − x2 + 2x3 = −1

4. x1 + 2x2 − x3 = 0
   2x1 + x2 + x3 = 0
   x1 − x2 + 2x3 = 0

5. x1 − x3 = 0
   3x1 + x2 + x3 = 0
   −x1 + x2 + 2x3 = 0
In each of Problems 6 through 10 determine whether the given set of vectors is linearly independent. If linearly dependent, find a linear relation among them. The vectors are written as row vectors to save space, but may be considered as column vectors; that is, the transposes of the given vectors may be used instead of the vectors themselves.
6. x(1) = (1, 1, 0), x(2) = (0, 1, 1), x(3) = (1, 0, 1)
7. x(1) = (2, 1, 0), x(2) = (0, 1, 0), x(3) = (-1, 2, 0)
8. x(1) = (1, 2, 2, 3), x(2) = (-1, 0, 3, 1), x(3) = (-2,-1, 1, 0),
x(4) = (-3, 0, -1, 3)
9. x(1) = (1, 2, -1, 0), x(2) = (2, 3, 1,-1), x(3) = (-1, 0, 2, 2),
x(4) = (3, -1, 1, 3)
10. x(1) = (1, 2, -2), x(2) = (3, 1, 0), x(3) = (2,-1, 1), x(4) = (4, 3, -2)
11. Suppose that the vectors x(1),..., x(m) each have n components, where n < m. Show that x(1),..., x(m) are linearly dependent.
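Independence questions like Problems 6 through 10 reduce to comparing the rank of the stacked matrix with the number of vectors; the same criterion makes the claim of Problem 11 plausible, since the rank can never exceed the number of components. A sketch (the helper name is hypothetical):

```python
import numpy as np

def linearly_independent(vectors):
    """A set of vectors is linearly independent exactly when the rank of
    the matrix whose rows are those vectors equals the number of vectors."""
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == len(vectors)

# Problem 6: independent (the 3x3 determinant is nonzero).
assert linearly_independent([(1, 1, 0), (0, 1, 1), (1, 0, 1)])

# Problem 7: dependent (all three vectors lie in the plane x3 = 0).
assert not linearly_independent([(2, 1, 0), (0, 1, 0), (-1, 2, 0)])

# Problem 11 in miniature: 4 vectors with 3 components must be dependent,
# because the rank of a 4x3 matrix is at most 3.
assert not linearly_independent([(1, 2, -2), (3, 1, 0), (2, -1, 1), (4, 3, -2)])
```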
In each of Problems 12 and 13 determine whether the given set of vectors is linearly independent for −∞ < t < ∞. If linearly dependent, find the linear relation among them. As in Problems 6 through 10 the vectors are written as row vectors to save space.
12. x(1)(t) = (e^{−t}, 2e^{−t}), x(2)(t) = (e^{−t}, e^{−t}), x(3)(t) = (3e^{−t}, 0)
13. x(1)(t) = (2 sin t, sin t), x(2)(t) = (sin t, 2 sin t)

14. Let

$$x^{(1)}(t) = \begin{pmatrix} e^t \\ t e^t \end{pmatrix}, \qquad x^{(2)}(t) = \begin{pmatrix} 1 \\ t \end{pmatrix}.$$
Show that x(1)(t) and x(2)(t) are linearly dependent at each point in the interval 0 ≤ t ≤ 1. Nevertheless, show that x(1)(t) and x(2)(t) are linearly independent on 0 ≤ t ≤ 1.
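The phenomenon behind Problem 14 can be seen numerically with a pair of vector functions of this kind (the particular pair below, x1(t) = (e^t, t e^t) and x2(t) = (1, t), is an illustrative choice):

```python
import numpy as np

# A pair of vector functions that are linearly dependent at every fixed t
# but linearly independent as functions on an interval.
def x1(t):
    return np.array([np.exp(t), t * np.exp(t)])

def x2(t):
    return np.array([1.0, t])

# At each fixed t, det[x1(t) | x2(t)] = t e^t - t e^t = 0, so the two
# vectors are parallel at every point of [0, 1]:
for t in np.linspace(0.0, 1.0, 11):
    assert np.isclose(np.linalg.det(np.column_stack([x1(t), x2(t)])), 0.0)

# But no single pair of constants (c1, c2) != (0, 0) gives
# c1*x1(t) + c2*x2(t) = 0 for all t: the first components at t = 0 force
# c1 = -c2, while at t = 1 they force c1*e = -c2, so c1 = c2 = 0.
```

The point is that the scalar relating the two vectors, e^t, varies with t, so pointwise dependence does not yield dependence as vector functions.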