
# Elementary Differential Equations, 7th Edition - Boyce W.E.

Boyce, W. E. Elementary Differential Equations, 7th edition. Wiley, 2001. 1310 p.
ISBN 0-471-31999-6

$$\begin{pmatrix} 2 & 3 & 1 \\ -1 & 2 & 1 \\ 4 & -1 & -1 \end{pmatrix}, \qquad \begin{pmatrix} 1 & -1 & 2 & 0 \\ -1 & 2 & -4 & 2 \\ 1 & 0 & 1 & 3 \\ -2 & 2 & 0 & -1 \end{pmatrix}$$
20. Prove that if there are two matrices B and C such that AB = I and AC = I, then B = C. This shows that a matrix A can have only one inverse.
21. If
$$\mathbf{A}(t) = \begin{pmatrix} e^t & 2e^{-t} & e^{2t} \\ 2e^t & e^{-t} & -e^{2t} \\ -e^t & 3e^{-t} & 2e^{2t} \end{pmatrix} \quad\text{and}\quad \mathbf{B}(t) = \begin{pmatrix} 2e^t & e^{-t} & 3e^{2t} \\ -e^t & 2e^{-t} & e^{2t} \\ 3e^t & -e^{-t} & -e^{2t} \end{pmatrix},$$
find
(a) $\mathbf{A} + 3\mathbf{B}$ &nbsp;&nbsp; (b) $\mathbf{AB}$ &nbsp;&nbsp; (c) $d\mathbf{A}/dt$ &nbsp;&nbsp; (d) $\int_0^1 \mathbf{A}(t)\,dt$
In each of Problems 22 through 24 verify that the given vector satisfies the given differential equation.
22. $\mathbf{x}' = \begin{pmatrix} 3 & -2 \\ 2 & -2 \end{pmatrix} \mathbf{x}$, $\quad \mathbf{x} = \begin{pmatrix} 4 \\ 2 \end{pmatrix} e^{2t}$

23. $\mathbf{x}' = \begin{pmatrix} 2 & -1 \\ 3 & -2 \end{pmatrix} \mathbf{x} + \begin{pmatrix} 1 \\ -1 \end{pmatrix} e^t$, $\quad \mathbf{x} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} e^t + 2\begin{pmatrix} 1 \\ 1 \end{pmatrix} t e^t$

24. $\mathbf{x}' = \begin{pmatrix} 1 & 1 & 1 \\ 2 & 1 & -1 \\ 0 & -1 & 1 \end{pmatrix} \mathbf{x}$, $\quad \mathbf{x} = \begin{pmatrix} 6 \\ -8 \\ -4 \end{pmatrix} e^{-t} + 2\begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix} e^{2t}$
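Verifications of this kind are easy to check by machine as well as by hand. The sketch below checks a nonhomogeneous system of the same shape as Problem 23, using a central difference to approximate $\mathbf{x}'$; the matrix, forcing term, and candidate solution entries are as printed here, so treat them as assumptions if your copy of the text differs.

```python
import numpy as np

# System of the form x' = Ax + g(t), as in Problem 23 (entries assumed).
A = np.array([[2.0, -1.0],
              [3.0, -2.0]])

def g(t):
    # Forcing term (1, -1)^T e^t
    return np.array([1.0, -1.0]) * np.exp(t)

def x(t):
    # Candidate solution (1, 0)^T e^t + 2 (1, 1)^T t e^t
    return (np.array([1.0, 0.0]) * np.exp(t)
            + 2 * np.array([1.0, 1.0]) * t * np.exp(t))

# Verify x'(t) = A x(t) + g(t) at several times via a central difference.
h = 1e-6
for t in [0.0, 0.7, 1.5]:
    dx = (x(t + h) - x(t - h)) / (2 * h)
    assert np.allclose(dx, A @ x(t) + g(t), rtol=1e-5)
```

A symbolic check (differentiating $\mathbf{x}(t)$ term by term) gives the same result exactly; the numerical version above is just the quickest sanity check.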
In each of Problems 25 and 26 verify that the given matrix satisfies the given differential equation.
25. $\mathbf{W}' = \begin{pmatrix} 1 & 1 \\ 4 & -2 \end{pmatrix} \mathbf{W}$, $\quad \mathbf{W}(t) = \begin{pmatrix} e^{-3t} & e^{2t} \\ -4e^{-3t} & e^{2t} \end{pmatrix}$

26. $\mathbf{W}' = \begin{pmatrix} 1 & -1 & 4 \\ 3 & 2 & -1 \\ 2 & 1 & -1 \end{pmatrix} \mathbf{W}$, $\quad \mathbf{W}(t) = \begin{pmatrix} e^t & e^{-2t} & e^{3t} \\ -4e^t & -e^{-2t} & 2e^{3t} \\ -e^t & -e^{-2t} & e^{3t} \end{pmatrix}$
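A matrix solution can be verified the same way as a vector solution: each column of $\mathbf{W}(t)$ must satisfy the system, or equivalently $\mathbf{W}' = \mathbf{A}\mathbf{W}$ as a matrix identity. A numerical sketch for the $2\times 2$ system of Problem 25 (entries as printed here; adjust if your copy differs):

```python
import numpy as np

# Matrix system W' = AW from Problem 25 (entries assumed).
A = np.array([[1.0, 1.0],
              [4.0, -2.0]])

def W(t):
    # Candidate matrix solution: columns e^{-3t}(1, -4)^T and e^{2t}(1, 1)^T.
    return np.array([[np.exp(-3*t),    np.exp(2*t)],
                     [-4*np.exp(-3*t), np.exp(2*t)]])

# Verify W'(t) = A W(t) at sample times via a central difference.
h = 1e-6
for t in [0.0, 0.5, 1.0]:
    dW = (W(t + h) - W(t - h)) / (2 * h)
    assert np.allclose(dW, A @ W(t), rtol=1e-4)
```

The columns here are eigenvector solutions, $(1, -4)^T e^{-3t}$ and $(1, 1)^T e^{2t}$, which is why each differentiates to $\mathbf{A}$ times itself.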
## 7.3 Systems of Linear Algebraic Equations; Linear Independence, Eigenvalues, Eigenvectors
In this section we review some results from linear algebra that are important for the solution of systems of linear differential equations. Some of these results are easily proved and others are not; since we are interested simply in summarizing some useful information in compact form, we give no indication of proofs in either case. All the results in this section depend on some basic facts about the solution of systems of linear algebraic equations.
Systems of Linear Algebraic Equations. A set of n simultaneous linear algebraic equations in n variables,
$$\begin{aligned} a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n &= b_1,\\ &\;\;\vdots \\ a_{n1}x_1 + a_{n2}x_2 + \cdots + a_{nn}x_n &= b_n, \end{aligned} \tag{1}$$
can be written as
Ax = b, (2)
where the n x n matrix A and the vector b are given, and the components of x are to be determined. If b = 0, the system is said to be homogeneous; otherwise, it is nonhomogeneous.
If the coefficient matrix A is nonsingular, that is, if det A is not zero, then there is a unique solution of the system (2). Since A is nonsingular, A^{-1} exists, and the solution can be found by multiplying each side of Eq. (2) on the left by A^{-1}; thus

$$\mathbf{x} = \mathbf{A}^{-1}\mathbf{b}. \tag{3}$$
In particular, the homogeneous problem Ax = 0, corresponding to b = 0 in Eq. (2), has only the trivial solution x = 0.
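As a concrete illustration of Eq. (3), here is a minimal numerical sketch; the matrix and right-hand side are mine, chosen only so that A is nonsingular.

```python
import numpy as np

# A small nonsingular system (entries are illustrative, not from the text).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

assert np.linalg.det(A) != 0          # det A = 5, so A is nonsingular

x = np.linalg.inv(A) @ b              # x = A^{-1} b, as in Eq. (3)

# Numerically it is better to factor A than to form A^{-1} explicitly;
# np.linalg.solve does exactly that and returns the same unique solution.
assert np.allclose(x, np.linalg.solve(A, b))
assert np.allclose(A @ x, b)
```

Equation (3) is the theoretical statement; in computation one almost always uses the factor-and-solve route rather than forming the inverse.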
On the other hand, if A is singular, that is, if det A is zero, then solutions of Eq. (2) either do not exist, or do exist but are not unique. Since A is singular, A^{-1} does not exist, so Eq. (3) is no longer valid. The homogeneous system

$$\mathbf{A}\mathbf{x} = \mathbf{0} \tag{4}$$
has (infinitely many) nonzero solutions in addition to the trivial solution. The situation for the nonhomogeneous system (2) is more complicated. This system has no solution unless the vector b satisfies a certain further condition. This condition is that
$$(\mathbf{b}, \mathbf{y}) = 0, \tag{5}$$
for all vectors y satisfying A*y = 0, where A* is the adjoint of A. If condition (5) is met, then the system (2) has (infinitely many) solutions. Each of these solutions has the form
$$\mathbf{x} = \mathbf{x}^{(0)} + \boldsymbol{\xi}, \tag{6}$$

where x^{(0)} is a particular solution of Eq. (2), and ξ is any solution of the homogeneous system (4). Note the resemblance between Eq. (6) and the solution of a nonhomogeneous linear differential equation. The proofs of some of the preceding statements are outlined in Problems 25 through 29.
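Condition (5) can be checked mechanically. For a real matrix the adjoint A* is just the transpose, so the vectors y in question span the null space of A^T, which the SVD exposes. A minimal sketch with a singular matrix of my own choosing:

```python
import numpy as np

# A singular matrix (det A = 0); entries are illustrative, not from the text.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
assert abs(np.linalg.det(A)) < 1e-12

# For a real matrix the adjoint A* is the transpose. The right singular
# vector belonging to the zero singular value spans the null space of A^T.
_, s, Vt = np.linalg.svd(A.T)
y = Vt[-1]
assert np.allclose(A.T @ y, 0.0, atol=1e-12)

# Condition (5): Ax = b is solvable exactly when (b, y) = 0.
b_solvable = np.array([1.0, 2.0])    # (b, y) = 0: infinitely many solutions
b_unsolvable = np.array([1.0, 0.0])  # (b, y) != 0: no solution
assert abs(b_solvable @ y) < 1e-12
assert abs(b_unsolvable @ y) > 0.1
```

In the solvable case the solutions form exactly the family (6): one particular solution plus any multiple of a null vector of A.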
The results in the preceding paragraph are important as a means of classifying the solutions of linear systems. However, for solving particular systems it is generally best to use row reduction to transform the system into a much simpler one from which the solution(s), if there are any, can be written down easily. To do this efficiently we can form the augmented matrix
$$\mathbf{A} \,|\, \mathbf{b} = \left(\begin{array}{ccc|c} a_{11} & \cdots & a_{1n} & b_1 \\ \vdots & & \vdots & \vdots \\ a_{n1} & \cdots & a_{nn} & b_n \end{array}\right) \tag{7}$$
by adjoining the vector b to the coefficient matrix A as an additional column. The dashed line replaces the equals sign and is said to partition the augmented matrix. We now perform row operations on the augmented matrix so as to transform A into a triangular matrix, that is, a matrix whose elements below the main diagonal are all zero. Once this is done, it is easy to see whether the system has solutions, and to find them if it does. Observe that elementary row operations on the augmented matrix (7) correspond to legitimate operations on the equations in the system (1). The following examples illustrate the process.
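The triangularization step just described can be sketched in code. This is a minimal sketch of forward elimination on an augmented matrix; the function name is mine, and for simplicity it assumes every pivot it meets is nonzero (so it performs no row interchanges, matching the hand computation below).

```python
import numpy as np

def forward_eliminate(aug):
    """Reduce an augmented matrix [A | b] to upper triangular form with
    elementary row operations. Assumes nonzero pivots (no row swaps)."""
    M = aug.astype(float).copy()
    n = M.shape[0]
    for k in range(n - 1):
        for i in range(k + 1, n):
            M[i] -= (M[i, k] / M[k, k]) * M[k]   # zero the entry below the pivot
    return M

# Applied to the augmented matrix (9) of Example 1 below:
aug = np.array([[ 1.0, -2.0,  3.0,  7.0],
                [-1.0,  1.0, -2.0, -5.0],
                [ 2.0, -1.0, -1.0,  4.0]])
U = forward_eliminate(aug)
assert np.allclose(np.tril(U[:, :3], -1), 0.0)   # A-part is now triangular
```

Its first pass (zeroing column 1) reproduces step (a) of Example 1; each in-place row update is a legitimate operation on the corresponding equation.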
**Example 1.** Solve the system of equations
$$\begin{aligned} x_1 - 2x_2 + 3x_3 &= 7, \\ -x_1 + x_2 - 2x_3 &= -5, \\ 2x_1 - x_2 - x_3 &= 4. \end{aligned} \tag{8}$$

The augmented matrix for the system (8) is

$$\left(\begin{array}{ccc|c} 1 & -2 & 3 & 7 \\ -1 & 1 & -2 & -5 \\ 2 & -1 & -1 & 4 \end{array}\right). \tag{9}$$
We now perform row operations on the matrix (9) with a view to introducing zeros in the lower left part of the matrix. Each step is described and the result recorded below.
(a) Add the first row to the second row and add (−2) times the first row to the third row.

$$\left(\begin{array}{ccc|c} 1 & -2 & 3 & 7 \\ 0 & -1 & 1 & 2 \\ 0 & 3 & -7 & -10 \end{array}\right)$$

(b) Multiply the second row by −1.
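Completing the elimination and back-substituting yields the solution of system (8), which can be cross-checked numerically; `np.linalg.solve` uses an LU factorization, i.e. the same elimination idea carried out by the library.

```python
import numpy as np

# Coefficient matrix and right-hand side of system (8).
A = np.array([[ 1.0, -2.0,  3.0],
              [-1.0,  1.0, -2.0],
              [ 2.0, -1.0, -1.0]])
b = np.array([7.0, -5.0, 4.0])

x = np.linalg.solve(A, b)
assert np.allclose(x, [2.0, -1.0, 1.0])   # x1 = 2, x2 = -1, x3 = 1
assert np.allclose(A @ x, b)              # substituting back into (8)
```

Substituting (2, −1, 1) into each equation of (8) confirms it directly: 2 + 2 + 3 = 7, −2 − 1 − 2 = −5, and 4 + 1 − 1 = 4.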