
In Example 5 of Section 3.2 we verified that $y_1(t) = t^{1/2}$ and $y_2(t) = t^{-1}$ are solutions of the equation

$$2t^2 y'' + 3t y' - y = 0, \qquad t > 0. \tag{14}$$

Verify that the Wronskian of $y_1$ and $y_2$ is given by Eq. (13).

From the example just cited we know that $W(y_1, y_2)(t) = -(3/2)t^{-3/2}$. To use Eq. (13) we must write the differential equation (14) in the standard form with the coefficient of $y''$ equal to 1. Thus we obtain

$$y'' + \frac{3}{2t}\, y' - \frac{1}{2t^2}\, y = 0,$$

so $p(t) = 3/(2t)$. Hence

$$W(y_1, y_2)(t) = c \exp\left[-\int \frac{3}{2t}\, dt\right] = c \exp\left(-\frac{3}{2} \ln t\right) = c\, t^{-3/2}. \tag{15}$$

Equation (15) gives the Wronskian of any pair of solutions of Eq. (14). For the particular solutions given in this example we must choose $c = -3/2$.
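The calculation leading to Eq. (15) can be checked symbolically. The sketch below is ours, not the text's, and assumes the sympy library is available; it computes the Wronskian of $y_1$ and $y_2$ directly and compares it with the result of Eq. (13):

```python
import sympy as sp

t = sp.symbols('t', positive=True)   # t > 0, as in Eq. (14)
y1 = t**sp.Rational(1, 2)
y2 = 1 / t

# Direct computation: W(y1, y2) = y1*y2' - y1'*y2
W = sp.simplify(y1 * sp.diff(y2, t) - sp.diff(y1, t) * y2)
print(W)  # -3/(2*t**(3/2)), i.e. -(3/2) t^(-3/2)

# Eq. (13): W = c * exp(-integral of p(t) dt), with p(t) = 3/(2t)
p = sp.Rational(3, 2) / t
W13 = sp.simplify(sp.exp(-sp.integrate(p, t)))
print(W13)                   # t**(-3/2)
print(sp.simplify(W / W13))  # -3/2, the constant c chosen above
```

The final ratio is exactly the constant $c = -3/2$ singled out above.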
A stronger version of Theorem 3.3.1 can be established if the two functions involved are solutions of a second order linear homogeneous differential equation.
Theorem 3.3.3

Let $y_1$ and $y_2$ be the solutions of Eq. (7),

$$L[y] = y'' + p(t)y' + q(t)y = 0,$$

where $p$ and $q$ are continuous on an open interval $I$. Then $y_1$ and $y_2$ are linearly dependent on $I$ if and only if $W(y_1, y_2)(t)$ is zero for all $t$ in $I$. Alternatively, $y_1$ and $y_2$ are linearly independent on $I$ if and only if $W(y_1, y_2)(t)$ is never zero in $I$.
Of course, we know by Theorem 3.3.2 that $W(y_1, y_2)(t)$ is either everywhere zero or nowhere zero in $I$. In proving Theorem 3.3.3, observe first that if $y_1$ and $y_2$ are linearly
dependent, then $W(y_1, y_2)(t)$ is zero for all $t$ in $I$ by Theorem 3.3.1. It remains to prove the converse; that is, if $W(y_1, y_2)(t)$ is zero throughout $I$, then $y_1$ and $y_2$ are linearly dependent. Let $t_0$ be any point in $I$; then necessarily $W(y_1, y_2)(t_0) = 0$. Consequently, the system of equations

$$c_1 y_1(t_0) + c_2 y_2(t_0) = 0, \qquad c_1 y_1'(t_0) + c_2 y_2'(t_0) = 0 \tag{16}$$
for $c_1$ and $c_2$ has a nontrivial solution. Using these values of $c_1$ and $c_2$, let $\phi(t) = c_1 y_1(t) + c_2 y_2(t)$. Then $\phi$ is a solution of Eq. (7), and by Eqs. (16) $\phi$ also satisfies the initial conditions

$$\phi(t_0) = 0, \qquad \phi'(t_0) = 0. \tag{17}$$
Therefore, by the uniqueness part of Theorem 3.2.1, or by Example 2 of Section 3.2, $\phi(t) = 0$ for all $t$ in $I$. Since $\phi(t) = c_1 y_1(t) + c_2 y_2(t)$ with $c_1$ and $c_2$ not both zero, this means that $y_1$ and $y_2$ are linearly dependent. The alternative statement of the theorem follows immediately.
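The construction in this proof is easy to trace on a concrete dependent pair. The following sketch is illustrative only (the equation $y'' - y = 0$ and the solutions are our choices, not the text's; sympy is assumed): the Wronskian of $y_1 = e^t$ and $y_2 = 3e^t$ vanishes identically, and solving the system (16) at $t_0 = 0$ yields a nontrivial pair $(c_1, c_2)$ for which $\phi$ is identically zero.

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# Two linearly dependent solutions of y'' - y = 0 (chosen for illustration)
y1 = sp.exp(t)
y2 = 3 * sp.exp(t)

# Their Wronskian vanishes identically, as Theorem 3.3.3 predicts
W = sp.simplify(y1 * sp.diff(y2, t) - sp.diff(y1, t) * y2)
print(W)  # 0

# The system (16) at t0 = 0 has nontrivial solutions
t0 = 0
eqs = [c1 * y1.subs(t, t0) + c2 * y2.subs(t, t0),
       c1 * sp.diff(y1, t).subs(t, t0) + c2 * sp.diff(y2, t).subs(t, t0)]
print(sp.solve(eqs, [c1, c2], dict=True))  # [{c1: -3*c2}]

# Taking c1 = -3, c2 = 1 gives phi = c1*y1 + c2*y2 = 0 identically
phi = sp.simplify(-3 * y1 + 1 * y2)
print(phi)  # 0
```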
We can now summarize the facts about fundamental sets of solutions, Wronskians, and linear independence in the following way. Let $y_1$ and $y_2$ be solutions of Eq. (7),

$$y'' + p(t)y' + q(t)y = 0,$$

where $p$ and $q$ are continuous on an open interval $I$. Then the following four statements are equivalent, in the sense that each one implies the other three (a brief symbolic illustration follows the list):
1. The functions $y_1$ and $y_2$ are a fundamental set of solutions on $I$.
2. The functions $y_1$ and $y_2$ are linearly independent on $I$.
3. $W(y_1, y_2)(t_0) \neq 0$ for some $t_0$ in $I$.
4. $W(y_1, y_2)(t) \neq 0$ for all $t$ in $I$.
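In practice, statement 3 is the most convenient test: evaluating the Wronskian at one well-chosen point settles independence on the whole interval. A minimal sketch, again assuming sympy (the equation $y'' + y = 0$ and its solutions are a standard example, not taken from this passage):

```python
import sympy as sp

t = sp.symbols('t')
y1, y2 = sp.cos(t), sp.sin(t)   # both solve y'' + y = 0 on any interval

W = sp.simplify(y1 * sp.diff(y2, t) - sp.diff(y1, t) * y2)
print(W)             # 1, since cos^2 t + sin^2 t = 1
print(W.subs(t, 0))  # 1, nonzero at the single point t0 = 0

# W(t0) != 0 at one point, so by the equivalences above cos t and sin t
# are linearly independent and form a fundamental set of solutions.
```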
It is interesting to note the similarity between second order linear homogeneous differential equations and two-dimensional vector algebra. Two vectors a and b are said to be linearly dependent if there are two scalars k1 and k2, not both zero, such that k1a + k2b = 0; otherwise, they are said to be linearly independent. Let i and j be unit vectors directed along the positive x and y axes, respectively. Since k1i + k2j = 0 only if k1 = k2 = 0, the vectors i and j are linearly independent. Further, we know that any vector a with components a1 and a2 can be written as a = a1i + a2j, that is, as a linear combination of the two linearly independent vectors i and j. It is not difficult to show that any vector in two dimensions can be expressed as a linear combination of any two linearly independent two-dimensional vectors (see Problem 14). Such a pair of linearly independent vectors is said to form a basis for the vector space of two-dimensional vectors.
The term vector space is also applied to other collections of mathematical objects that obey the same laws of addition and multiplication by scalars that geometric vectors do. For example, it can be shown that the set of functions that are twice differentiable on the open interval I forms a vector space. Similarly, the set V of functions satisfying Eq. (7) also forms a vector space.
Since every member of V can be expressed as a linear combination of two linearly independent members y1 and y2, we say that such a pair forms a basis for V. This leads to the conclusion that V is two-dimensional; therefore, it is analogous in many respects to the space of geometric vectors in a plane. Later we find that the set of solutions of an
nth order linear homogeneous differential equation forms a vector space of dimension n, and that any set of n linearly independent solutions of the differential equation forms a basis for the space. This connection between differential equations and vectors constitutes a good reason for the study of abstract linear algebra.
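To make the two-dimensionality of $V$ concrete, here is a final sketch (sympy assumed; the equation $y'' - y = 0$ and the basis are our illustrative choices, not the text's). The coordinates of a third solution, $\cosh t$, in the basis $\{e^t, e^{-t}\}$ are found by matching value and slope at a single point, which suffices by the uniqueness theorem:

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')
y1, y2 = sp.exp(t), sp.exp(-t)   # a basis for solutions of y'' - y = 0
y = sp.cosh(t)                   # another solution of the same equation

# Match value and derivative at t0 = 0 (enough, by uniqueness)
eqs = [c1 * y1.subs(t, 0) + c2 * y2.subs(t, 0) - y.subs(t, 0),
       c1 * sp.diff(y1, t).subs(t, 0) + c2 * sp.diff(y2, t).subs(t, 0)
       - sp.diff(y, t).subs(t, 0)]
print(sp.solve(eqs, [c1, c2]))  # {c1: 1/2, c2: 1/2}
```

The output $c_1 = c_2 = 1/2$ recovers the familiar identity $\cosh t = \frac{1}{2}e^t + \frac{1}{2}e^{-t}$.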
PROBLEMS

In each of Problems 1 through 8 determine whether the given pair of functions is linearly independent or linearly dependent.