
# Elementary Differential Equations, 7th edition - Boyce W.E.

Boyce W.E. Elementary Differential Equations, 7th edition. Wiley, 2001. 1310 p.
ISBN 0-471-31999-6

## 11.6 Series of Orthogonal Functions: Mean Convergence

If pointwise convergence is replaced by mean convergence, Theorem 11.2.4 can be considerably generalized. Before we state such a companion theorem to Theorem 11.2.4, we first define what is meant by a square integrable function. A function f is said to be square integrable on the interval 0 ≤ x ≤ 1 if both f and f^2 are integrable(11) on that interval. The following theorem is similar to Theorem 11.2.4 except that it involves mean convergence.

(11) For the Riemann integral used in elementary calculus the hypotheses that f and f^2 are integrable are independent; that is, there are functions such that f is integrable but f^2 is not, and conversely (see Problem 6). A generalized integral, known as the Lebesgue integral, has the property (among others) that if f^2 is integrable, then f is also necessarily integrable. The term square integrable came into common use in connection with this type of integration.

Theorem 11.6.1. The eigenfunctions of the Sturm-Liouville problem (11), (12) are complete with respect to mean convergence for the set of functions that are square integrable on 0 ≤ x ≤ 1. In other words, given any square integrable function f, the series (10), whose coefficients are given by Eq. (9), converges to f(x) in the mean square sense.
It is significant that the class of functions specified in Theorem 11.6.1 is very large indeed. The class of square integrable functions contains some functions with many discontinuities, including some kinds of infinite discontinuities, as well as some functions that are not differentiable at any point. All these functions have mean convergent expansions in the eigenfunctions of the boundary value problem (11), (12). However, in many cases these series do not converge pointwise, at least not at every point. Thus mean convergence is more naturally associated with series of orthogonal functions, such as eigenfunctions, than ordinary pointwise convergence.
The theory of Fourier series discussed in Chapter 10 is just a special case of the general theory of Sturm-Liouville problems. For instance, the functions
\phi_n(x) = \sqrt{2} \sin n\pi x  (13)
are the normalized eigenfunctions of the Sturm-Liouville problem
y'' + \lambda y = 0, \quad y(0) = 0, \quad y(1) = 0.  (14)
Thus, if f is a given square integrable function on 0 < x < 1, then according to Theorem 11.6.1, the series
f(x) = \sum_{m=1}^{\infty} b_m \phi_m(x) = \sqrt{2} \sum_{m=1}^{\infty} b_m \sin m\pi x,  (15)
where
b_m = \int_0^1 f(x) \phi_m(x) \, dx = \sqrt{2} \int_0^1 f(x) \sin m\pi x \, dx,  (16)
converges in the mean. The series (15) is precisely the Fourier sine series discussed in Section 10.4. If f satisfies the further conditions stated in Theorem 11.2.4, then this series converges pointwise as well as in the mean. Similarly, a Fourier cosine series is associated with the Sturm-Liouville problem
y'' + \lambda y = 0, \quad y'(0) = 0, \quad y'(1) = 0.  (17)
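As a quick numerical sanity check (a sketch not found in the text, assuming NumPy and SciPy are available), one can verify that the normalized eigenfunctions (13) really are orthonormal on 0 ≤ x ≤ 1:

```python
import numpy as np
from scipy.integrate import quad

# Normalized eigenfunctions phi_n(x) = sqrt(2) sin(n pi x) of problem (14).
def phi(n, x):
    return np.sqrt(2.0) * np.sin(n * np.pi * x)

# Orthonormality: the integral of phi_m * phi_n over [0, 1] is delta_mn.
for m in range(1, 5):
    for n in range(1, 5):
        val, _ = quad(lambda x: phi(m, x) * phi(n, x), 0.0, 1.0)
        target = 1.0 if m == n else 0.0
        assert abs(val - target) < 1e-9, (m, n, val)
```

Each diagonal integral evaluates to 1 and each off-diagonal integral to 0, up to quadrature error.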
EXAMPLE 1. Let f(x) = 1 for 0 < x < 1. Expand f(x) using the eigenfunctions (13) and discuss the pointwise and mean square convergence of the resulting series.

The series has the form (15) and its coefficients b_m are given by Eq. (16). Thus
b_m = \sqrt{2} \int_0^1 \sin m\pi x \, dx = \frac{\sqrt{2}}{m\pi} (1 - \cos m\pi)  (18)
and the nth partial sum of the series is
S_n(x) = 2 \sum_{m=1}^{n} \frac{1 - \cos m\pi}{m\pi} \sin m\pi x.  (19)

The mean square error is then

R_n = \int_0^1 [f(x) - S_n(x)]^2 \, dx.  (20)
By calculating R_n for several values of n and plotting the results, we obtain Figure 11.6.2. This figure indicates that R_n steadily decreases as n increases. Of course, Theorem 11.6.1 asserts that R_n → 0 as n → ∞. Pointwise, we know that S_n(x) → f(x) = 1 as n → ∞ for each x in 0 < x < 1; further, S_n(x) has the value zero at x = 0 and x = 1 for every n. Although the series converges pointwise at each such point, the least upper bound of the error does not diminish as n increases. For each n there are points close to x = 0 and x = 1 where the error is arbitrarily close to 1.
FIGURE 11.6.2 Dependence of the mean square error R_n on n in Example 1.
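The behavior just described can be reproduced numerically. The following sketch (an illustration assuming NumPy and SciPy, not part of the text) evaluates the partial sums (19) and the mean square error (20), confirming that R_n steadily decreases while the pointwise error near the endpoints remains close to 1:

```python
import numpy as np
from scipy.integrate import quad

# Partial sum S_n(x) of the sine expansion of f(x) = 1, per Eq. (19).
def S(n, x):
    return sum(2.0 * (1.0 - np.cos(m * np.pi)) / (m * np.pi) * np.sin(m * np.pi * x)
               for m in range(1, n + 1))

# Mean square error R_n of Eq. (20).
def R(n):
    val, _ = quad(lambda x: (1.0 - S(n, x)) ** 2, 0.0, 1.0, limit=200)
    return val

errs = [R(n) for n in (2, 4, 8, 16)]
assert all(a > b for a, b in zip(errs, errs[1:]))  # R_n steadily decreases

# Yet the sup of the pointwise error does not shrink: near x = 0 it stays ~1,
# since S_n(0) = 0 for every n while f(x) = 1.
xs = np.linspace(0.0, 0.05, 500)
assert max(abs(1.0 - S(64, x)) for x in xs) > 0.9
```

The decreasing sequence of R_n values mirrors Figure 11.6.2, while the endpoint check illustrates why the convergence is in the mean but not uniform.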
Theorem 11.6.1 can be extended to cover self-adjoint boundary value problems having periodic boundary conditions, such as the problem
y'' + \lambda y = 0,  (21)

y(-L) - y(L) = 0, \quad y'(-L) - y'(L) = 0  (22)
considered in Example 4 of Section 11.2. The eigenfunctions of the problem (21), (22) are \phi_n(x) = \cos(n\pi x/L) for n = 0, 1, 2, ... and \psi_n(x) = \sin(n\pi x/L) for n = 1, 2, .... If f is a given square integrable function on -L ≤ x ≤ L, then its expansion in terms of the eigenfunctions \phi_n and \psi_n is of the form
f(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos\frac{n\pi x}{L} + b_n \sin\frac{n\pi x}{L} \right),  (23)
where
a_n = \frac{1}{L} \int_{-L}^{L} f(x) \cos\frac{n\pi x}{L} \, dx, \quad n = 0, 1, 2, ...,  (24)

b_n = \frac{1}{L} \int_{-L}^{L} f(x) \sin\frac{n\pi x}{L} \, dx, \quad n = 1, 2, ....  (25)
This expansion is exactly the Fourier series for f discussed in Sections 10.2 and 10.3.
According to the generalization of Theorem 11.6.1, the series (23) converges in the
mean for any square integrable function f, even though f may not satisfy the conditions of Theorem 10.3.1, which assure pointwise convergence.
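To illustrate this point numerically, the sketch below (assuming NumPy and SciPy; the step function f is a hypothetical test case, not from the text) expands a discontinuous but square integrable function on -L ≤ x ≤ L in the series (23), with coefficients computed as in Eqs. (24) and (25), and checks that the mean square error still decreases:

```python
import numpy as np
from scipy.integrate import quad

L = 1.0

# A square integrable function with a jump at x = 0 (hypothetical example).
def f(x):
    return 1.0 if x >= 0.0 else 0.0

# Fourier coefficients, Eqs. (24) and (25); `points` flags the jump for quad.
def a(n):
    val, _ = quad(lambda x: f(x) * np.cos(n * np.pi * x / L), -L, L,
                  points=[0.0], limit=200)
    return val / L

def b(n):
    val, _ = quad(lambda x: f(x) * np.sin(n * np.pi * x / L), -L, L,
                  points=[0.0], limit=200)
    return val / L

# Partial sum of the series (23), with coefficients precomputed.
def partial_sum(N):
    A = [a(n) for n in range(N + 1)]
    B = [0.0] + [b(n) for n in range(1, N + 1)]
    def S(x):
        return A[0] / 2.0 + sum(A[n] * np.cos(n * np.pi * x / L) +
                                B[n] * np.sin(n * np.pi * x / L)
                                for n in range(1, N + 1))
    return S

def mean_square_error(N):
    S = partial_sum(N)
    val, _ = quad(lambda x: (f(x) - S(x)) ** 2, -L, L, points=[0.0], limit=200)
    return val

# The series converges in the mean despite the jump discontinuity.
assert mean_square_error(4) > mean_square_error(16) > 0.0
```

Although the partial sums overshoot near the jump and cannot converge uniformly there, the mean square error shrinks as more terms are taken, exactly as the generalization of Theorem 11.6.1 asserts.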