An introduction to ergodic theory - Walters P.

Walters P. An introduction to ergodic theory - London, 1982. - 251 p.

Therefore $E(\chi_{A_j}/\mathscr{F}_n)$ converges in measure to $E(\chi_{A_j}/\mathscr{F})$ and hence

$$-\sum_{j=1}^{k} E(\chi_{A_j}/\mathscr{F}_n)\log E(\chi_{A_j}/\mathscr{F}_n)$$

converges in measure to $-\sum_{j=1}^{k} E(\chi_{A_j}/\mathscr{F})\log E(\chi_{A_j}/\mathscr{F})$. Since all these functions are bounded by $k/e$ we have convergence in $L^1(m)$ too. Therefore $H(\mathscr{A}/\mathscr{F}_n) \to H(\mathscr{A}/\mathscr{F})$. $\square$
(1) The same result holds for a decreasing sequence $\{\mathscr{F}_n\}_1^\infty$ of sub-$\sigma$-algebras with $\bigcap_{n=1}^\infty \mathscr{F}_n = \mathscr{F}$.
(2) When $(X,\mathscr{B},m)$ has a countable basis the statements of Theorem 4.3 (where $\mathscr{C}$ is now an arbitrary sub-$\sigma$-algebra of $\mathscr{B}$) hold, by choosing an increasing sequence $\{\mathscr{D}_n\}_1^\infty$ of finite sub-algebras with $\mathscr{D}_n \nearrow \mathscr{C}$ and using Theorem 4.7.
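As a numerical illustration of Theorem 4.7 (not from the book), one can approximate $([0,1],\mathscr{B},\text{Lebesgue})$ by a fine grid and take $\mathscr{F}_n$ to be the algebra generated by the $2^n$ dyadic intervals, which increase to $\mathscr{B}$. For the two-set partition $\mathscr{A}$ generated by $A = [0, 0.3)$ we have $\mathscr{A} \subseteq \mathscr{B}$, so $H(\mathscr{A}/\mathscr{F}_n)$ should decrease toward $H(\mathscr{A}/\mathscr{B}) = 0$. A minimal Python sketch (the grid size and the choice of $A$ are arbitrary):

```python
import math

def phi(t):
    # phi(t) = -t log t on [0, 1], with phi(0) = 0; note phi(t) <= 1/e
    return 0.0 if t <= 0.0 else -t * math.log(t)

N = 2 ** 12                          # grid of N cells approximating ([0,1], Lebesgue)
chi_A = [1.0 if (i + 0.5) / N < 0.3 else 0.0 for i in range(N)]
chi_B = [1.0 - a for a in chi_A]     # A and its complement: the partition xi(A)

def H_cond(level):
    """H(A / F_n) where F_n is generated by the 2**level dyadic intervals."""
    block = N // (2 ** level)
    total = 0.0
    for start in range(0, N, block):
        for chi in (chi_A, chi_B):
            e = sum(chi[start:start + block]) / block  # E(chi / F_n) on this atom
            total += phi(e) * block / N                # integrate phi(e) over the atom
    return total

vals = [H_cond(n) for n in range(9)]
print(["%.4f" % v for v in vals])    # decreases from about H(A) toward 0
```

The first value is $H(\mathscr{A}/\mathscr{F}_0) = H(\mathscr{A}) \approx 0.611$ (the trivial algebra), and refinement can only decrease conditional entropy, consistent with the monotonicity used throughout this section.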
We have the following extension of Theorem 4.4.
Theorem 4.8. Let $(X,\mathscr{B},m)$ be a probability space and let $\mathscr{A}$, $\mathscr{F}$ be sub-$\sigma$-algebras of $\mathscr{B}$ with $\mathscr{A}$ finite. Then
(i) $H(\mathscr{A}/\mathscr{F}) = 0$ iff $\mathscr{A} \subseteq \mathscr{F}$.
(ii) $H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$ iff $\mathscr{A}$ and $\mathscr{F}$ are independent.
Proof. Let $\xi(\mathscr{A}) = \{A_1, \ldots, A_k\}$.
(i) If $\mathscr{A} \subseteq \mathscr{F}$ then $E(\chi_{A_j}/\mathscr{F})$ takes only the values 0, 1, so $H(\mathscr{A}/\mathscr{F}) = 0$.
Conversely, if $0 = H(\mathscr{A}/\mathscr{F}) = \int -\sum_{j=1}^{k} E(\chi_{A_j}/\mathscr{F}) \log E(\chi_{A_j}/\mathscr{F})\,dm$ then
4 Entropy
since $-E(\chi_{A_j}/\mathscr{F})(x)\log E(\chi_{A_j}/\mathscr{F})(x) \ge 0$ we have that, for each $j$, $E(\chi_{A_j}/\mathscr{F})$ takes only the values 0, 1. Therefore $\mathscr{A} \subseteq \mathscr{F}$.
(ii) Suppose $H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$. Let $B \in \mathscr{F}$ and let $\mathscr{D}$ be the finite sub-algebra consisting of the sets $\{\emptyset, B, X \setminus B, X\}$. Then $\mathscr{D} \subseteq \mathscr{F}$ and $H(\mathscr{A}) \ge H(\mathscr{A}/\mathscr{D}) \ge H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$. Hence $H(\mathscr{A}) = H(\mathscr{A}/\mathscr{D})$, so $m(A \cap B) = m(A)m(B)$ for all $A \in \mathscr{A}$, by Theorem 4.4. Therefore $\mathscr{A}$ and $\mathscr{F}$ are independent.
If $\mathscr{A}$ and $\mathscr{F}$ are independent then for each $A \in \mathscr{A}$, $E(\chi_A/\mathscr{F}) = m(A)$ (because $\int_F E(\chi_A/\mathscr{F})\,dm = \int_F \chi_A\,dm = m(A)m(F)$ for all $F \in \mathscr{F}$, and $E(\chi_A/\mathscr{F})$ is the only $\mathscr{F}$-measurable function with this property). Therefore $H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$. $\square$
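Theorem 4.8 is easy to check by hand on a finite probability space. The sketch below (illustrative only; the four-point space and the partitions are arbitrary choices) computes $H(\mathscr{A}/\mathscr{D}) = -\sum_{i,j} m(A_i \cap D_j)\log\big(m(A_i \cap D_j)/m(D_j)\big)$ for finite partitions and exhibits both cases of the theorem:

```python
import math
from itertools import product

def cond_entropy(A, D, m):
    """H(A/D) = -sum_{i,j} m(Ai ∩ Dj) log( m(Ai ∩ Dj) / m(Dj) )."""
    h = 0.0
    for Ai, Dj in product(A, D):
        inter = sum(m[x] for x in Ai & Dj)   # m(Ai ∩ Dj)
        mD = sum(m[x] for x in Dj)           # m(Dj)
        if inter > 0:
            h -= inter * math.log(inter / mD)
    return h

m = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}   # uniform measure on four points
A = [{0, 1}, {2, 3}]                        # the finite partition xi(A)
trivial = [{0, 1, 2, 3}]                    # trivial algebra: H(A/trivial) = H(A)
F_indep = [{0, 2}, {1, 3}]                  # a partition independent of A

print(cond_entropy(A, A, m))        # 0.0: conditioning on an algebra containing A (part (i))
print(cond_entropy(A, F_indep, m))  # log 2 = H(A): independence (part (ii))
```

Conditioning on $\mathscr{A}$ itself gives 0, while conditioning on the independent algebra gives the full entropy $H(\mathscr{A}) = \log 2$, exactly as the theorem predicts.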
§4.4 Entropy of a Measure-Preserving Transformation
Recall that if $\xi(\mathscr{A}) = \{A_1, \ldots, A_k\}$ then

$$H(\xi(\mathscr{A})) = H(\mathscr{A}) = -\sum_{i=1}^{k} m(A_i)\log m(A_i).$$
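Computationally this is just the Shannon entropy of the vector of atom measures; a minimal Python sketch (with the convention $0 \log 0 = 0$):

```python
import math

def H(probs):
    """H(A) = -sum_i m(A_i) log m(A_i), with the convention 0 log 0 = 0."""
    return sum(-p * math.log(p) for p in probs if p > 0)

print(H([0.5, 0.5]))   # log 2 ≈ 0.6931: a fair two-way split
print(H([1.0]))        # 0.0: a trivial partition carries no information
print(H([0.25] * 4))   # log 4: maximal for a partition into four atoms
```

Natural logarithms are used here, so entropy is measured in nats; as Remark (3) below notes, changing the base only rescales by a constant.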
The second stage of the definition of the entropy of a measure-preserving transformation $T$ is given in the next definition. Recall that the elements of the partition $\xi\big(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}\big) = \bigvee_{i=0}^{n-1} T^{-i}\xi(\mathscr{A})$ are all sets of the form $\bigcap_{i=0}^{n-1} T^{-i}A_{j_i}$.
Definition 4.9. Suppose $T: X \to X$ is a measure-preserving transformation of the probability space $(X,\mathscr{B},m)$. If $\mathscr{A}$ is a finite sub-$\sigma$-algebra of $\mathscr{B}$ then

$$h(T,\xi(\mathscr{A})) = h(T,\mathscr{A}) = \lim_{n\to\infty} \frac{1}{n}\, H\left(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}\right)$$

is called the entropy of $T$ with respect to $\mathscr{A}$. (Later (in Corollary 4.9.1) we will show that the above limit always exists. In fact $(1/n)H\big(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}\big)$ decreases to $h(T,\mathscr{A})$.)
This means that if we think of an application of $T$ as a passage of one day of time, then $\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}$ represents the combined experiment of performing the original experiment, represented by $\mathscr{A}$, on $n$ consecutive days. Then $h(T,\mathscr{A})$ is the average information per day that one gets from performing the original experiment daily forever.
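A concrete example (not from the book): take $Tx = 2x \pmod 1$ on $([0,1], \text{Lebesgue})$ and $\mathscr{A}$ the partition $\{[0,\tfrac12), [\tfrac12,1)\}$. The atoms of $\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}$ are the dyadic intervals of length $2^{-n}$, each of measure $2^{-n}$, so $(1/n)H\big(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}\big) = \log 2$ for every $n$: one bit of new information per "day". A sketch that checks this on a grid (the grid size is an arbitrary choice):

```python
import math
from collections import Counter

def joint_entropy(n, N=2 ** 16):
    """H( V_{i=0}^{n-1} T^{-i} A ) for T x = 2x mod 1 and A = {[0,1/2), [1/2,1)},
    approximated on a grid of N cells of mass 1/N each."""
    counts = Counter()
    for k in range(N):
        x = (k + 0.5) / N           # cell midpoint; exact dyadic, so iteration is exact
        word = []
        for _ in range(n):
            word.append(0 if x < 0.5 else 1)
            x = (2 * x) % 1.0       # the doubling map T
        counts[tuple(word)] += 1
    return sum(-(c / N) * math.log(c / N) for c in counts.values())

for n in (1, 2, 4, 8):
    print(n, joint_entropy(n) / n)  # each value is log 2 ≈ 0.6931 (one bit per day)
```

Each itinerary of length $n$ records which atom the orbit visits on each of $n$ consecutive days, which is exactly the "combined experiment" described above.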
Remark. $h(T,\mathscr{A}) \ge 0$.
We can now give the final stage of the definition of the entropy of a measure-preserving transformation.
Definition 4.10. If $T: X \to X$ is a measure-preserving transformation of the probability space $(X,\mathscr{B},m)$ then $h(T) = \sup h(T,\mathscr{A})$, where the supremum is taken over all finite sub-algebras $\mathscr{A}$ of $\mathscr{B}$, is called the entropy of $T$. Equivalently, $h(T) = \sup h(T,\xi)$ where the supremum is taken over all finite partitions $\xi$ of $(X,\mathscr{B},m)$.
If, as above, we think of an application of T as a passage of one day of time then h(T) is the maximum average information per day obtainable by performing the same experiment daily.
(1) $h(T) \ge 0$. $h(T)$ could be $+\infty$.
(2) $h(\mathrm{id}_X) = 0$. If $h(T) = 0$ then $h(T,\mathscr{A}) = 0$ for every finite $\mathscr{A}$, which implies that $\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}$ does not change much as $n \to \infty$.
(3) If logarithms of some other base are used then the entropy of a transformation is changed by a multiplicative constant that depends only on the base. Some authors use logarithms of base 2.
We shall now show the existence of the limit in Definition 4.9. We shall do this in two ways. The first method uses a simple result on sequences of real numbers and can also be applied to prove the corresponding result for topological entropy, while the second method uses properties of conditional entropy but gives a stronger result.
Theorem 4.9. If $\{a_n\}_{n\ge 1}$ is a sequence of real numbers such that $a_{n+p} \le a_n + a_p$ $\forall n,p$, then $\lim_{n\to\infty} a_n/n$ exists and equals $\inf_n a_n/n$. (The limit could be $-\infty$, but if the $a_n$ are bounded below then the limit will be non-negative.)
Proof. Fix $p > 0$. Each $n > 0$ can be written $n = kp + i$ with $0 \le i < p$. Then

$$\frac{a_n}{n} = \frac{a_{i+kp}}{i+kp} \le \frac{a_i}{kp} + \frac{a_{kp}}{kp} \le \frac{a_i}{kp} + \frac{k a_p}{kp} = \frac{a_i}{kp} + \frac{a_p}{p}.$$

As $n \to \infty$ then $k \to \infty$, so

$$\limsup_{n\to\infty} \frac{a_n}{n} \le \frac{a_p}{p} \quad\text{and hence}\quad \limsup_{n\to\infty} \frac{a_n}{n} \le \inf_p \frac{a_p}{p}.$$

Since also

$$\inf_p \frac{a_p}{p} \le \liminf_{n\to\infty} \frac{a_n}{n},$$

$\lim a_n/n$ exists and equals $\inf a_n/n$. $\square$
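A quick numerical illustration of Theorem 4.9 (my example, not from the book): $a_n = 2n + \sqrt{n}$ is subadditive because $\sqrt{\,\cdot\,}$ is, and $a_n/n = 2 + 1/\sqrt{n}$ decreases to $\inf_n a_n/n = 2$:

```python
import math

def a(n):
    # subadditive: sqrt(n + p) <= sqrt(n) + sqrt(p), and 2(n + p) = 2n + 2p
    return 2 * n + math.sqrt(n)

# spot-check the hypothesis a_{n+p} <= a_n + a_p
assert all(a(n + p) <= a(n) + a(p) + 1e-9
           for n in range(1, 50) for p in range(1, 50))

ratios = [a(n) / n for n in range(1, 10001)]   # a_n / n = 2 + 1/sqrt(n)
print(ratios[-1], min(ratios))                 # both tend to inf_n a_n/n = 2
```

Here the ratios happen to be monotonically decreasing; the theorem does not require that in general, only that the limit and the infimum agree.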

Corollary 4.9.1. If $T: X \to X$ is measure-preserving and $\mathscr{A}$ is a finite sub-algebra of $\mathscr{B}$, then $\lim_{n\to\infty} \frac{1}{n} H\big(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}\big)$ exists.

Proof. Let $a_n = H\big(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}\big) \ge 0$. Then

$$a_{n+p} = H\left(\bigvee_{i=0}^{n+p-1} T^{-i}\mathscr{A}\right)$$