
An introduction to ergodic theory - Walters P.

Walters P. An introduction to ergodic theory - London, 1982. - 251 p.

$\|E(\chi_{A_i}/\mathscr{F}_n) - E(\chi_{A_i}/\mathscr{F})\|_2 \to 0.$
Therefore $E(\chi_{A_i}/\mathscr{F}_n)$ converges in measure to $E(\chi_{A_i}/\mathscr{F})$, and hence
$$-\sum_{i=1}^{k} E(\chi_{A_i}/\mathscr{F}_n) \log E(\chi_{A_i}/\mathscr{F}_n)$$
converges in measure to $-\sum_{i=1}^{k} E(\chi_{A_i}/\mathscr{F}) \log E(\chi_{A_i}/\mathscr{F})$. Since all these functions are bounded by $k/e$ we know we have convergence in $L^1(m)$ too. Therefore $H(\mathscr{A}/\mathscr{F}_n) \to H(\mathscr{A}/\mathscr{F})$. □
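The $L^1$ bound used above rests on the elementary fact that $\varphi(t) = -t \log t$ attains its maximum $1/e$ on $(0,1]$ at $t = 1/e$, so the sum of $k$ such terms is bounded by $k/e$. A quick numeric check (our sketch, not part of the text):

```python
import math

def phi(t):
    """phi(t) = -t*log(t), the integrand appearing in conditional entropy; phi(0) = 0."""
    return -t * math.log(t) if t > 0 else 0.0

# Maximum of phi on (0,1] is 1/e, attained at t = 1/e.
grid = [i / 10000 for i in range(1, 10001)]
max_val = max(phi(t) for t in grid)
print(max_val, 1 / math.e)  # the grid maximum sits just below 1/e
```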
Remarks
(1) The same result holds for a decreasing sequence $\{\mathscr{F}_n\}_{1}^{\infty}$ of sub-$\sigma$-algebras with $\bigcap_{n=1}^{\infty} \mathscr{F}_n = \mathscr{F}$.
(2) When $(X,\mathscr{B},m)$ has a countable basis, the statements of Theorem 4.3 (where $\mathscr{G}$ is now an arbitrary sub-$\sigma$-algebra of $\mathscr{B}$) hold, by choosing an increasing sequence $\{\mathscr{G}_n\}_{1}^{\infty}$ of finite sub-algebras with $\mathscr{G}_n \nearrow \mathscr{G}$ and using Theorem 4.7.
We have the following extension of Theorem 4.4.
Theorem 4.8. Let $(X,\mathscr{B},m)$ be a probability space and let $\mathscr{A}$, $\mathscr{F}$ be sub-$\sigma$-algebras of $\mathscr{B}$ with $\mathscr{A}$ finite. Then
(i) $H(\mathscr{A}/\mathscr{F}) = 0$ iff $\mathscr{A} \subset \mathscr{F}$.
(ii) $H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$ iff $\mathscr{A}$ and $\mathscr{F}$ are independent.
Proof. Let $\xi(\mathscr{A}) = \{A_1, \dots, A_k\}$.
(i) If $\mathscr{A} \subset \mathscr{F}$ then $E(\chi_{A_i}/\mathscr{F}) = \chi_{A_i}$ takes only the values 0, 1, so $H(\mathscr{A}/\mathscr{F}) = 0$.
Conversely, if $0 = H(\mathscr{A}/\mathscr{F}) = \int -\sum_{i=1}^{k} E(\chi_{A_i}/\mathscr{F}) \log E(\chi_{A_i}/\mathscr{F}) \, dm$ then, since $-E(\chi_{A_i}/\mathscr{F})(x) \log E(\chi_{A_i}/\mathscr{F})(x) \ge 0$, we have that for each $i$ the function $E(\chi_{A_i}/\mathscr{F})$ takes only the values 0, 1. Therefore $\mathscr{A} \subset \mathscr{F}$.
(ii) Suppose $H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$. Let $B \in \mathscr{F}$, and let $\mathscr{D}$ be the finite sub-algebra consisting of the sets $\{\emptyset, B, X \setminus B, X\}$. Then $\mathscr{D} \subset \mathscr{F}$ and $H(\mathscr{A}) \ge H(\mathscr{A}/\mathscr{D}) \ge H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$. Hence $H(\mathscr{A}) = H(\mathscr{A}/\mathscr{D})$, so $m(A \cap B) = m(A)m(B)$ for all $A \in \mathscr{A}$, by Theorem 4.4. Therefore $\mathscr{A}$ and $\mathscr{F}$ are independent.
If $\mathscr{A}$ and $\mathscr{F}$ are independent, then for each $A \in \mathscr{A}$, $E(\chi_A/\mathscr{F}) = m(A)$ (because $\int_F E(\chi_A/\mathscr{F}) \, dm = \int_F \chi_A \, dm = m(A)m(F)$ for all $F \in \mathscr{F}$, and $E(\chi_A/\mathscr{F})$ is the only $\mathscr{F}$-measurable function with this property). Therefore $H(\mathscr{A}/\mathscr{F}) = H(\mathscr{A})$. □
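For finite partitions, both parts of Theorem 4.8 can be checked numerically from the formula $H(\mathscr{A}/\mathscr{D}) = -\sum_{a,d} m(A_a \cap D_d) \log \big( m(A_a \cap D_d)/m(D_d) \big)$. A sketch on $([0,1), \text{Lebesgue})$; the particular partitions and function names are our illustration, not from the text:

```python
import math

def cond_entropy(joint):
    """H(A/D) = -sum_{a,d} m(A_a ∩ D_d) * log( m(A_a ∩ D_d) / m(D_d) ),
    where joint[a][d] = m(A_a ∩ D_d)."""
    k, l = len(joint), len(joint[0])
    mD = [sum(joint[a][d] for a in range(k)) for d in range(l)]
    h = 0.0
    for a in range(k):
        for d in range(l):
            p = joint[a][d]
            if p > 0:
                h -= p * math.log(p / mD[d])
    return h

def entropy(probs):
    """H(A) = -sum_i m(A_i) log m(A_i)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A = {[0,1/2), [1/2,1)} on ([0,1), Lebesgue).
# Independent case: D = {[0,1/4) ∪ [1/2,3/4), its complement}; m(A∩D) = m(A)m(D).
indep = [[0.25, 0.25], [0.25, 0.25]]
# Nested case: D = the four quarter-intervals, which refines A, so A ⊂ σ(D).
nested = [[0.25, 0.25, 0.0, 0.0], [0.0, 0.0, 0.25, 0.25]]

print(cond_entropy(indep), entropy([0.5, 0.5]))  # equal, as in (ii)
print(cond_entropy(nested))                      # 0, as in (i)
```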
§4.4 Entropy of a Measure-Preserving Transformation
Recall that if $\xi(\mathscr{A}) = \{A_1, \dots, A_k\}$ then
$$H(\mathscr{A}) = -\sum_{i=1}^{k} m(A_i) \log m(A_i).$$
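Computationally, $H(\mathscr{A})$ depends only on the measures of the atoms. A minimal sketch (the function name is ours):

```python
import math

def partition_entropy(masses):
    """H(A) = -sum_i m(A_i) log m(A_i) for a finite partition with atom masses m(A_i)."""
    assert abs(sum(masses) - 1.0) < 1e-9, "atom masses must sum to 1"
    return -sum(m * math.log(m) for m in masses if m > 0)

# The uniform k-atom partition has entropy log k (the maximum for k atoms):
print(partition_entropy([0.25] * 4))         # log 4 ≈ 1.3863
print(partition_entropy([0.5, 0.25, 0.25]))  # a non-uniform example, (3/2) log 2
```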
The second stage of the definition of the entropy of a measure-preserving transformation $T$ is given in the next definition. Recall that the elements of the partition $\xi(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}) = \bigvee_{i=0}^{n-1} T^{-i}\xi(\mathscr{A})$ are all sets of the form $\bigcap_{i=0}^{n-1} T^{-i}A_{j_i}$.
Definition 4.9. Suppose $T \colon X \to X$ is a measure-preserving transformation of the probability space $(X,\mathscr{B},m)$. If $\mathscr{A}$ is a finite sub-$\sigma$-algebra of $\mathscr{B}$ then
$$h(T, \xi(\mathscr{A})) = h(T, \mathscr{A}) = \lim_{n \to \infty} \frac{1}{n} H\left( \bigvee_{i=0}^{n-1} T^{-i}\mathscr{A} \right)$$
is called the entropy of $T$ with respect to $\mathscr{A}$. (Later (in Corollary 4.9.1) we will show that the above limit always exists. In fact $\frac{1}{n} H(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A})$ decreases to $h(T, \mathscr{A})$.)
This means that if we think of an application of $T$ as a passage of one day of time, then $\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}$ represents the combined experiment of performing the original experiment, represented by $\mathscr{A}$, on $n$ consecutive days. Then $h(T, \mathscr{A})$ is the average information per day that one gets from performing the original experiment daily forever.
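The "daily experiment" picture is easy to compute for the $(1/2, 1/2)$ Bernoulli shift, an example we supply here (it is not discussed in this passage): with $\mathscr{A}$ the partition by the zeroth coordinate, the join over $n$ days has $2^n$ atoms (cylinder sets) of measure $2^{-n}$ each, so $\frac{1}{n} H(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}) = \log 2$ for every $n$, and $h(T, \mathscr{A}) = \log 2$.

```python
import math
from itertools import product

def joined_entropy(n, p=(0.5, 0.5)):
    """(1/n) * H of the join of n coordinate partitions of a Bernoulli(p) shift.
    Atoms of the join are cylinder sets [a_0,...,a_{n-1}] of measure prod_i p[a_i]."""
    h = 0.0
    for word in product(range(len(p)), repeat=n):
        mass = math.prod(p[a] for a in word)
        h -= mass * math.log(mass)
    return h / n

for n in (1, 2, 5):
    print(n, joined_entropy(n))  # log 2 ≈ 0.6931 for every n
```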
Remark. $h(T, \mathscr{A}) \ge 0$.
We can now give the final stage of the definition of the entropy of a measure-preserving transformation.
Definition 4.10. If $T \colon X \to X$ is a measure-preserving transformation of the probability space $(X,\mathscr{B},m)$, then $h(T) = \sup h(T, \mathscr{A})$, where the supremum is taken over all finite sub-algebras $\mathscr{A}$ of $\mathscr{B}$, is called the entropy of $T$. Equivalently, $h(T) = \sup h(T, \xi)$ where the supremum is taken over all finite partitions $\xi$ of $(X,\mathscr{B},m)$.
If, as above, we think of an application of T as a passage of one day of time then h(T) is the maximum average information per day obtainable by performing the same experiment daily.
Remarks
(1) $h(T) \ge 0$. $h(T)$ could be $+\infty$.
(2) $h(\mathrm{id}_X) = 0$. If $h(T) = 0$ then $h(T, \mathscr{A}) = 0$ for every finite $\mathscr{A}$, which implies that $\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}$ does not change much as $n \to \infty$.
(3) If logarithms of some other base are used, then the entropy of a transformation is changed by a multiplicative constant that depends only on the base. Some authors use logarithms of base 2.
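Concretely, the constant in remark (3) is $1/\log b$: entropy to base $b$ equals natural entropy divided by $\ln b$, so entropy in bits is entropy in nats divided by $\ln 2$. A quick check (our sketch):

```python
import math

def entropy(masses, base=math.e):
    """-sum_i m_i log_base(m_i); changing base rescales by the constant 1/ln(base)."""
    return -sum(m * math.log(m, base) for m in masses if m > 0)

nats = entropy([0.25] * 4)          # log 4 in nats
bits = entropy([0.25] * 4, base=2)  # 2 bits
print(nats, bits, nats / math.log(2))  # bits == nats / ln 2
```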
We shall now show the existence of the limit in Definition 4.9. We shall do this in two ways. The first method uses a simple result on sequences of real numbers and can also be applied to prove the corresponding result for topological entropy, while the second method uses properties of conditional entropy but gives a stronger result.
Theorem 4.9. If $\{a_n\}_{n \ge 1}$ is a sequence of real numbers such that $a_{n+p} \le a_n + a_p$ for all $n, p$, then $\lim_{n \to \infty} a_n/n$ exists and equals $\inf_n a_n/n$. (The limit could be $-\infty$, but if the $a_n$ are bounded below then the limit will be non-negative.)
Proof. Fix $p > 0$. Each $n > 0$ can be written $n = kp + i$ with $0 \le i < p$. Then
$$\frac{a_n}{n} = \frac{a_{i+kp}}{i+kp} \le \frac{a_i + a_{kp}}{kp} \le \frac{a_i}{kp} + \frac{k a_p}{kp} = \frac{a_i}{kp} + \frac{a_p}{p}.$$
As $n \to \infty$ then $k \to \infty$, so
$$\limsup_n \frac{a_n}{n} \le \frac{a_p}{p}.$$
Hence
$$\limsup_n \frac{a_n}{n} \le \inf_p \frac{a_p}{p} \le \liminf_n \frac{a_n}{n},$$
so that $\lim a_n/n$ exists and equals $\inf a_n/n$. □
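Theorem 4.9 (often called Fekete's subadditive lemma) can be watched numerically. For instance $a_n = \sqrt{n} + 2n$ is subadditive, since $\sqrt{n+p} \le \sqrt{n} + \sqrt{p}$ and the linear parts add exactly, and $a_n/n = 2 + 1/\sqrt{n}$ decreases toward $\inf_n a_n/n = 2$. The example sequence is ours:

```python
import math

def a(n):
    """A subadditive sequence: a(n+p) <= a(n) + a(p), since sqrt is subadditive."""
    return math.sqrt(n) + 2 * n

# Spot-check subadditivity, then watch a_n / n approach inf_n a_n / n = 2.
for n in range(1, 50):
    for p in range(1, 50):
        assert a(n + p) <= a(n) + a(p) + 1e-12
ratios = [a(n) / n for n in (1, 10, 100, 10000)]
print(ratios)  # strictly decreasing, tending to 2
```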
Corollary 4.9.1. If $T \colon X \to X$ is measure-preserving and $\mathscr{A}$ is a finite sub-algebra of $\mathscr{B}$, then $\lim_{n \to \infty} \frac{1}{n} H(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A})$ exists.

Proof. Let $a_n = H(\bigvee_{i=0}^{n-1} T^{-i}\mathscr{A}) \ge 0$. Then
$$a_{n+p} = H\left( \bigvee_{i=0}^{n+p-1} T^{-i}\mathscr{A} \right)$$