An introduction to ergodic theory - Walters P.

Walters P. An introduction to ergodic theory - London, 1982. - 251 p.

As mentioned above, $H(\mathcal{A})$ is a measure of the uncertainty removed (or information gained) by performing the experiment with outcomes $\{A_1, \dots, A_k\}$.
Remarks
(1) If $\mathcal{A} = \{X, \emptyset\}$ then $H(\mathcal{A}) = 0$. Here $\mathcal{A}$ represents the outcomes of a "certain" experiment, so there is no uncertainty about the outcome.
(2) If $\xi(\mathcal{A}) = \{A_1, \dots, A_k\}$ where $m(A_i) = 1/k$ for all $i$, then
$$H(\mathcal{A}) = -\sum_{i=1}^{k} \frac{1}{k} \log \frac{1}{k} = \log k.$$
We shall show later (Corollary 4.2.1) that $\log k$ is the maximum value for the entropy of a partition with $k$ sets. The greatest uncertainty about the outcome should occur when the outcomes are equally likely. (A numerical illustration is given after these remarks.)
(3) $H(\mathcal{A}) \geq 0$.
(4) If $\mathcal{A} \doteq \mathcal{C}$ then $H(\mathcal{A}) = H(\mathcal{C})$.
(5) If $T: X \to X$ is measure-preserving then $H(T^{-1}\mathcal{A}) = H(\mathcal{A})$.
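For example, with $k = 2$ and the unequal probabilities $m(A_1) = \tfrac14$, $m(A_2) = \tfrac34$, the definition gives
$$H(\mathcal{A}) = -\tfrac14 \log \tfrac14 - \tfrac34 \log \tfrac34 = \log 4 - \tfrac34 \log 3 \approx 0.562,$$
which is strictly smaller than $\log 2 \approx 0.693$, in accordance with remark (2) and Corollary 4.2.1 below.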
Several properties of entropy are implied by the following elementary result.
Theorem 4.2. The function $\phi: [0, \infty) \to \mathbb{R}$ defined by
$$\phi(x) = \begin{cases} 0 & \text{if } x = 0 \\ x \log x & \text{if } x > 0 \end{cases}$$
is strictly convex, i.e., $\phi(\alpha x + \beta y) \leq \alpha \phi(x) + \beta \phi(y)$ if $x, y \in [0, \infty)$, $\alpha, \beta \geq 0$, $\alpha + \beta = 1$, with equality only when $x = y$ or $\alpha = 0$ or $\beta = 0$. Moreover,
$$\phi\!\left(\sum_{i=1}^{k} \alpha_i x_i\right) \leq \sum_{i=1}^{k} \alpha_i \phi(x_i)$$
if $x_i \in [0, \infty)$, $\alpha_i \geq 0$, $\sum_{i=1}^{k} \alpha_i = 1$; and equality holds only when all the $x_i$ corresponding to non-zero $\alpha_i$ are equal.
Proof. We have
$$\phi'(x) = 1 + \log x, \qquad \phi''(x) = \frac{1}{x} > 0 \text{ on } (0, \infty).$$
Fix $\alpha, \beta$ with $\alpha > 0$, $\beta > 0$, and suppose $y > x$. By the mean value theorem,
$$\phi(y) - \phi(\alpha x + \beta y) = \phi'(z)\,\alpha(y - x) \quad \text{for some } z \text{ with } \alpha x + \beta y < z < y,$$
and
$$\phi(\alpha x + \beta y) - \phi(x) = \phi'(w)\,\beta(y - x) \quad \text{for some } w \text{ with } x < w < \alpha x + \beta y.$$
Since $\phi'' > 0$, we have $\phi'(z) > \phi'(w)$ and hence
$$\beta\bigl(\phi(y) - \phi(\alpha x + \beta y)\bigr) = \phi'(z)\,\alpha\beta(y - x) > \phi'(w)\,\alpha\beta(y - x) = \alpha\bigl(\phi(\alpha x + \beta y) - \phi(x)\bigr).$$
Therefore $\phi(\alpha x + \beta y) < \alpha\phi(x) + \beta\phi(y)$ if $x, y > 0$ and $x \neq y$. It clearly holds also if $x = 0$ or $y = 0$. By induction we get the $k$-term statement.
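For instance, with $\alpha = \beta = \tfrac12$, $x = 1$ and $y = 3$ we have $\phi(2) = 2\log 2 \approx 1.386$, while $\tfrac12\phi(1) + \tfrac12\phi(3) = \tfrac32\log 3 \approx 1.648$, so the inequality is strict, as it must be since $x \neq y$.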

Corollary 4.2.1. If $\xi = \{A_1, \dots, A_k\}$ then $H(\xi) \leq \log k$, and $H(\xi) = \log k$ only when $m(A_i) = 1/k$ for all $i$.

Proof. Put $\alpha_i = 1/k$ and $x_i = m(A_i)$, $1 \leq i \leq k$, in Theorem 4.2.
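Explicitly, the $k$-term inequality gives
$$\phi\!\left(\tfrac{1}{k}\right) = \phi\!\left(\sum_{i=1}^{k} \tfrac{1}{k}\, m(A_i)\right) \leq \sum_{i=1}^{k} \tfrac{1}{k}\, \phi\bigl(m(A_i)\bigr) = -\tfrac{1}{k} H(\xi),$$
i.e., $-\tfrac{1}{k}\log k \leq -\tfrac{1}{k}H(\xi)$, so $H(\xi) \leq \log k$; and equality forces all the $m(A_i)$ to be equal, hence equal to $1/k$.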
§4.3 Conditional Entropy

Conditional entropy is not required in order to give the definition of the entropy of a transformation. It is useful in deriving properties of entropy, and we discuss it now before we consider the entropy of a transformation. Let $\mathcal{A}$, $\mathcal{C}$ be finite sub-$\sigma$-algebras of $\mathcal{B}$ and let $\xi(\mathcal{A}) = \{A_1, \dots, A_k\}$, $\xi(\mathcal{C}) = \{C_1, \dots, C_p\}$ denote the corresponding partitions.

Definition 4.7. The entropy of $\mathcal{A}$ given $\mathcal{C}$ is the number
$$H(\mathcal{A}/\mathcal{C}) = -\sum_{j} m(C_j) \sum_{i} \frac{m(A_i \cap C_j)}{m(C_j)} \log \frac{m(A_i \cap C_j)}{m(C_j)},$$
omitting the $j$-terms when $m(C_j) = 0$.

So to get $H(\mathcal{A}/\mathcal{C})$ one considers $C_j$ as a measure space with normalized measure and calculates the entropy of the partition of the set $C_j$ induced by $\xi(\mathcal{A})$ (this gives $-\sum_{i} \frac{m(A_i \cap C_j)}{m(C_j)} \log \frac{m(A_i \cap C_j)}{m(C_j)}$), and then averages the answer taking into account the size of $C_j$. ($H(\mathcal{A}/\mathcal{C})$ measures the uncertainty about the outcome of $\mathcal{A}$ given that we will be told the outcome of $\mathcal{C}$.)
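For instance, if $\mathcal{A} \subseteq \mathcal{C}$ then each ratio $m(A_i \cap C_j)/m(C_j)$ with $m(C_j) > 0$ is either $0$ or $1$, since every $C_j$ lies inside a single $A_i$; hence every term vanishes and $H(\mathcal{A}/\mathcal{C}) = 0$: being told the outcome of $\mathcal{C}$ leaves no uncertainty about the outcome of $\mathcal{A}$.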
Remarks
(1) $H(\mathcal{A}/\mathcal{C}) \geq 0$.
(2) If $\mathcal{A} \doteq \mathcal{D}$ then $H(\mathcal{A}/\mathcal{C}) = H(\mathcal{D}/\mathcal{C})$.
(3) If $\mathcal{C} \doteq \mathcal{D}$ then $H(\mathcal{A}/\mathcal{C}) = H(\mathcal{A}/\mathcal{D})$.
(4) Let $\mathcal{N}$ denote the $\sigma$-field $\{\emptyset, X\}$. Then $H(\mathcal{A}/\mathcal{N}) = H(\mathcal{A})$. (Since $\mathcal{N}$ represents the outcome of the trivial experiment, one gains nothing from knowledge of it.)
Theorem 4.3. Let $(X, \mathcal{B}, m)$ be a probability space. If $\mathcal{A}$, $\mathcal{C}$, $\mathcal{D}$ are finite subalgebras of $\mathcal{B}$ then:
(i) $H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D})$.
(ii) $H(\mathcal{A} \vee \mathcal{C}) = H(\mathcal{A}) + H(\mathcal{C}/\mathcal{A})$.
(iii) $\mathcal{A} \subseteq \mathcal{C} \Rightarrow H(\mathcal{A}/\mathcal{D}) \leq H(\mathcal{C}/\mathcal{D})$.
(iv) $\mathcal{A} \subseteq \mathcal{C} \Rightarrow H(\mathcal{A}) \leq H(\mathcal{C})$.
(v) $\mathcal{C} \subseteq \mathcal{D} \Rightarrow H(\mathcal{A}/\mathcal{C}) \geq H(\mathcal{A}/\mathcal{D})$.
(vi) $H(\mathcal{A}) \geq H(\mathcal{A}/\mathcal{C})$.
(vii) $H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) \leq H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{D})$.
(viii) $H(\mathcal{A} \vee \mathcal{C}) \leq H(\mathcal{A}) + H(\mathcal{C})$.
(ix) If $T$ is measure-preserving then $H(T^{-1}\mathcal{A}/T^{-1}\mathcal{C}) = H(\mathcal{A}/\mathcal{C})$, and
(x) $H(T^{-1}\mathcal{A}) = H(\mathcal{A})$.
(The reader should think of the intuitive meaning of each statement. This enables one to remember these results easily.)
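For example, let $X = \{1, 2, 3, 4\}$ with $m(\{x\}) = \tfrac14$ for each point, and let $\xi(\mathcal{A}) = \{\{1,2\},\{3,4\}\}$, $\xi(\mathcal{C}) = \{\{1,3\},\{2,4\}\}$. Then $H(\mathcal{A}) = H(\mathcal{C}) = \log 2$ and $H(\mathcal{C}/\mathcal{A}) = \log 2$ (each $A_i$ meets each $C_j$ in a set of conditional measure $\tfrac12$), while $H(\mathcal{A} \vee \mathcal{C}) = \log 4 = H(\mathcal{A}) + H(\mathcal{C}/\mathcal{A})$, illustrating (ii) and, with equality, (viii).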
Proof. Let $\xi(\mathcal{A}) = \{A_i\}$, $\xi(\mathcal{C}) = \{C_j\}$, $\xi(\mathcal{D}) = \{D_k\}$, and assume, without loss of generality, that all sets have strictly positive measure (since if $\xi(\mathcal{A}) = \{A_1, \dots, A_k\}$ with $m(A_i) > 0$ for $1 \leq i \leq r$ and $m(A_i) = 0$ for $r < i \leq k$, we can replace $\xi(\mathcal{A})$ by $\{A_1, \dots, A_{r-1}, A_r \cup A_{r+1} \cup \dots \cup A_k\}$; see remarks (2), (3) above).
(i)
$$H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = -\sum_{i,j,k} m(A_i \cap C_j \cap D_k) \log \frac{m(A_i \cap C_j \cap D_k)}{m(D_k)}.$$
But
$$\frac{m(A_i \cap C_j \cap D_k)}{m(D_k)} = \frac{m(A_i \cap C_j \cap D_k)}{m(A_i \cap D_k)} \cdot \frac{m(A_i \cap D_k)}{m(D_k)}$$
unless $m(A_i \cap D_k) = 0$, and then the left-hand side is zero and we need not consider it; therefore
$$H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = -\sum_{i,j,k} m(A_i \cap C_j \cap D_k) \log \frac{m(A_i \cap D_k)}{m(D_k)} - \sum_{i,j,k} m(A_i \cap C_j \cap D_k) \log \frac{m(A_i \cap C_j \cap D_k)}{m(A_i \cap D_k)}$$
$$= -\sum_{i,k} m(A_i \cap D_k) \log \frac{m(A_i \cap D_k)}{m(D_k)} + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D})$$
$$= H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D}).$$
(ii) Put $\mathcal{D} = \mathcal{N} = \{\emptyset, X\}$ in (i).
(iii) If $\mathcal{A} \subseteq \mathcal{C}$ then $\mathcal{A} \vee \mathcal{C} = \mathcal{C}$, so by (i),
$$H(\mathcal{C}/\mathcal{D}) = H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D}) \geq H(\mathcal{A}/\mathcal{D}).$$
(iv) Put $\mathcal{D} = \mathcal{N}$ in (iii).
(v) Fix $i$ and $j$ and let
$$\alpha_k = \frac{m(D_k \cap C_j)}{m(C_j)}, \qquad x_k = \frac{m(A_i \cap D_k)}{m(D_k)}.$$
Then by Theorem 4.2,
$$\phi\!\left(\sum_{k} \frac{m(D_k \cap C_j)}{m(C_j)} \cdot \frac{m(A_i \cap D_k)}{m(D_k)}\right) \leq \sum_{k} \frac{m(D_k \cap C_j)}{m(C_j)}\, \phi\!\left(\frac{m(A_i \cap D_k)}{m(D_k)}\right),$$
but since $\mathcal{C} \subseteq \mathcal{D}$ the left-hand side equals
$$\phi\!\left(\frac{m(A_i \cap C_j)}{m(C_j)}\right) = \frac{m(A_i \cap C_j)}{m(C_j)} \log \frac{m(A_i \cap C_j)}{m(C_j)}$$
(each $D_k$ of positive measure lies, up to measure zero, inside a single $C_j$). Multiply both sides by $m(C_j)$ and sum over $i$ and $j$ to give
$$\sum_{i,j} m(A_i \cap C_j) \log \frac{m(A_i \cap C_j)}{m(C_j)} \leq \sum_{i,j,k} m(D_k \cap C_j)\, \frac{m(A_i \cap D_k)}{m(D_k)} \log \frac{m(A_i \cap D_k)}{m(D_k)}.$$