
Walters P. An Introduction to Ergodic Theory. London, 1982. 251 p.

As mentioned above, $H(\mathcal{A})$ is a measure of the uncertainty removed (or information gained) by performing the experiment with outcomes $\{A_1, \ldots, A_k\}$.
Remarks

(1) If $\mathcal{A} = \{X, \emptyset\}$ then $H(\mathcal{A}) = 0$. Here $\mathcal{A}$ represents the outcomes of a "certain" experiment, so there is no uncertainty about the outcome.

(2) If $\xi(\mathcal{A}) = \{A_1, \ldots, A_k\}$ where $m(A_i) = 1/k$ for all $i$, then
\[
H(\mathcal{A}) = -\sum_{i=1}^{k} \frac{1}{k} \log \frac{1}{k} = \log k.
\]
We shall show later (Corollary 4.2.1) that $\log k$ is the maximum value for the entropy of a partition with $k$ sets. The greatest uncertainty about the outcome should occur when the outcomes are equally likely.

(3) $H(\mathcal{A}) \geq 0$.

(4) If $\mathcal{A} \doteq \mathcal{C}$ then $H(\mathcal{A}) = H(\mathcal{C})$.

(5) If $T \colon X \to X$ is measure-preserving then $H(T^{-1}\mathcal{A}) = H(\mathcal{A})$.
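To illustrate remark (2) and the bound quoted from Corollary 4.2.1, here is a small worked computation (our example, not from the original text):

```latex
Let $\xi(\mathcal{A}) = \{A_1, A_2, A_3\}$ with $m(A_1) = \tfrac{1}{2}$ and
$m(A_2) = m(A_3) = \tfrac{1}{4}$. Then
\[
H(\mathcal{A}) = -\tfrac{1}{2}\log\tfrac{1}{2} - 2 \cdot \tfrac{1}{4}\log\tfrac{1}{4}
              = \tfrac{1}{2}\log 2 + \log 2 = \tfrac{3}{2}\log 2 < \log 3,
\]
so an unequal partition has strictly smaller entropy than the
equiprobable partition with the same number of sets.
```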
Several properties of entropy are implied by the following elementary result.
Theorem 4.2. The function $\phi \colon [0, \infty) \to \mathbb{R}$ defined by
\[
\phi(x) = \begin{cases} 0 & \text{if } x = 0 \\ x \log x & \text{if } x \neq 0 \end{cases}
\]
is strictly convex, i.e., $\phi(\alpha x + \beta y) \leq \alpha \phi(x) + \beta \phi(y)$ if $x, y \in [0, \infty)$, $\alpha, \beta \geq 0$, $\alpha + \beta = 1$; with equality only when $x = y$ or $\alpha = 0$ or $\beta = 0$. More generally,
\[
\phi\!\left(\sum_{i=1}^{k} \alpha_i x_i\right) \leq \sum_{i=1}^{k} \alpha_i \phi(x_i)
\]
if $x_i \in [0, \infty)$, $\alpha_i \geq 0$, $\sum_{i=1}^{k} \alpha_i = 1$; and equality holds only when all the $x_i$ corresponding to non-zero $\alpha_i$ are equal.
Proof. We have
\[
\phi'(x) = 1 + \log x, \qquad \phi''(x) = \frac{1}{x} > 0 \text{ on } (0, \infty).
\]
Fix $\alpha, \beta$ with $\alpha > 0$, $\beta > 0$, $\alpha + \beta = 1$. Suppose $y > x$. By the mean value theorem $\phi(y) - \phi(\alpha x + \beta y) = \phi'(z)\alpha(y - x)$ for some $z$ with $\alpha x + \beta y < z < y$, and $\phi(\alpha x + \beta y) - \phi(x) = \phi'(w)\beta(y - x)$ for some $w$ with $x < w < \alpha x + \beta y$. Since $\phi'' > 0$, we have $\phi'(z) > \phi'(w)$ and hence
\[
\beta\bigl(\phi(y) - \phi(\alpha x + \beta y)\bigr) = \phi'(z)\alpha\beta(y - x) > \phi'(w)\alpha\beta(y - x) = \alpha\bigl(\phi(\alpha x + \beta y) - \phi(x)\bigr).
\]
Therefore $\phi(\alpha x + \beta y) < \alpha\phi(x) + \beta\phi(y)$ if $x, y > 0$ and $x \neq y$. It clearly holds also when $x = 0$ or $y = 0$, so the strict inequality is valid whenever $x, y \geq 0$ and $x \neq y$. By induction we get the general inequality. $\square$
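As a quick numerical check of the strict convexity just proved (an illustration we add here, with natural logarithms assumed):

```latex
Take $x = 1$, $y = 4$, $\alpha = \beta = \tfrac{1}{2}$. Then
\[
\phi\!\left(\tfrac{1+4}{2}\right) = \tfrac{5}{2}\log\tfrac{5}{2} \approx 2.29,
\qquad
\tfrac{1}{2}\phi(1) + \tfrac{1}{2}\phi(4) = 0 + 2\log 4 \approx 2.77,
\]
so $\phi(\alpha x + \beta y) < \alpha\phi(x) + \beta\phi(y)$, as asserted
for $x \neq y$.
```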
Corollary 4.2.1. If $\xi = \{A_1, \ldots, A_k\}$ is a partition, then $H(\xi) \leq \log k$, and $H(\xi) = \log k$ only when $m(A_i) = 1/k$ for all $i$.

Proof. Put $\alpha_i = 1/k$ and $x_i = m(A_i)$, $1 \leq i \leq k$, in Theorem 4.2: then $\phi\bigl(\tfrac{1}{k}\bigr) = -\tfrac{1}{k}\log k \leq \tfrac{1}{k}\sum_{i=1}^{k} \phi(m(A_i)) = -\tfrac{1}{k}H(\xi)$, i.e. $H(\xi) \leq \log k$, with equality only when all the $m(A_i)$ are equal. $\square$

§4.3 Conditional Entropy

Conditional entropy is not required in order to give the definition of the entropy of a transformation. It is useful in deriving properties of entropy, and we discuss it now before we consider the entropy of a transformation. Let $\mathcal{A}$, $\mathcal{C}$ be finite sub-$\sigma$-algebras of $\mathcal{B}$ and let $\xi(\mathcal{A}) = \{A_1, \ldots, A_k\}$, $\xi(\mathcal{C}) = \{C_1, \ldots, C_p\}$ denote the corresponding partitions.

Definition 4.7. The entropy of $\mathcal{A}$ given $\mathcal{C}$ is the number
\[
H(\mathcal{A}/\mathcal{C}) = -\sum_{j} m(C_j) \sum_{i} \frac{m(A_i \cap C_j)}{m(C_j)} \log \frac{m(A_i \cap C_j)}{m(C_j)},
\]
omitting the $j$-terms when $m(C_j) = 0$.

So to get $H(\mathcal{A}/\mathcal{C})$ one considers $C_j$ as a measure space with normalized measure and calculates the entropy of the partition of the set $C_j$ induced by $\xi(\mathcal{A})$ (this gives $-\sum_i \frac{m(A_i \cap C_j)}{m(C_j)} \log \frac{m(A_i \cap C_j)}{m(C_j)}$), and then averages the answer taking into account the size of $C_j$. ($H(\mathcal{A}/\mathcal{C})$ measures the uncertainty about the outcome of $\mathcal{A}$ given that we will be told the outcome of $\mathcal{C}$.)

Let $\mathcal{N}$ denote the $\sigma$-algebra $\{\emptyset, X\}$. Then $H(\mathcal{A}/\mathcal{N}) = H(\mathcal{A})$. (Since $\mathcal{N}$ represents the outcome of the trivial experiment, one gains nothing from knowledge of it.)

Remarks

(1) $H(\mathcal{A}/\mathcal{C}) \geq 0$.

(2) If $\mathcal{A} \doteq \mathcal{D}$ then $H(\mathcal{A}/\mathcal{C}) = H(\mathcal{D}/\mathcal{C})$.

(3) If $\mathcal{C} \doteq \mathcal{D}$ then $H(\mathcal{A}/\mathcal{C}) = H(\mathcal{A}/\mathcal{D})$.
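The averaging description of $H(\mathcal{A}/\mathcal{C})$ can be made concrete with a small example (ours, not the book's):

```latex
Let $X = [0,1)$ with Lebesgue measure $m$, let $\xi(\mathcal{A})$ be the
partition into the four quarters $[\tfrac{i}{4}, \tfrac{i+1}{4})$,
$0 \leq i \leq 3$, and let $\xi(\mathcal{C})$ consist of the two halves
$C_1 = [0, \tfrac{1}{2})$, $C_2 = [\tfrac{1}{2}, 1)$. Inside each $C_j$,
with normalized measure, the partition induced by $\xi(\mathcal{A})$ has
two sets of conditional measure $\tfrac{1}{2}$, so its entropy is
$\log 2$. Averaging,
\[
H(\mathcal{A}/\mathcal{C}) = m(C_1)\log 2 + m(C_2)\log 2 = \log 2:
\]
knowing which half occurred leaves exactly $\log 2$ of the original
uncertainty $H(\mathcal{A}) = \log 4$.
```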
Theorem 4.3. Let $(X, \mathcal{B}, m)$ be a probability space. If $\mathcal{A}$, $\mathcal{C}$, $\mathcal{D}$ are finite sub-algebras of $\mathcal{B}$, then:

(i) $H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D})$.
(ii) $H(\mathcal{A} \vee \mathcal{C}) = H(\mathcal{A}) + H(\mathcal{C}/\mathcal{A})$.
(iii) $\mathcal{A} \subseteq \mathcal{C} \Rightarrow H(\mathcal{A}/\mathcal{D}) \leq H(\mathcal{C}/\mathcal{D})$.
(iv) $\mathcal{A} \subseteq \mathcal{C} \Rightarrow H(\mathcal{A}) \leq H(\mathcal{C})$.
(v) $\mathcal{C} \subseteq \mathcal{D} \Rightarrow H(\mathcal{A}/\mathcal{C}) \geq H(\mathcal{A}/\mathcal{D})$.
(vi) $H(\mathcal{A}) \geq H(\mathcal{A}/\mathcal{C})$.
(vii) $H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) \leq H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{D})$.
(viii) $H(\mathcal{A} \vee \mathcal{C}) \leq H(\mathcal{A}) + H(\mathcal{C})$.
(ix) If $T$ is measure-preserving then $H(T^{-1}\mathcal{A}/T^{-1}\mathcal{C}) = H(\mathcal{A}/\mathcal{C})$; and
(x) $H(T^{-1}\mathcal{A}) = H(\mathcal{A})$.

(The reader should think of the intuitive meaning of each statement. This enables one to remember these results easily.)
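Statement (ii), for instance, can be verified directly on a small example (our illustration, not from the text):

```latex
Let $X = \{1,2,3,4\}$ with $m(\{x\}) = \tfrac{1}{4}$ for each point,
$\xi(\mathcal{A}) = \{\{1,2\},\{3,4\}\}$ and
$\xi(\mathcal{C}) = \{\{1,3\},\{2,4\}\}$. Then
$\xi(\mathcal{A} \vee \mathcal{C})$ consists of the four singletons, so
$H(\mathcal{A} \vee \mathcal{C}) = \log 4$. Also $H(\mathcal{A}) = \log 2$,
and within each atom of $\mathcal{A}$ the partition $\xi(\mathcal{C})$
splits the atom into two sets of conditional measure $\tfrac{1}{2}$, so
$H(\mathcal{C}/\mathcal{A}) = \log 2$. Hence
\[
H(\mathcal{A} \vee \mathcal{C}) = \log 4 = \log 2 + \log 2
  = H(\mathcal{A}) + H(\mathcal{C}/\mathcal{A}),
\]
in accordance with (ii). (Here $H(\mathcal{C}/\mathcal{A}) = H(\mathcal{C})$,
the equality case of (vi): the two experiments are independent, so
knowledge of $\mathcal{A}$ does not reduce the uncertainty about
$\mathcal{C}$.)
```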
Proof. Let $\xi(\mathcal{A}) = \{A_i\}$, $\xi(\mathcal{C}) = \{C_j\}$, $\xi(\mathcal{D}) = \{D_k\}$, and assume, without loss of generality, that all sets have strictly positive measure (since if $\xi(\mathcal{A}) = \{A_1, \ldots, A_k\}$ with $m(A_i) > 0$ for $1 \leq i \leq r$ and $m(A_i) = 0$ for $r < i \leq k$, we can replace $\xi(\mathcal{A})$ by $\{A_1, \ldots, A_{r-1}, A_r \cup A_{r+1} \cup \cdots \cup A_k\}$; see remarks (2), (3) above).
(i)
\[
H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = -\sum_{i,j,k} m(A_i \cap C_j \cap D_k) \log \frac{m(A_i \cap C_j \cap D_k)}{m(D_k)}.
\]
But
\[
\frac{m(A_i \cap C_j \cap D_k)}{m(D_k)} = \frac{m(A_i \cap C_j \cap D_k)}{m(A_i \cap D_k)} \cdot \frac{m(A_i \cap D_k)}{m(D_k)}
\]
unless $m(A_i \cap D_k) = 0$, and then the left-hand side is zero and we need not consider it; and therefore
\[
H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = -\sum_{i,j,k} m(A_i \cap C_j \cap D_k) \log \frac{m(A_i \cap D_k)}{m(D_k)} - \sum_{i,j,k} m(A_i \cap C_j \cap D_k) \log \frac{m(A_i \cap C_j \cap D_k)}{m(A_i \cap D_k)}
\]
\[
= -\sum_{i,k} m(A_i \cap D_k) \log \frac{m(A_i \cap D_k)}{m(D_k)} + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D}) = H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D}).
\]
(ii) Put $\mathcal{D} = \mathcal{N} = \{\emptyset, X\}$ in (i).
(iii) By (i),
\[
H(\mathcal{C}/\mathcal{D}) = H(\mathcal{A} \vee \mathcal{C}/\mathcal{D}) = H(\mathcal{A}/\mathcal{D}) + H(\mathcal{C}/\mathcal{A} \vee \mathcal{D}) \geq H(\mathcal{A}/\mathcal{D}).
\]
(iv) Put $\mathcal{D} = \mathcal{N}$ in (iii).
(v) Fix $i$ and $j$ and let
\[
\alpha_k = \frac{m(D_k \cap C_j)}{m(C_j)}, \qquad x_k = \frac{m(A_i \cap D_k)}{m(D_k)}.
\]
Then by Theorem 4.2
\[
\phi\!\left(\sum_k \frac{m(D_k \cap C_j)}{m(C_j)} \cdot \frac{m(A_i \cap D_k)}{m(D_k)}\right) \leq \sum_k \frac{m(D_k \cap C_j)}{m(C_j)}\, \phi\!\left(\frac{m(A_i \cap D_k)}{m(D_k)}\right),
\]
but since $\mathcal{C} \subseteq \mathcal{D}$ the left-hand side equals
\[
\phi\!\left(\frac{m(A_i \cap C_j)}{m(C_j)}\right) = \frac{m(A_i \cap C_j)}{m(C_j)} \log \frac{m(A_i \cap C_j)}{m(C_j)}.
\]
Multiply both sides by $m(C_j)$ and sum over $i$ and $j$ to give
\[
\sum_{i,j} m(A_i \cap C_j) \log \frac{m(A_i \cap C_j)}{m(C_j)} \leq \sum_{i,j,k} m(D_k \cap C_j) \frac{m(A_i \cap D_k)}{m(D_k)} \log \frac{m(A_i \cap D_k)}{m(D_k)}
\]