
Advances in chemical physics - Prigogine I.

Prigogine I., Rice S.A. Advances in chemical physics - Advision, 1970. - 167 p.

Define the property, for arbitrary complex numbers z_1, ..., z_n and times t_1, ..., t_n,

\alpha = \sum_{k=1}^{n} z_k U(t_k)

Then

\operatorname{Tr} \rho\, \alpha^{+}\alpha = \sum_{j,k=1}^{n} z_k^{*} z_j \operatorname{Tr} \rho\, U^{+}(t_k) U(t_j) \ge 0
Thus Φ(t) satisfies the condition of Bochner's theorem, so that there exists a probability density, or spectral density, R(ω) such that

\Phi(t) = \int_{-\infty}^{+\infty} d\omega\, R(\omega)\, e^{i\omega t} \qquad (170h)

where 0 \le R(\omega) \le 1.
From Bochner’s theorem it is seen that power spectra are everywhere positive and bounded, and furthermore, time-correlation functions have power spectra that can be regarded as probability distribution functions.
The Wiener-Khinchin theorem is a special case of Bochner's theorem applicable to time averages of stationary stochastic variables. Bochner's theorem enables the Wiener-Khinchin theorem to be applied to ensemble-averaged time-correlation functions in quantum mechanics, where it is difficult to think of properties as stochastic processes.
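As a quick numerical illustration of the Wiener-Khinchin relation (this sketch is our own, not from the text; the filter length and sample count are arbitrary choices), the power spectrum of a stationary sequence is estimated from a periodogram and the correlation function is recovered as its inverse transform. The spectrum comes out everywhere nonnegative, as Bochner's theorem requires, and the recovered correlation function is bounded by its value at the origin.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stationary sequence: white noise smoothed by a short moving-average filter.
x = np.convolve(rng.standard_normal(4096), np.ones(8) / 8, mode="same")
x -= x.mean()

# Power spectrum estimated directly from the signal (periodogram).
spectrum = np.abs(np.fft.fft(x)) ** 2 / len(x)

# Wiener-Khinchin: the inverse transform of the power spectrum is the
# (circular) autocorrelation of the sequence.
corr = np.fft.ifft(spectrum).real
corr /= corr[0]                      # normalize so C(0) = 1

assert np.all(spectrum >= 0.0)       # Bochner: spectrum is nonnegative
assert np.abs(corr).max() <= 1.0 + 1e-9
```

The spectrum here plays the role of R(ω) up to normalization: dividing it by its sum would make it a probability distribution over frequencies.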
The power spectrum G(ω) of the normalized time-correlation function, C(t), like any distribution function, can be decomposed into a continuous and a discrete part, G_c(ω) and G_d(ω), respectively:

G(\omega) = G_d(\omega) + G_c(\omega) \qquad (170i)
The discrete part is of the form

G_d(\omega) = \sum_k P_k\, \delta(\omega - \omega_k), \qquad k = 1, \ldots \qquad (170j)

Here {ω_k} is a denumerable set of frequencies and {P_k} is the set of corresponding probabilities (0 \le P_k \le 1 and 0 \le \sum_k P_k \le 1). It is assumed here that the continuous part of the spectrum, G_c(ω), is a continuous, well-behaved function of the frequency, although it is quite possible to find physical G_c(ω) which have singular points. From previous chapters it follows that G(ω) is even in ω.
The normalized time-correlation function can thus be decomposed in a corresponding way,

C(t) = \int_{-\infty}^{+\infty} d\omega\, G_c(\omega)\, e^{i\omega t} + \sum_k P_k \cos \omega_k t \qquad (170k)
In this case C(t) does not have any long-time limit. If the spectrum is entirely continuous, then it follows from the Riemann-Lebesgue lemma that C(t) vanishes as t → ∞. A system is irreversible if and only if all time-correlation functions of properties U(t) (with zero mean) vanish as t → ∞. Consequently, irreversible systems must have continuous spectra. In finite isolated systems, the spectrum is discrete and
C(t) = \sum_k P_k \cos \omega_k t \qquad (170l)
is almost periodic. This is a consequence of Poincaré's theorem. In specialized cases it can be shown that in the thermodynamic limit, N → ∞, V → ∞ such that N/V = const, the discrete spectrum becomes continuous. Irreversibility enters in an asymptotic manner. This is a very important point.
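The contrast between the two cases can be seen in a small numerical sketch (our own illustration; the frequencies, weights, and the Gaussian spectrum are arbitrary choices). A discrete spectrum of the form (170l) yields an almost periodic C(t) whose fluctuations never die out, while a continuous Gaussian spectrum yields C(t) = exp(−t²/2), which vanishes at long times.

```python
import numpy as np

t = np.linspace(0.0, 200.0, 20001)

# Discrete spectrum: a few delta functions -> C(t) = sum_k P_k cos(w_k t).
w = np.array([1.0, np.sqrt(2.0), np.pi / 2.0])   # incommensurate frequencies
P = np.array([0.5, 0.3, 0.2])                     # probabilities summing to 1
C_discrete = np.sum(P[:, None] * np.cos(w[:, None] * t), axis=0)

# Continuous spectrum: a unit Gaussian G_c(w) transforms to C(t) = exp(-t^2/2).
C_continuous = np.exp(-t ** 2 / 2.0)

late = t > 100.0
# The almost periodic C(t) keeps fluctuating at long times ...
assert C_discrete[late].std() > 0.3
# ... while the continuous-spectrum C(t) has decayed to zero.
assert np.abs(C_continuous[late]).max() < 1e-6
```

In a finite simulated system the first behavior is, strictly speaking, the only possible one; decay is recovered only in the asymptotic thermodynamic limit described above.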
Computer experiments on condensed media simulate finite systems and moreover use periodic boundary conditions. The effect of these boundary conditions on the spectrum of different correlation functions is difficult to assess. Before the long-time behavior of covariance functions can be studied on a computer, there are a number of fundamental questions of this kind that must be answered.
The normalized memory function \hat{K}(t) = K(t)/K(0) is the characteristic function of the probability distribution

P(\omega) = \frac{K'(\omega)}{\mu_0} \qquad (170m)

The moments of P(ω) are consequently

\langle \omega^n \rangle = \int_{-\infty}^{+\infty} d\omega\, \omega^n P(\omega) = \mu_0^{-1} \int_{-\infty}^{+\infty} d\omega\, \omega^n K'(\omega)
From Eqs. (158), (160), and (162) it should be noted that these moments can be related to the sum rules on K'(ω), and that furthermore

(a) \langle \omega^{2n+1} \rangle = 0
(b) \langle \omega^0 \rangle = 1
(c) \langle \omega^2 \rangle = \mu_2/\mu_0
(d) \langle \omega^4 \rangle = \mu_4/\mu_0 \qquad (170n)

where μ_0, μ_2, and μ_4 are the first few sum rules on K'(ω). The first condition follows from the fact that K'(ω) is an even function of ω.
It is often a very complicated problem to compute K'(ω) for a given many-body system. We have devised an approximate method for finding P(ω). For this purpose we define the information measure of a distribution as

S[P(\omega)] = -\int_{-\infty}^{+\infty} d\omega\, P(\omega) \ln P(\omega) \qquad (170o)

The measure S[P(ω)] is called the entropy corresponding to the distribution P(ω). According to information theory, if a certain set of moments of P(ω) are known, that P(ω) is optimum which maximizes S[P(ω)] subject to the moment constraints. Suppose we know only
\langle \omega^0 \rangle = 1, \qquad \langle \omega^2 \rangle = \mu_2/\mu_0 \qquad (170p)
Then we must find that P(ω) for which

\delta S[P(\omega)] = -\delta \int_{-\infty}^{+\infty} d\omega\, P(\omega) \ln P(\omega) = 0 \qquad (170q)

\delta \int_{-\infty}^{+\infty} d\omega\, P(\omega) = 0

\delta \int_{-\infty}^{+\infty} d\omega\, \omega^2 P(\omega) = 0
are satisfied. This problem can be solved using Lagrange multipliers. The optimum P(ω) turns out to be

P(\omega) = \left( \frac{\mu_0}{2\pi\mu_2} \right)^{1/2} \exp\left( -\frac{\mu_0\, \omega^2}{2\mu_2} \right)
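The maximum-entropy solution can be checked numerically. In the sketch below (our own; the sum-rule values μ_0 and μ_2 are arbitrary), the Gaussian is sampled on a grid and its normalization, second moment, and entropy are compared with the analytic values; any other density with the same two moments would have a smaller entropy.

```python
import numpy as np

mu0, mu2 = 1.0, 4.0          # hypothetical sum-rule values on K'(w)
sigma2 = mu2 / mu0           # variance of the optimum distribution

w = np.linspace(-60.0, 60.0, 400001)
dw = w[1] - w[0]

# Maximum-entropy P(w) subject to <w^0> = 1 and <w^2> = mu2/mu0: a Gaussian.
P = np.sqrt(mu0 / (2.0 * np.pi * mu2)) * np.exp(-mu0 * w ** 2 / (2.0 * mu2))

norm = P.sum() * dw                           # <w^0>, should be 1
second = (w ** 2 * P).sum() * dw              # <w^2>, should be mu2/mu0
entropy = -(P * np.log(P + 1e-300)).sum() * dw

assert abs(norm - 1.0) < 1e-6
assert abs(second - sigma2) < 1e-4
# Entropy of a Gaussian is (1/2) ln(2 pi e sigma^2).
assert abs(entropy - 0.5 * np.log(2.0 * np.pi * np.e * sigma2)) < 1e-4
```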
Since the normalized memory function \hat{K}(t) = K(t)/K(0) is the characteristic function of the distribution, it follows that

\hat{K}(t) = \int_{-\infty}^{+\infty} d\omega\, P(\omega)\, e^{i\omega t}

Information theory consequently leads to a normalized memory function which is a Gaussian function of the time,

\hat{K}(t) = \exp\left( -\frac{\mu_2\, t^2}{2\mu_0} \right)

from which it follows that the memory function K(t) is

K(t) = K(0) \exp\left( -\frac{\mu_2\, t^2}{2\mu_0} \right)
This approximation will be very useful in the following sections. It should be noted that higher-order moments could have been used to generate higher-order approximations.
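To see how such a Gaussian memory function is used, the following sketch (our own illustration; K(0), μ_0, and μ_2 are arbitrary values, and the standard memory-function equation dC/dt = −∫₀ᵗ K(s) C(t−s) ds for the normalized correlation function is assumed) integrates that equation forward in time with a simple first-order scheme.

```python
import numpy as np

# Hypothetical values for K(0) and the sum rules mu0, mu2 on K'(w).
K0, mu0, mu2 = 1.0, 1.0, 2.0
dt, n = 0.0025, 4000                 # time step and number of steps
t = dt * np.arange(n)

K = K0 * np.exp(-mu2 * t ** 2 / (2.0 * mu0))   # Gaussian memory function

# March dC/dt = -int_0^t K(s) C(t - s) ds forward with an Euler step
# and a rectangle-rule evaluation of the convolution integral.
C = np.empty(n)
C[0] = 1.0
for i in range(1, n):
    # int_0^{t_{i-1}} K(s) C(t_{i-1} - s) ds
    conv = dt * np.dot(K[:i], C[i - 1::-1])
    C[i] = C[i - 1] - dt * conv

# C(t) relaxes from 1 toward 0 under the Gaussian memory kernel.
assert abs(C[-1]) < 0.01
```

With these parameters the kernel decays faster than C(t), so the correlation function relaxes nearly exponentially; higher-order moment information would refine the kernel, as noted above.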