1) The lecture discusses the fundamental principles of statistical physics, including the statistical matrix, the microcanonical distribution, and entropy. 2) It explains that entropy is defined only up to a constant and equals the logarithm of the number of microscopic configurations corresponding to a given macroscopic state, so it measures how probable that macroscopic state is. 3) The lecture also covers how the entropy grows to its maximum value as the system passes between states on the way to equilibrium, which is the second law of thermodynamics. Although the underlying mechanical equations are time-reversible, in quantum mechanics the order of successive interactions singles out a direction of time, and this must be connected to the increase of entropy.


University of Central Florida Department of Physics

Physics 5524 Statistical Physics


Spring 2016

Lecture 2, January 14
The fundamental principles of statistical physics
The statistical matrix.
A state can be regarded as stationary only if its energy uncertainty $\Delta E$ is much smaller than the distance between the energy levels. Since for macroscopic bodies the level spacings are extremely small, there are no truly stationary states in a macroscopic system.

In quantum statistics, the mean value of a quantity $f$ is expressed through the expansion coefficients $c_n$ of the wave function:

    $\bar f = \sum_{n,m} \overline{c_n c_m^*}\, f_{mn}$.

The averaged products $w_{mn} = \overline{c_n c_m^*}$ form the statistical matrix (density matrix), so that

    $\bar f = \sum_{n,m} w_{mn} f_{nm} = \mathrm{tr}(\hat w \hat f)$,

and the diagonal elements $w_n \equiv w_{nn}$ give the probabilities of the individual states. The logarithm of the distribution function of a subsystem $a$ is a linear function of its energy:

    $\log w_n^{(a)} = \alpha^{(a)} + \beta E_n^{(a)}$.
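The relation $\bar f = \mathrm{tr}(\hat w \hat f)$ above can be checked numerically. A minimal sketch with illustrative matrix elements (not taken from the lecture):

```python
import numpy as np

# Illustrative 2-level system: statistical matrix w_mn (Hermitian,
# trace 1) and an observable f_mn (Hermitian).
w = np.array([[0.7, 0.1 + 0.05j],
              [0.1 - 0.05j, 0.3]])
f = np.array([[1.0, 0.2],
              [0.2, -1.0]])

# Mean value: f_bar = sum_{n,m} w_mn f_nm = tr(w f)
f_bar = np.trace(w @ f).real
print(f_bar)

# Diagonal elements w_nn are the probabilities of the states
probs = np.diag(w).real
print(probs, probs.sum())
```

The imaginary parts of the off-diagonal elements cancel in the trace, so the mean value of a Hermitian observable comes out real, as it must.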
The microcanonical distribution over the numbers of quantum states of the different subsystems $a$ is

    $dw = \text{constant} \times \delta(E - E_0) \prod_a d\Gamma_a$

(equal probability of finding the closed system in any of the states compatible with its energy).

Entropy.
The probability $W(E)\,dE$ that the subsystem will be found in the states with energy in $[E, E+dE]$ is the distribution function $w(E)$ multiplied by the number of states in that interval, $d\Gamma = (d\Gamma/dE)\,dE$. Therefore, the energy probability distribution is

    $W(E) = w(E)\,d\Gamma/dE$.

One should have $\int W(E)\,dE = 1$, and since $W(E)$ has a sharp peak at $E = \bar E$, one can define the width $\Delta E$ of the peak by $W(\bar E)\,\Delta E = 1$, i.e.

    $w(\bar E)\,\Delta\Gamma = 1$,

where $\Delta\Gamma = (d\Gamma/dE)|_{E=\bar E}\,\Delta E$ is the number of states in the interval $\Delta E$ (the degree of broadening of the macroscopic state; $\Delta E$ is the mean fluctuation of the energy of the subsystem). This is similar to classical statistics, where $w(\bar E)\,\Delta p\,\Delta q = 1$. In passing from quantum statistics to the quasi-classical case one puts

    $\Delta\Gamma = \Delta p\,\Delta q / (2\pi\hbar)^s$,

where $s$ is the number of degrees of freedom of the system.
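The quasi-classical rule $\Delta\Gamma = \Delta p\,\Delta q/(2\pi\hbar)^s$ can be illustrated with the 1D harmonic oscillator, whose orbit of energy $E$ encloses a phase-space ellipse of area $2\pi E/\omega$. A sketch assuming units $\hbar = m = \omega = 1$ (the choice of units is mine, not the lecture's):

```python
import math

# 1D harmonic oscillator, hbar = m = omega = 1.  The orbit of energy E
# is the ellipse p^2/2 + q^2/2 = E with semi-axes p_max, q_max.
E = 50.0
p_max = math.sqrt(2.0 * E)          # sqrt(2 m E)
q_max = math.sqrt(2.0 * E)          # sqrt(2 E / (m omega^2))
area = math.pi * p_max * q_max      # enclosed area = 2 pi E / omega

# Number of quantum states below E: phase-space area / (2 pi hbar)^s, s = 1
n_states = area / (2.0 * math.pi)
print(n_states)                     # ~ E / (hbar omega) = 50
```

Dividing the phase-space area by $(2\pi\hbar)^s$ indeed reproduces the quantum count $E/\hbar\omega$ of levels below $E$.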


Since $\Delta p\,\Delta q$ is not dimensionless, for the argument of the logarithm to be dimensionless one must divide $\Delta p\,\Delta q$ by some quantity $a$ with the same dimensions as $\Delta p\,\Delta q$. Thus the entropy $S = \log\Delta\Gamma$ is defined up to a constant $\log a$: only the difference of the entropies in different processes has meaning, not the entropy itself. Therefore the denominator $(2\pi\hbar)^s$ in $\Delta\Gamma$ is a meaningful choice.
Since $\log w(E) = \alpha + \beta E$ is linear in the energy, $\log w(\bar E)$ equals the average value $\langle\log w(E)\rangle$. Therefore the entropy $S = \log\Delta\Gamma = -\log w(\bar E)$ (from $w(\bar E)\,\Delta\Gamma = 1$) can be written as an average value:

    $S = -\langle \log w_n \rangle = -\sum_n w_n \log w_n$.

In classical physics,

    $S = -\int \rho \log\!\left[(2\pi\hbar)^s \rho\right] dp\,dq$.

The same expressions hold for the whole system; since the entropy is additive, $S = \sum_a S_a$.
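The formula $S = -\sum_n w_n \log w_n$ can be evaluated directly. A minimal sketch with illustrative distributions (units $k_B = 1$; the example probabilities are mine):

```python
import math

def entropy(w):
    """Gibbs entropy S = -sum_n w_n log w_n (units of k_B = 1)."""
    return -sum(p * math.log(p) for p in w if p > 0.0)

# Two illustrative distributions over 4 states:
uniform = [0.25] * 4               # maximal uncertainty
peaked  = [0.97, 0.01, 0.01, 0.01] # nearly definite state

print(entropy(uniform))  # log 4, the maximum possible for 4 states
print(entropy(peaked))   # much smaller
```

The uniform distribution gives $S = \log 4$, the logarithm of the number of accessible states, consistent with $S = \log\Delta\Gamma$ when all $\Delta\Gamma$ states are equally probable.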

In partial equilibrium, to calculate the entropy (a quantity that describes the average properties of the body), the subsystems should be considered during times $\Delta t$ much larger than the relaxation times of each subsystem (the smaller the subsystem, the shorter its relaxation time). At shorter times the entropy has no meaning.
Consider the distribution of the energies of the subsystems. Writing $d\Gamma_a = (d\Gamma_a/dE_a)\,dE_a$ in the microcanonical distribution gives

    $dw = \text{constant} \times \delta(E - E_0) \prod_a (d\Gamma_a/dE_a)\,dE_a$.

The functions $\Gamma_a$ and $S$ are functions of the mean energies $\bar E_a$. However, formally they can be regarded as functions of the actual energies $E_a$. Then, in the last equation one can put $d\Gamma_a/dE_a = e^{S_a(E_a)}/\Delta E_a$. Then, putting $S = \sum_a S_a(E_a)$ in the last equation for $dw$, one gets

    $dw = \text{constant} \times e^{S}\,\delta(E - E_0) \prod_a (dE_a/\Delta E_a)$.

The exponent $e^{S}$ is a very rapidly changing function of the energies of the subsystems $E_a$. Compared to this function, the energy dependence of the $\Delta E_a$ is very weak and can be regarded as constant. Therefore,

    $dw = \text{constant} \times e^{S(E_1, E_2, \dots)}\,\delta(E - E_0) \prod_a dE_a$,

so the probability for the system to have energies of the subsystems in the intervals $[E_a, E_a + dE_a]$ is defined by the entropy.
The most probable values of the $E_a$ are the mean values $\bar E_a$. Thus, for given total energy $E = \sum_a E_a = E_0$, the entropy $S(E_1, E_2, \dots)$ has its maximum when $E_a = \bar E_a$ (the largest probability $dw$).
The entropy of a closed system in statistical equilibrium has its largest possible value.
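The maximum of $dw$ at $E_a = \bar E_a$ can be illustrated with a toy model of my own choosing: two identical subsystems of two-level "spins" sharing a fixed total excitation number, a stand-in for the fixed total energy $E = E_1 + E_2$.

```python
from math import comb, log

# Two subsystems of N two-level units share M excitations in total
# (illustrative stand-in for the fixed total energy of a closed system).
N, M = 50, 50

def S(m1):
    # Total entropy S = S1 + S2 = log of the number of microstates
    # compatible with the split (m1, M - m1).
    return log(comb(N, m1)) + log(comb(N, M - m1))

# The probability dw ~ exp(S), so the most probable split maximizes S.
best = max(range(M + 1), key=S)
print(best)   # the equal split m1 = M/2 = 25
```

The entropy, and hence $dw \propto e^{S}$, is sharply peaked at the equal (mean) partition of the energy between the subsystems.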
Since for a closed system $\Delta\Gamma = e^{S}$ is the number of states in the interval $\Delta E$, the mean distance between the levels is

    $D = \Delta E\, e^{-S}$,

so the entropy defines the density of levels in the macroscopic system.
Since the entropy is an additive quantity, the mean distance between the levels decreases exponentially as the system size increases.
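A quick numerical illustration of this exponential decrease, assuming $N$ independent two-level units whose $2^N$ states are spread over a fixed bandwidth $W$ (an illustrative model, not from the lecture):

```python
# N independent two-level units: 2**N states, S = N log 2 is additive.
# If the spectrum spans a fixed bandwidth W, the mean level spacing is
# D = W / 2**N = W * exp(-S), which shrinks exponentially with N.
W = 1.0
for N in (10, 20, 30):
    spacing = W / 2**N
    print(N, spacing)
```

Already at a few dozen particles the spacing is far below any realistic energy uncertainty, which is why macroscopic systems have no resolvable stationary states.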
The law of increase of entropy.
When going to equilibrium, the system passes through different states, the corresponding entropy increases (the system moves to more probable states), and the entropy reaches its maximum when the system reaches equilibrium: the law of increase of entropy (the second law of thermodynamics).
The mechanical equations are invariant under time reversal $t \to -t$. Despite the fact that the entropy increases (and does not decrease, as it would in the reversed motion), processes with decreasing entropy are possible, but they have very low probability.
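The statistical character of the law can be illustrated with the Ehrenfest urn model (a standard toy model, not part of the lecture): the entropy of the macrostate climbs toward its maximum and thereafter only fluctuates, with decreases being brief and improbable.

```python
import math, random

# Ehrenfest urn: N particles in two boxes; each step a random particle
# hops to the other box.  Macrostate entropy: S = log C(N, n_left).
random.seed(0)
N = 100
n_left = N                     # start far from equilibrium: all on the left
S = lambda n: math.log(math.comb(N, n))

S_start = S(n_left)            # log C(100,100) = 0
for _ in range(2000):
    if random.random() < n_left / N:
        n_left -= 1            # a left particle hops right
    else:
        n_left += 1            # a right particle hops left
S_end = S(n_left)
print(S_start, S_end)          # entropy has grown toward log C(N, N/2)
```

Individual steps are reversible, yet the entropy almost always grows, because overwhelmingly more microstates correspond to the near-equal split.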
In quantum mechanics, time reversal is related to complex conjugation of the Schroedinger equation, and hence of the wave function. The change from $\psi$ to $\psi^*$ and from $t$ to $-t$ is reversible: if $\psi(t)$ is a solution, then $\psi^*(-t)$ is a solution as well. However, in quantum mechanics an interaction B with an object that happens after an interaction A with the same object is defined by A. In this sense, the time direction is not reversible. This nonequivalence must be related to the increase of entropy, but the connection is not known so far.
The physical reason for the increase of entropy is unknown (it may be related to the theory of relativity).
The entropy may either increase or remain constant, corresponding to irreversible and reversible processes respectively.
