
Chapter One

Features of Macroscopic Systems and Basic Probability Concepts

Introduction
• Statistical physics is the study of the collective behavior of large assemblies of
  atoms and molecules using probabilistic reasoning.
• It is a branch of physics that uses probability theory and statistics to solve
  physical problems involving systems composed of a large number of units.
• Its main purpose is to deduce the properties of a system from the statistical
  behavior of its components.
• Statistical Mechanics
  – Uses probability and statistics to calculate macroscopic properties from
    microscopic force laws.
  – Applies to both the classical and the quantum worlds!
  – Links microscopic and macroscopic physics.
  – Contains thermodynamics as a sub-theory!
Microscopic and Macroscopic Systems
• It is useful, at this stage, to make a distinction between the different sizes of the
systems that we are going to examine.
• We shall call a system microscopic if it is roughly of atomic dimensions, or
smaller.
• On the other hand, we shall call a system macroscopic if it is large enough to be
visible in the ordinary sense.
• This is a rather inexact definition. A more precise criterion depends on the number
  of particles in the system, which we shall call N.
• A system is macroscopic if its relative statistical fluctuation 1/√N << 1, which
  means that statistical arguments can be applied to reasonable accuracy.
• For instance, if we wish to keep the relative statistical error below one percent
  (1/√N < 0.01), then a macroscopic system would have to contain more than about ten
  thousand particles.
• Any system containing fewer than this number of particles would be regarded as
  essentially microscopic, and statistical arguments could not be applied to such a
  system without unacceptable error.

• Microscopic: ~ Atomic dimensions ~ ≤ a few Å
• Macroscopic: Large enough to be “visible” in the “ordinary” sense

• An isolated system is in equilibrium when its macroscopic parameters are
  time-independent. This is the usual case in this course!
• A macroscopic system can be characterized by a few macroscopic variables, e.g.
  energy, volume, pressure, temperature, etc., and the dynamics of the system is then
  described in terms of these macroscopic variables.
• The macroscopic variables depend only weakly on the exact 'microstate' of the system.

Probability

• Probability deals with experiments, or any process of observation, that may
  result in random outcomes.
• An experiment is called a random experiment if its outcome cannot be predicted.
Typical examples of a random experiment are
– the roll of a die,
– the toss of a coin,
• The set of all possible outcomes of a random experiment is called the sample
space (or universal set), and it is denoted by S.
S={1, 2, 3, 4, 5, 6} for a die
S = {H, T} for a coin
• An element in S is called a sample point. Each outcome of a random experiment
corresponds to a sample point.
• We suppose that for each event E of an experiment having a sample space S there
  is a number, denoted by P(E), called the probability of E.
 For instance, if we toss a die, there are six possible outcomes {1, 2, 3, 4, 5, 6}.
 It is equally probable that any number from one to six will come up.
 Hence, the probability of any face coming up is the same: p(any face) = 1/6.
 Axioms of probability theory
 AXIOM 1: Probability is a positive number or zero: P(E) ≥ 0.
 AXIOM 2: Probability is less than or equal to one: P(E) ≤ 1, with P(S) = 1.
 AXIOM 3: For any sequence of mutually exclusive events E1, E2, . . .

      P(E1 ∪ E2 ∪ · · ·) = P(E1) + P(E2) + · · ·

 If a given outcome can be reached in two (or more) mutually exclusive ways
  (if one event occurs, the other does not occur), whose probabilities are pA and
  pB, then the probability of that outcome is: pA + pB.
 This is the probability of having either A or B.
 If a given outcome represents the combination of two independent events, whose
  individual probabilities are pA and pB, then the probability of that outcome
  is: pA × pB.
 This is the probability of having both A and B.
Example
 Paint two faces of a die red. When the die is thrown, what is
the probability of a red face coming up?
      p = 1/6 + 1/6 = 1/3
 Throw two normal dice. What is the probability of two sixes
coming up?
      p(2 sixes) = 1/6 × 1/6 = 1/36
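The two die examples above can be checked by exact enumeration. A minimal sketch in Python, using exact fractions (the choice of which two faces are painted red is arbitrary):

```python
# Exact check of the two die examples above, using fractions to avoid rounding.
from fractions import Fraction
from itertools import product

faces = [1, 2, 3, 4, 5, 6]

# A die with two faces painted red: P(red) = 2/6 = 1/3 (addition rule,
# since the two red faces are mutually exclusive outcomes).
red = {1, 2}  # which two faces are red is an arbitrary choice
p_red = Fraction(len(red), len(faces))

# Two normal dice: the throws are independent, so P(six, six) = 1/6 * 1/6.
outcomes = list(product(faces, repeat=2))  # all 36 equally likely pairs
p_double_six = Fraction(sum(1 for r in outcomes if r == (6, 6)), len(outcomes))

print(p_red, p_double_six)  # 1/3 1/36
```

The first result illustrates the addition rule for mutually exclusive events, the second the multiplication rule for independent events.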

The Random Walk & The Binomial Distribution

• To quantitatively introduce probability concepts, we use a specific, simple
  example, which is actually much more general than you might first think. This
  example is called the Random Walk Problem.

Why we study random walk?
• Physical Examples to which the Random Walk Problem applies
– Magnetism: N atoms, each with magnetic moment μ. Each has spin ½. By Quantum
Mechanics, each magnetic moment can point either “up” or “down”. If these are equally
likely, what is the Net magnetic moment of the N atoms?
– Diffusion of a Molecule of Gas: a molecule travels in 3 dimensions with a mean
  distance ℓ between collisions. How far is it likely to have traveled after N
  collisions? Answer using classical mechanics.
 Each time the man takes a step, the probability of its being to the right is p,
  while the probability of its being to the left is q = 1 - p. Note that, in
  general, p ≠ q.
• Let n1 denote the number of steps to the right and n2 the corresponding number of steps to
the left.
• The total number of steps N is simply
N = n1+n2
• After the man has taken N steps, what is the probability of his being located at
  the position x = ml, where m is an integer lying in the range -N ≤ m ≤ N?
Counting the number of events

• Out of three steps, the possible outcomes are:
  – all three to the right,
  – two right and one left,
  – one right and two left,
  – all three to the left.
The number of Distinct ways
 n1 =3 and n2 =0 1
 n1 =2 and n2=1 3
 n1 = 1 and n2 =2 3
 n1 = 0 and n2 = 3 1
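The counts in the table above are binomial coefficients N!/(n1! n2!); a quick sketch using Python's `math.comb`:

```python
# Number of distinct ways to arrange n1 right steps among N = 3 total steps.
from math import comb  # comb(N, n1) = N! / (n1! * (N - n1)!)

N = 3
for n1 in (3, 2, 1, 0):
    n2 = N - n1
    print(n1, n2, comb(N, n1))
# 3 0 1 / 2 1 3 / 1 2 3 / 0 3 1 -- matching the table above
```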
• The results can be generalized to any N total steps, where n1 are to the right
  and n2 to the left; the number of distinct ways (the binomial coefficient) is

      N! / (n1! n2!)

• Now, the probability of any one given sequence of n1 steps to the right and n2
  steps to the left is given simply by multiplying the respective probabilities:

      p^n1 q^n2

• Let W(n1) be the probability of taking n1 steps to the right and n2 = N - n1
  steps to the left in a total of N steps. Then that probability is

      W_N(n1) = [N! / (n1! n2!)] p^n1 q^n2

• This probability function is called the binomial distribution, because it
  represents a typical term encountered in expanding (p + q)^N by the binomial
  theorem.
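The binomial distribution can be sketched directly from this formula; the values of N and p below are arbitrary illustrations:

```python
# Binomial distribution: W_N(n1) = [N!/(n1! n2!)] p^n1 q^n2, with n2 = N - n1.
from math import comb

def w(N, n1, p):
    q = 1.0 - p
    return comb(N, n1) * p**n1 * q**(N - n1)

# The terms are exactly those of the binomial expansion of (p + q)^N = 1,
# so summing over all n1 gives unity.
N, p = 20, 0.3
total = sum(w(N, n1, p) for n1 in range(N + 1))
print(abs(total - 1.0) < 1e-12)  # True
```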
• We recall that the binomial expansion is given by the formula

      (p + q)^N = Σ_{n=0}^{N} [N! / (n! (N - n)!)] p^n q^(N-n)

• The man has performed n1 steps to the right in a total of N steps; his net
  displacement m from the origin is then determined:

      m = n1 - n2 = 2n1 - N

• Thus the probability P_N(m) that the particle is found at position m after N
  steps is the same as W_N(n1), using

      n1 = (N + m)/2,   n2 = (N - m)/2

• Substitution of these relations thus yields

      P_N(m) = {N! / [((N + m)/2)! ((N - m)/2)!]} p^((N+m)/2) q^((N-m)/2)

• In the special case where p = q = ½, this assumes the symmetrical form

      P_N(m) = {N! / [((N + m)/2)! ((N - m)/2)!]} (½)^N

• The figure below illustrates the binomial distribution for the same case where
  p = q = ½, but with the total number of steps N = 20.
• The envelope of these discrete values is a bell-shaped curve: for large N the
  binomial distribution approaches the Gaussian distribution.

 [Figure: Binomial probability distribution for p = q = ½ when N = 20 steps. The
  graph shows the probability W_N(n1) of n1 right steps, or equivalently the
  probability P_N(m) of a net displacement of m units to the right.]
• As another example, suppose that p = q = ½ and N = 20:

      P_20(m) = {20! / [((20 + m)/2)! ((20 - m)/2)!]} (½)^20
 What is the physical significance of this kind of plot?

• The significance of this is that, after N random steps, the probability of a
  particle being a distance of N steps away from the start is very small, and the
  probability of it being at or near the origin is relatively large:

      P_20(20) = [20! / (20! 0!)] (½)^20 ≈ 9.5 × 10^-7

      P_20(0) = [20! / (10!)²] (½)^20 ≈ 1.8 × 10^-1
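These two numbers can be verified directly from the symmetrical form of P_N(m) with p = q = ½ (note that m must have the same parity as N):

```python
# P_N(m) = N!/[((N+m)/2)! ((N-m)/2)!] * (1/2)^N, for p = q = 1/2.
from math import comb

N = 20
def P(m):
    return comb(N, (N + m) // 2) * 0.5**N

print(f"{P(20):.1e}")  # 9.5e-07: ending N steps from the start is very unlikely
print(f"{P(0):.1e}")   # 1.8e-01: being at the origin is the most likely outcome
```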
 The physical significance of this plot is that it shows that
 After N random steps, the probability of the particle being a distance
of N steps away from the origin is very small,
 While the probability of its being located in the vicinity of the origin is
largest.
 In general, this kind of plot gives us the most probable outcome (the mean) and
  other characteristics such as the dispersion.
• Some simple mean values are particularly useful for describing characteristic
  features of the probability distribution P.
• One application of the mean value is, for example, the mean grade of a class of
  students.
Mean Values
• The Binomial Distribution is only one example of a probability
distribution. Now, we’ll begin a discussion of a General Distribution.
• Most of the following is valid for any probability distribution. Let u be a
  variable which can assume any of the M discrete values

      u1, u2, . . . , uM

  with respective probabilities

      P(u1), P(u2), . . . , P(uM).

• The mean (or average) value of u is denoted by ū and is defined by

      ū ≡ Σ_{i=1}^{M} ui P(ui)

• Here the probabilities obey

      Σ_{i=1}^{M} P(ui) = 1.

  This is the so-called normalization condition, satisfied by every probability
  distribution.
• If one measures the values of u from their mean value ū, one considers the
  quantity Δu ≡ u - ū (the deviation from the mean). This quantity shows how much
  u deviates from ū and can be positive or negative. However, its mean is

      <Δu> = <u - ū> = <u> - ū = 0

• This says merely that the mean value of the deviation from the mean vanishes.
  – The mean value of the deviation from the mean is always zero!
• If instead we square the deviation, the result can never be negative and it
  carries new information.
• Now look at (Δu)² = (u - ū)², the square of the deviation from the mean. Its
  mean value is:

      <(Δu)²> = <(u - ū)²> = <u² - 2uū + ū²>
              = <u²> - 2ū<u> + ū² = <u²> - ū²

• This is called the "mean square deviation" (from the mean). It is also known by
  several other (equivalent!) names:
  – the dispersion, or
  – the variance, or
  – the 2nd moment of P(u) about the mean.
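The identity <(Δu)²> = <u²> - ū² holds for any discrete distribution; a minimal sketch with made-up values and probabilities:

```python
# Verify <(Δu)^2> = <u^2> - <u>^2 for an arbitrary discrete distribution.
u = [1.0, 2.0, 5.0, 9.0]   # made-up values
P = [0.1, 0.4, 0.3, 0.2]   # made-up probabilities (normalized to 1)

mean = sum(Pi * ui for Pi, ui in zip(P, u))                    # <u>
mean_sq = sum(Pi * ui**2 for Pi, ui in zip(P, u))              # <u^2>
var_direct = sum(Pi * (ui - mean)**2 for Pi, ui in zip(P, u))  # <(Δu)^2>

print(abs(var_direct - (mean_sq - mean**2)) < 1e-12)  # True
```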
Dispersion

• That quantity is called the second moment of u about its mean, or more simply
  the dispersion of u:

      <(Δu)²> ≡ Σ_{i=1}^{M} P(ui) (ui - ū)²

• This can never be negative, since (ui - ū)² ≥ 0, so that each term in the sum
  contributes a nonnegative number.
• Only if ui = ū for all values ui will the dispersion vanish. The larger the
  spread of values of ui about ū, the larger the dispersion.
• The dispersion thus measures the amount of scatter of the values of the variable
  about its mean value (e.g., scatter in grades about the mean grade of the
  students).
• <(Δu)²> is a measure of the spread of the u values about the mean ū.
• The distribution curve becomes steeper and higher at the mean as the dispersion
  gets smaller.
 The following general relation is often useful in computing the dispersion:

      <(Δu)²> = <u²> - ū²

 Since the left side can never be negative, it also follows that

      <u²> ≥ ū²

• We could also define the nth moment of P(u) about the mean:

      <(Δu)^n> ≡ <(u - ū)^n>

• This is rarely used beyond n = 2.
• A knowledge of the probability distribution function P(u) gives complete
  information about the distribution of the values of u. But a knowledge of only
  a few moments, like knowing just ū and <(Δu)²>, implies only partial, though
  useful, knowledge of the distribution.
• A knowledge of only some moments is not sufficient to determine P(u) uniquely.
  To determine a distribution P(u) uniquely, we need to know all of its moments,
  i.e., the moments for n = 0, 1, 2, 3, . . . ∞.
Calculation of mean values for the random walk problem
• We have found that the probability, in a total of N steps, of making n1 steps to
  the right (and n2 = N - n1 steps to the left) is

      W(n1) = [N! / (n1! (N - n1)!)] p^n1 q^(N-n1)

• Verify the normalization condition, which says that the probability of making
  any number of right steps between 0 and N must be unity:

      Σ_{n1=0}^{N} W(n1) = (p + q)^N = 1^N = 1

• What is the mean number of steps to the right?

      <n1> = Σ_{n1=0}^{N} n1 W(n1)

• Let us consider the purely mathematical problem of evaluating this sum, where p
  and q are regarded as two arbitrary parameters.
• Then one observes that the extra factor n1 can be produced by differentiation,

      n1 p^n1 = p ∂/∂p (p^n1),

  so that

      <n1> = p ∂/∂p [(p + q)^N] = pN(p + q)^(N-1)
 Since p + q = 1, the mean number of right steps becomes

      <n1> = Np

 Clearly, the mean number of left steps is similarly

      <n2> = Nq

 The displacement (measured to the right in units of the step length l) is
  m = n1 - n2.
 Hence we get for the mean displacement

      <m> = <n1> - <n2> = N(p - q)

 If p = q, then <m> = 0.
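The results <n1> = Np and <m> = N(p - q) can be checked against a direct sum over the binomial distribution; N and p below are arbitrary illustrations:

```python
# Direct check of <n1> = Np and <m> = N(p - q) for the binomial distribution.
from math import comb

N, p = 50, 0.7
q = 1.0 - p
w = lambda n1: comb(N, n1) * p**n1 * q**(N - n1)

mean_n1 = sum(n1 * w(n1) for n1 in range(N + 1))
mean_m = sum((2 * n1 - N) * w(n1) for n1 in range(N + 1))  # m = n1 - n2 = 2n1 - N

print(abs(mean_n1 - N * p) < 1e-9)       # True
print(abs(mean_m - N * (p - q)) < 1e-9)  # True
```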
Calculation of the dispersion
• Let us now calculate the dispersion. By definition one has

      <(Δn1)²> = <n1²> - <n1>²

 We already know <n1> = Np. Thus we need to compute <n1²>. Using the same
  differentiation technique as for <n1>, one obtains

      <n1²> = (Np)² + Npq

 Hence the dispersion of n1 is

      <(Δn1)²> = (Np)² + Npq - (Np)² = Npq

• The rms (root-mean-square) deviation is a linear measure of the width of the
  range over which n1 is distributed.
• The root-mean-square (rms) deviation from the mean is defined (in general) as:

      Δ*n1 ≡ [<(Δn1)²>]^½

• For the binomial distribution this is

      Δ*n1 = [Npq]^½  — the distribution width

• Again note that <n1> = Np. So, the relative width of the distribution is:

      Δ*n1 / <n1> = [Npq]^½ / (Np) = (q/p)^½ N^-½

• If p = q, this is:

      Δ*n1 / <n1> = 1/√N = N^-½

• As N increases, the mean value increases like N, but the relative width
  decreases like N^-½.
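The N^-½ shrinking of the relative width is easy to see numerically (p = q = ½ here):

```python
# Relative width of the binomial distribution for p = q = 1/2:
# rms width / mean = sqrt(Npq) / (Np) = N**-0.5.
from math import sqrt

p = q = 0.5
for N in (100, 10_000, 1_000_000):
    rel = sqrt(N * p * q) / (N * p)
    print(N, rel)
# relative width 0.1, 0.01, 0.001 -- shrinking as N^(-1/2)
```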
• One can also compute the dispersion of m, i.e., the dispersion of the net
  displacement to the right.
 Since m = n1 - n2 = 2n1 - N, one obtains

      Δm = m - <m> = 2(n1 - <n1>) = 2 Δn1

 Squaring and averaging:

      <(Δm)²> = 4 <(Δn1)²> = 4Npq

• In particular, for p = q = ½, this gives

      <(Δm)²> = N
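The result <(Δm)²> = 4Npq can likewise be checked by direct summation over the binomial distribution (N and p below are arbitrary):

```python
# Direct check of <(Δm)^2> = 4Npq, with m = 2*n1 - N.
from math import comb

N, p = 30, 0.4
q = 1.0 - p
w = lambda n1: comb(N, n1) * p**n1 * q**(N - n1)

mean_m = sum((2 * n1 - N) * w(n1) for n1 in range(N + 1))
disp_m = sum(((2 * n1 - N) - mean_m)**2 * w(n1) for n1 in range(N + 1))

print(abs(disp_m - 4 * N * p * q) < 1e-9)  # True
```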

Exercise
• Consider the case of N = 100 steps, where p = q = ½.
• What is the mean number of steps to the right?
• What is the mean number of steps to the left?
• What is the mean displacement <m>?
