2.2 Monoatomic ideal gas
We now begin considering ideal gases.
Definition (Ideal gas). An ideal gas is a gas where the particles do not interact with each other.
Of course, this is never true, but we can hope this is a good approximation
when the particles are far apart.
We begin by considering a monoatomic ideal gas. Such a gas is made up of single atoms with no internal structure, so the only energy we have is the kinetic energy, and we get
\[
  H = \frac{\mathbf{p}^2}{2m}.
\]
We just have to plug this into our partition function and evaluate the integral.
We have
\[
  Z_1(V, T) = \frac{1}{(2\pi\hbar)^3} \int \mathrm{d}^3 \mathbf{p}\, \mathrm{d}^3 \mathbf{q}\; e^{-\beta \mathbf{p}^2/2m} = \frac{V}{(2\pi\hbar)^3} \int \mathrm{d}^3 \mathbf{p}\; e^{-\beta \mathbf{p}^2/2m}.
\]
Here $V$ is the volume of the box containing the particle, which is what we obtain when we do the $\mathrm{d}^3 \mathbf{q}$ integral.
This remaining integral is just the Gaussian integral. Recall that we have
\[
  \int \mathrm{d}x\; e^{-ax^2} = \sqrt{\frac{\pi}{a}}.
\]
Using this three times, we find
Proposition. For a monoatomic gas, we have
\[
  Z_1(V, T) = V \left(\frac{mkT}{2\pi\hbar^2}\right)^{3/2} = \frac{V}{\lambda^3},
\]
where we define
Definition (Thermal de Broglie wavelength). The thermal de Broglie wavelength of a gas at temperature $T$ is
\[
  \lambda = \sqrt{\frac{2\pi\hbar^2}{mkT}}.
\]
If we think of the wavelength as the “size” of the particle, then we see that the partition function counts the number of particles we can fit in the volume $V$.
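To get a feel for the numbers, here is a minimal numerical sketch (not part of the notes) evaluating $\lambda$ and $Z_1$; the choice of helium, the temperature and the box size are illustrative assumptions.

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J s
k = 1.380649e-23         # Boltzmann constant, J/K

# Illustrative assumptions: helium-4 atoms at room temperature in a 1 m^3 box
m = 6.646e-27            # mass of a helium-4 atom, kg
T = 300.0                # temperature, K
V = 1.0                  # volume, m^3

# Thermal de Broglie wavelength: lambda = sqrt(2 pi hbar^2 / (m k T))
lam = math.sqrt(2 * math.pi * hbar**2 / (m * k * T))
print(f"lambda ~ {lam:.2e} m")       # ~ 5e-11 m, far smaller than typical interparticle spacing

# Single-particle partition function Z_1 = V / lambda^3
print(f"Z_1 ~ {V / lam**3:.2e}")     # an enormous number of lambda^3-sized cells in the box
```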
We notice that this partition function involves $\hbar$, which is a bit weird since we are working classically, but we will see that the $\hbar$ doesn't appear in the formulas we derive from this.
The generalization to multiple particles is straightforward. If we have $N$ particles, since the partition function is again multiplicative, we have
\[
  Z(N, V, T) = Z_1^N = \frac{V^N}{\lambda^{3N}}.
\]
There is a small caveat at this point. We will later see that this is not quite right.
When we think about the quantum version, if the particles are indistinguishable,
then we would have counted each state $N!$ times, which would give the wrong
answer. Again, this doesn’t affect any observable quantity, so we will put this
issue aside for the moment, until we get to studying the entropy itself, in which
case this N! does matter.
We can similarly define the pressure to be
\[
  p = -\left(\frac{\partial F}{\partial V}\right)_T = \frac{\partial}{\partial V}\left(kT \log Z\right)\bigg|_T.
\]
Then plugging our partition function into this definition, we find
\[
  p = \frac{NkT}{V}.
\]
Rearranging, we obtain
Proposition (Ideal gas law).
\[
  pV = NkT.
\]
Notice that in this formula, the $\lambda$ has dropped out, and there is no dependence on $\hbar$.
Definition (Equation of state). An equation of state is an equation that relates state variables, i.e. variables that depend only on the current state of the system, as opposed to how the system got to that state.
The ideal gas law is an example of an equation of state.
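As a quick sanity check of the differentiation above, here is a small symbolic sketch (my own addition, assuming the form $Z = V^N/\lambda^{3N}$ derived earlier, with $\lambda$ independent of $V$):

```python
import sympy as sp

# Symbols: volume, particle number, Boltzmann constant, temperature, thermal wavelength
V, N, k, T, lam = sp.symbols('V N k T lambda', positive=True)

# N-particle partition function Z = V^N / lambda^(3N), with lambda independent of V
Z = V**N / lam**(3 * N)

# p = d/dV (k T log Z) at fixed T
p = sp.diff(k * T * sp.log(Z), V)
print(sp.simplify(p))    # -> N*k*T/V, i.e. p V = N k T
```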
Let’s now look at the energy of the ideal gas. We can similarly compute
\[
  \langle E \rangle = -\left(\frac{\partial \log Z}{\partial \beta}\right)_V = \frac{3}{2} NkT = 3N \cdot \frac{1}{2} kT.
\]
This is a general phenomenon. Our system has $N$ particles, and each particle has three independent directions it can move in. So there are $3N$ degrees of freedom.
Law (Equipartition of energy). Each degree of freedom of an ideal gas contributes $\frac{1}{2}kT$ to the average energy.
In the next section, we will study gases with internal structure, hence internal degrees of freedom, and each such degree of freedom will again contribute $\frac{1}{2}kT$ to the average energy.
Of course, this law requires some hidden assumptions we do not make precise. If we add a degree of freedom $s$ with a term $s^{5.3} \log(2s + 1)$ in the Hamiltonian, then there is no reason to believe its contribution to the average energy would still be $\frac{1}{2}kT$. We will also see in the next section that if the degree of freedom has some potential energy, then there will be a further contribution to the energy.
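Before moving on, here is a similar symbolic check (again my own sketch, not part of the notes) of the average energy, writing the single-particle partition function in terms of $\beta$:

```python
import sympy as sp

# Symbols
beta, V, N, m, hbar = sp.symbols('beta V N m hbar', positive=True)

# Z = [V (m / (2 pi hbar^2 beta))^(3/2)]^N, i.e. Z_1 = V / lambda^3 with beta = 1/(k T)
Z = (V * (m / (2 * sp.pi * hbar**2 * beta)) ** sp.Rational(3, 2)) ** N

# <E> = -d(log Z)/d(beta)
E = -sp.diff(sp.log(Z), beta)
print(sp.simplify(E))    # -> 3*N/(2*beta), i.e. <E> = (3/2) N k T
```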
There are other quantities of the gas we can compute. We know the average energy of a single particle is
\[
  \frac{\langle \mathbf{p}^2 \rangle}{2m} = \frac{3}{2} kT.
\]
So we have $\langle \mathbf{p}^2 \rangle \sim mkT$. Thus, for a single particle, we have
\[
  |\mathbf{p}| \sim \sqrt{mkT},
\]
and so
\[
  \lambda \sim \frac{h}{|\mathbf{p}|}.
\]
This is the usual formula for the de Broglie wavelength. So our thermal de
Broglie wavelength is indeed related to the de Broglie wavelength.
Finally, we can compute the heat capacity
\[
  C_V = \left(\frac{\partial \langle E \rangle}{\partial T}\right)_V = \frac{3}{2} Nk.
\]
Boltzmann’s constant
Recall that Boltzmann's constant is
\[
  k = 1.381 \times 10^{-23}\ \mathrm{J\,K^{-1}}.
\]
This number shows that we were terrible at choosing units. If we were to invent physics again, we would pick energy to have the same unit as temperature, so that $k = 1$. This number $k$ is just a conversion factor between temperature and energy because we chose the wrong units.
But of course, the units were not chosen randomly in order to mess up our thermodynamics. The units were chosen to relate to scales we meet in everyday life. So it is still reasonable to ask why $k$ has such a small value. We look at the ideal gas law:
\[
  \frac{pV}{T} = Nk.
\]
We would expect that when we plug in some everyday values for the left-hand side, the result would be somewhat sensible, because our ancestors were sane when picking units (hopefully). Indeed, we can put in the numbers
\[
  p = 10^5\ \mathrm{N\,m^{-2}}, \quad V = 1\ \mathrm{m^3}, \quad T = 300\ \mathrm{K},
\]
and we find that the left-hand side is roughly $300\ \mathrm{J\,K^{-1}}$.
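The arithmetic is quick to check; the following short computation (an illustration, not part of the original text) also extracts $N$ itself:

```python
k = 1.381e-23               # Boltzmann constant, J/K
p, V, T = 1e5, 1.0, 300.0   # everyday values: pressure in N/m^2, volume in m^3, temperature in K

lhs = p * V / T             # this equals N k
print(f"pV/T = {lhs:.0f} J/K")          # ~ 333 J/K, a perfectly sensible everyday number
print(f"N = pV/(kT) = {lhs / k:.1e}")   # ~ 2.4e25 particles, a huge number
```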
So what makes $k$ such a tiny number is that $N$ is huge: the number of particles in an everyday amount of matter is of order $10^{23}$ or more. Thus, for $Nk$ to have a sensible value, $k$ must be tiny.
The fact that $k$ is small tells us that everyday lumps of matter contain a lot of particles, and in turn, this tells us that atoms are small.
Entropy
The next thing to study is the entropy of an ideal gas. We previously wrote down
\[
  Z = Z_1^N,
\]
and briefly noted that this isn’t actually right. In quantum mechanics, we know
that if we swap two indistinguishable particles, then we get back the same state,
at least up to a sign. Similarly, if we permute any of our particles, which are
indistinguishable, then we get the same system. We are over-counting the states.
What we really should do is to divide by the number of ways to permute the
particles, namely N!:
\[
  Z = \frac{1}{N!} Z_1^N.
\]
Just as with the constant $h$ in our partition function, this $N!$ doesn't affect any of our observables. In particular, $p$ and $\langle E \rangle$ are unchanged. However, this $N!$ does affect the entropy
\[
  S = \frac{\partial}{\partial T}(kT \log Z).
\]
Plugging the partition function in and using Stirling's formula, we get
\[
  S = Nk \left( \log\left(\frac{V}{N\lambda^3}\right) + \frac{5}{2} \right).
\]
This is known as the Sackur-Tetrode equation.
Recall that the entropy is an extensive property. If we re-scale the system by a factor of $\alpha$, then
\[
  N \mapsto \alpha N, \quad V \mapsto \alpha V.
\]
Since $\lambda$ depends on $T$ only, it is an intensive quantity, and the entropy indeed scales as $S \mapsto \alpha S$. But for this to work, we really needed the $N$ inside the logarithm, and the reason we have the $N$ inside the logarithm is that we had an $N!$ in the partition function.
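To see the extensivity concretely, here is a small numerical sketch (my own, with illustrative values of $N$, $V$ and $\lambda$) that evaluates the Sackur-Tetrode formula before and after re-scaling:

```python
import math

k = 1.380649e-23     # Boltzmann constant, J/K
lam = 5e-11          # thermal de Broglie wavelength, m (illustrative room-temperature value)

def sackur_tetrode(N, V):
    """Entropy S = N k [ log(V / (N lambda^3)) + 5/2 ]."""
    return N * k * (math.log(V / (N * lam**3)) + 2.5)

N, V, alpha = 1e23, 1.0, 2.0
print(sackur_tetrode(alpha * N, alpha * V) / sackur_tetrode(N, V))  # -> 2.0 = alpha

# Dropping the N! would give S = N k [ log(V / lambda^3) + 3/2 ], which does not scale this way.
```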
When people first studied the statistical mechanics of ideal gases, they didn't know about quantum mechanics, and didn't know they should put in the $N!$. The resulting value of $S$ is then not extensive. This leads to the Gibbs paradox.
The actual paradox is as follows: suppose we have a box of gas with entropy $S$. We introduce a partition into the box, so that the two halves have entropies $S_1$ and $S_2$. Then the fact that the entropy is not extensive means
\[
  S \neq S_1 + S_2.
\]
This means that by introducing or removing a partition, we have increased or decreased the entropy, which violates the second law of thermodynamics.
This $N!$, which comes from quantum effects, fixes this problem. This is a case where quantum mechanics is needed to understand something that really should be classical.
Grand canonical ensemble
We now consider the case where we have a grand canonical ensemble, so that we can exchange heat and particles. In the case of a gas, we can easily visualize this as a small open region of the gas, in and out of which particles are allowed to flow freely. The grand canonical ensemble has partition function
\[
  \mathcal{Z}_{\text{ideal}}(\mu, V, T) = \sum_{N=0}^\infty e^{\beta\mu N} Z_{\text{ideal}}(N, V, T) = \sum_{N=0}^\infty \frac{1}{N!}\left(\frac{e^{\beta\mu} V}{\lambda^3}\right)^N = \exp\left(\frac{e^{\beta\mu} V}{\lambda^3}\right).
\]
Armed with this, we can now calculate the average number of particles in our
system. Doing the same computations as before, we have
\[
  N = \frac{1}{\beta} \frac{\partial}{\partial \mu} \log \mathcal{Z} \bigg|_{V, T} = \frac{e^{\beta\mu} V}{\lambda^3}.
\]
So we can work out the value of µ:
\[
  \mu = kT \log\left(\frac{\lambda^3 N}{V}\right).
\]
Now we can use this to get some idea of what the chemical potential actually
means. For a classical gas, we need the wavelength to be significantly less than
the average distance between particles, i.e.
\[
  \lambda \ll \left(\frac{V}{N}\right)^{1/3},
\]
so that the particles are sufficiently separated. If this is not true, then quantum
effects are important, and we will look at them later. If we plug this into the
logarithm, we find that µ < 0.
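A quick numerical sketch (illustrative values, not from the notes) confirms both the classical condition and the sign of $\mu$:

```python
import math

k = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0            # temperature, K
lam = 5e-11          # thermal de Broglie wavelength, m (illustrative room-temperature value)
n = 2.4e25           # number density N/V, m^-3 (roughly atmospheric conditions)

print(f"lambda^3 N/V = {lam**3 * n:.1e}")           # ~ 3e-6 << 1, so the classical treatment is fine
mu = k * T * math.log(lam**3 * n)
print(f"mu = {mu:.2e} J = {mu / (k * T):.1f} kT")   # negative, roughly -13 kT
```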
Remember that µ is defined by
\[
  \mu = \left(\frac{\partial E}{\partial N}\right)_{S, V}.
\]
It might seem odd that we get energy out when we add a particle. But note that this derivative is taken with $S$ fixed. Normally, we would expect adding a particle to increase the entropy. So to keep the entropy fixed, we must simultaneously take energy out of the system, and so $\mu$ is negative.
Continuing our exploration of the grand canonical ensemble, we can look at the fluctuations in $N$, and find
\[
  \Delta N^2 = \frac{1}{\beta^2} \frac{\partial^2}{\partial \mu^2} \log \mathcal{Z}_{\text{ideal}} = N.
\]
So we find that
\[
  \frac{\Delta N}{N} = \frac{1}{\sqrt{N}} \to 0
\]
as $N \to \infty$. So in the thermodynamic limit, the fluctuations are negligible.
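For instance (an illustrative number of my own), with $N$ of order $10^{23}$:

```python
import math

N = 1e23                   # a typical macroscopic particle number
print(math.sqrt(N) / N)    # Delta N / N = 1/sqrt(N) ~ 3e-12, utterly negligible
```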
We can now obtain our equation of state. Recall the grand canonical potential is
\[
  \Phi = -kT \log \mathcal{Z},
\]
and that
\[
  pV = -\Phi.
\]
Since we know $\log \mathcal{Z}$, we can work out what $pV$ is. We find
\[
  pV = kT\, \frac{e^{\beta\mu} V}{\lambda^3} = NkT.
\]
So the ideal gas law is also true in the grand canonical ensemble. Also, cancelling the $V$ from both sides, we see that this determines $p$ as a function of $T$ and $\mu$:
\[
  p = \frac{kT e^{\beta\mu}}{\lambda^3}.
\]