Statistical Mechanics - Physics at Oregon State University
1.1. INTRODUCTION

gas constant R. In statistical mechanics we use N for the number of particles in the system. The chemical potential is now an energy per particle, and the equations contain the Boltzmann constant kB. We have the following simple relation, R = NAkB, where NA is Avogadro's number, the number of particles in a mole.
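The relation R = NAkB can be checked with a one-line calculation. A minimal sketch (the numerical values are the standard CODATA/SI constants, not taken from the text):

```python
# Recover the Boltzmann constant from the gas constant and Avogadro's number,
# using the relation R = N_A * k_B from the text.
R = 8.314462618        # gas constant, J/(mol K)
N_A = 6.02214076e23    # Avogadro's number, particles per mole
k_B = R / N_A          # Boltzmann constant, J/K

print(f"k_B = {k_B:.6e} J/K")  # ≈ 1.380649e-23 J/K
```

This also shows why kB is the natural constant when equations are written per particle rather than per mole.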
If we want to change the volume of a system we have to apply a pressure p. The work done during a volume change is of the form p∆V. In equilibrium, the pressure is the same in the whole system, unless other forces are present. If we take a system in equilibrium and separate it into two parts, both parts will have the same pressure. Therefore, the pressure is not additive, and it is called an intensive parameter. There is an intensive parameter corresponding to most extensive parameters (the energy, an extensive parameter, is the exception). For example, the chemical potential µ is the partner of N, the magnetic field H pairs with M, etc. These intensive parameters correspond in many cases to the external handles we have on a system and which can be used to define the state of a system. The energy contains terms like pV, µN, and H · M.
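The pairing of intensive and extensive parameters can be summarized in the differential of the internal energy. A standard textbook form (sign conventions for the magnetic term vary between authors) is:

```latex
dU = T\,dS - p\,dV + \mu\,dN + \vec{H}\cdot d\vec{M}
```

Each product pairs one intensive variable (T, p, µ, H) with the change in its conjugate extensive variable (S, V, N, M).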
How is temperature defined?
In thermodynamics we define the temperature T operationally, by how we can measure it. It clearly is an intensive parameter. The corresponding extensive parameter is called the entropy S, and the energy contains a term of the form TS. The transport of this form of energy is called heat. Entropy cannot be measured directly like volume or number of particles, and this is often seen as a major problem in understanding entropy. But there are other extensive quantities that are often measured indirectly too! For example, the value of a magnetic moment follows from the response of a magnetic system to a small change in magnetic field. Measuring a magnetic moment this way is analogous to measuring the entropy by subjecting a system to a small change in temperature. It is true, however, that for all extensive quantities except entropy we are able to find some simple macroscopic physical picture that enables us to understand the meaning of that quantity.
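The analogy above can be made precise with free-energy derivatives. These are standard thermodynamic identities (not taken from this text, and dependent on which potential and sign convention one adopts); with a free energy F(T, H), both quantities appear as responses to a small change in their conjugate intensive variable:

```latex
M = -\left(\frac{\partial F}{\partial H}\right)_{T},
\qquad
S = -\left(\frac{\partial F}{\partial T}\right)_{H}
```

In this sense measuring M by varying H and measuring S by varying T are the same kind of experiment.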
What is the real entropy?
Entropy is associated with randomness or chaos, and these concepts are harder to put into a simple picture. Nevertheless, this is the path followed in statistical mechanics. Entropy is defined first, and then temperature simply follows as the intensive state variable conjugate to the entropy. But how to define entropy is not clear, and there are several approaches. In a technical sense, the quantities defined in statistical mechanics are only entropy analogues. For each proposed definition one needs to show that the resulting equations are equivalent to those derived in thermodynamics, which in the end describes experimental reality. We will only be able to do so in the thermodynamic limit, where the system becomes large. Also, we need to keep in mind that we always