Statistical Mechanics - Physics at Oregon State University


F = N k_B T [ log( n / n_Q(T) ) − 1 ] + F_int(T, N)    (4.49)

F_int(T, N) = −N k_B T log Z_int(T)    (4.50)
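Differentiating (4.50) with respect to temperature gives the internal contributions to entropy and energy explicitly; a sketch, term-by-term consistent with the formulas above:

S_int = −(∂F_int/∂T)_N = N k_B log Z_int(T) + N k_B T (d/dT) log Z_int(T)

U_int = F_int + T S_int = N k_B T² (d/dT) log Z_int(T)

Both quantities simply add to the center-of-mass contributions, since F is a sum of the two parts.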

Experimental access to the internal degrees of freedom.

For the entropy and internal energy we also find formulas that show that these quantities are the sum of the entropy/energy for the motion of the center of mass plus a term pertaining to the internal degrees of freedom. Since we have assumed that the energy of the internal motion does not depend on other particles, F_int does not depend on volume and the ideal gas law pV = Nk_BT remains valid! The heat capacities do change, however, since the internal energy is different. Therefore, the value of the ratio γ is different. As we will see later, the value of this ratio is a direct measure of the number of active degrees of freedom. Hence measuring the pressure-volume relation in adiabatic expansion gives us direct information about the internal structure of the molecules!
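As a concrete illustration of the last point, assuming equipartition (each active degree of freedom contributes (1/2)k_B per particle to C_V, so C_V = (f/2)Nk_B and C_p = C_V + Nk_B), the ratio γ determines f directly. A minimal sketch; the helper names are illustrative, not from the text:

```python
# Sketch: relation between the adiabatic ratio gamma and the number of
# active degrees of freedom f, assuming equipartition: C_V = (f/2) N k_B,
# C_p = C_V + N k_B, gamma = C_p / C_V.

def gamma_from_dof(f):
    """gamma = (f/2 + 1) / (f/2) = (f + 2) / f."""
    return (f + 2) / f

def dof_from_gamma(gamma):
    """Invert the relation: f = 2 / (gamma - 1)."""
    return 2 / (gamma - 1)

# A mono-atomic gas has only translational motion: f = 3, gamma = 5/3.
print(gamma_from_dof(3))
# A diatomic gas at room temperature adds two rotational modes: f = 5, gamma = 7/5.
print(gamma_from_dof(5))
# Measuring gamma in an adiabatic expansion then reveals f:
print(dof_from_gamma(5 / 3))
```

Measuring γ = 5/3 thus tells us only the center-of-mass motion is active, while γ = 7/5 signals two additional internal (rotational) degrees of freedom.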

Do we have them all?

The entropy is the sum of the entropy associated with the center of mass motion and the entropy of the internal state of the molecules. Hence the entropy is now larger than the entropy of mono-atomic molecules. We need more information to describe the state of large molecules, and hence we have more "un-knowledge", which translates into a larger entropy.

How do we know if we have all internal degrees of freedom accounted for? In principle, we do not know, because there could always be hidden degrees of freedom. For example, do we have to separate electrons and nuclei? Neutrons and protons? Quarks? Hence the real value of the entropy could even be larger than what we calculated here. We can also ask the question from an experimental point of view. In that case, the answer is simple: cool the system down to a low temperature. But what is low? Sometimes we see phase transitions at very low temperatures, and we are never sure if we went low enough. For example, the nuclear spins could order below a certain temperature, and the values of such temperatures are small indeed.

4.4 Degenerate gas.

The entropy of an ideal or Boltzmann gas is given by the Sackur-Tetrode formula. The temperature enters this equation through the quantum concentration.
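Numerically, the Sackur-Tetrode entropy per particle, S/(Nk_B) = log(n_Q/n) + 5/2 with n_Q = (Mk_BT/2πℏ²)^(3/2), can be evaluated directly; a sketch, where the particle mass and density (roughly helium at atmospheric number density) are illustrative choices:

```python
import math

# Sketch: Sackur-Tetrode entropy per particle,
#   S / (N k_B) = log(n_Q / n) + 5/2,
# with the quantum concentration n_Q = (M k_B T / (2 pi hbar^2))^(3/2).

kB = 1.380649e-23       # J/K, Boltzmann constant
hbar = 1.054571817e-34  # J s, reduced Planck constant

def entropy_per_particle(T, n, M):
    """S / (N k_B) from the Sackur-Tetrode formula."""
    nQ = (M * kB * T / (2 * math.pi * hbar**2)) ** 1.5
    return math.log(nQ / n) + 2.5

M = 6.6e-27  # kg, roughly the mass of a helium atom (illustrative)
n = 2.7e25   # m^-3, roughly atmospheric number density (illustrative)

print(entropy_per_particle(300.0, n, M))  # positive: classical regime, n << n_Q
print(entropy_per_particle(1e-6, n, M))   # negative: the formula fails at low T
```

Since n_Q ∝ T^(3/2), lowering T drives n_Q below n and the logarithm negative, which is the failure discussed next.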

In the limit T → 0, the entropy is approximately (3/2) N k_B log(T) and approaches −∞, which is obviously incorrect. The Sackur-Tetrode formula is not
