

4. Take the thermodynamic limit and compare with experiment/thermodynamics.

Collective versus random behavior.

For example, if one could solve all equations of motion, one would find r_i(t) and p_i(t) for all particles i = 1, · · · , N. It is then easy to separate collective motion and random motion:

p_i(t) = p_av(t) + δp_i(t)    (1.3)
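As a concrete illustration, here is a minimal sketch in Python (with made-up momentum data; the drift and noise values are arbitrary assumptions) of how the decomposition (1.3) separates collective from random motion, and of how the kinetic energy splits accordingly:

```python
import numpy as np

# A minimal sketch: N particle momenta at one instant, built as a
# collective drift plus random thermal noise (values are arbitrary).
rng = np.random.default_rng(0)
N = 10_000
drift = np.array([1.0, 0.0, 0.0])              # collective motion
p = drift + rng.normal(0.0, 5.0, size=(N, 3))  # p_i = p_av + delta p_i

# Recover the decomposition of Eq. (1.3).
p_av = p.mean(axis=0)   # average (collective) component p_av(t)
delta_p = p - p_av      # random components, which sum to zero by construction

# The kinetic energy splits exactly into a collective and a random part,
# because the cross term vanishes: sum_i delta_p_i = 0.
m = 1.0
E_total = (p ** 2).sum() / (2 * m)
E_collective = N * (p_av ** 2).sum() / (2 * m)
E_random = (delta_p ** 2).sum() / (2 * m)
print(E_total, E_collective + E_random)  # identical up to roundoff
```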

Energy related to this average motion can easily be retrieved. For example, the energy stored in moving air can be used to drive a windmill. The energy stored in the random components δp_i(t) of the motion is much harder to get out. Heat engines have a well-defined upper limit (the Carnot limit) to their efficiency! Also, the amount of data needed to describe these random components is much too large. This is where entropy and temperature play a role. In some sense, entropy measures the amount of randomness in a system.
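For reference, the Carnot limit quoted above is the standard thermodynamic bound (stated here without derivation): an engine operating between a hot reservoir at temperature T_h and a cold reservoir at T_c converts heat to work with efficiency at most

η ≤ η_Carnot = 1 − T_c/T_h

so only part of the thermal energy can ever be extracted as useful work.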

The energy stored in the random components of the motion, δp_i(t), is related to thermal energy. It is hard to recover and hard to measure. The only viable description we have available is a statistical one. We therefore define entropy as a measure of this randomness. Entropy measures everything we cannot easily know, and therefore everything we do not know. Entropy measures non-information.

Entropy first!

In statistical mechanics we define entropy first, and then derive temperature. This is the opposite of the situation in thermodynamics. Randomness involves the number of states available for a system, and states are much easier to count using quantum mechanics. This is the basic reason why quantum statistical mechanics is so much easier than classical statistical mechanics. In a quantum mechanical description we have to count numbers of states; in classical mechanics we have to integrate over regions of phase space, which is harder to describe exactly. Ideas similar to entropy are used in different fields like information theory, where the amount of information in a message can also be related to a concept like entropy. In general, any system which contains many degrees of freedom that we cannot track and are not interested in can be described using a concept similar to entropy for these degrees of freedom!
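As a toy illustration (a sketch, not part of the original notes) of why counting states is so convenient: for N two-state spins, the number of microstates with a given number of up-spins is a binomial coefficient, the entropy is its logarithm, and the same −Σ p ln p formula reappears as the Shannon entropy of information theory:

```python
from math import comb, log

# Boltzmann-style counting: N two-state spins, n of them "up".
N, n = 100, 40
omega = comb(N, n)   # number of microstates with this magnetization
S = log(omega)       # entropy in units of k_B: S = ln(Omega)
print(f"Omega = {omega}, S/k_B = {S:.2f}")

# Shannon entropy of a two-symbol source with p = n/N. For large N,
# Stirling's approximation gives S/(N k_B) -> -p ln p - (1-p) ln(1-p).
p = n / N
H = -(p * log(p) + (1 - p) * log(1 - p))
print(f"S/(N k_B) = {S / N:.4f}, Shannon H = {H:.4f}")
```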

1.3 States of a system.

Ising model.
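The enumeration of states is easiest to see in code. As a minimal sketch (assuming the standard one-dimensional Ising Hamiltonian E = −J Σ s_i s_{i+1} with open ends; the notes may use a different convention), the complete set of 2^N states of a small chain can be listed explicitly:

```python
from itertools import product

# Enumerate all 2^N states of a 1D Ising chain of N spins s_i = +/-1,
# with nearest-neighbor energy E = -J * sum_i s_i * s_{i+1} (open chain).
J, N = 1.0, 4
for state in product((+1, -1), repeat=N):
    E = -J * sum(state[i] * state[i + 1] for i in range(N - 1))
    print(state, "E =", E)
```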
