
Integrating Differentials in Thermodynamics

“My bad!”, he said.

I’ve just realised I made a mistake when teaching my statistical physics course last term. Fortunately it was a minor and careless maths mistake, rather than any lack of physics clarity. But it’s time to set the record straight!

Often in thermodynamics you will derive equations in differential form. For example, you might be given some equations of state and asked to derive the entropy of a system using the first law

\displaystyle dE = TdS - pdV

My error pertained to exactly such a situation. My students had derived the equation

\displaystyle dS = (V/E)^{1/2}dE+(E/V)^{1/2}dV

and were asked to integrate this up to find S. Naively you might simply integrate each term separately and add the answers. But of course this is wrong! Or more precisely, this is only correct if you get the limits of integration exactly right.

Let’s return to my cryptic comment about limits of integration later, and for now I’ll recap the correct way to go about the problem. There are four steps.

1. Rewrite it as a system of partial DEs

This is easy – we just have

\displaystyle \partial S/\partial E = (V/E)^{1/2} \textrm{ and } \partial S / \partial V = (E/V)^{1/2}

2. Integrate w.r.t. E adding an integration function g(V)

Again we do what it says on the tin, namely

\displaystyle S(E,V) = 2 (EV)^{1/2} + g(V)

3. Substitute in the \partial S/\partial V equation to derive an ODE for g

Differentiating the expression from step 2 with respect to V gives

\displaystyle \partial S/\partial V = (E/V)^{1/2} + dg/dV

Comparing with the equation for \partial S/\partial V above, we get dg/dV = 0 in this case, easy as.

4. Solve this ODE and write down the full answer

Immediately we know that g is just a constant function, so we can write

\displaystyle S(E,V) = 2 (EV)^{1/2} + \textrm{const}

Contrast this with the wrong answer from naively integrating up and adding each term. This would have produced 4(EV)^{1/2}, a factor of 2 out!
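
If you'd like to double check the algebra, here's a quick sympy sketch. It's just my own sanity check of the two partial derivatives and of the naive answer, not part of the derivation itself.

```python
# Sanity check of the four-step answer with sympy (a sketch, not part of the derivation).
import sympy as sp

E, V = sp.symbols('E V', positive=True)

# Candidate answer from step 4, dropping the additive constant.
S = 2 * sp.sqrt(E * V)

# Both partial derivatives should reproduce the original differential dS.
assert (sp.diff(S, E) - sp.sqrt(V / E)).equals(0)
assert (sp.diff(S, V) - sp.sqrt(E / V)).equals(0)

# The naive "integrate each term and add" approach double-counts:
naive = sp.integrate(sp.sqrt(V / E), E) + sp.integrate(sp.sqrt(E / V), V)
print(sp.simplify(naive))  # prints 4*sqrt(E)*sqrt(V), a factor of 2 too big
```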

So what of my mysterious comment about limits above? Well, because dS is an exact differential, we know that we can integrate it over any path and will get the same answer. This path independence is an important reason that the entropy is a genuine physical quantity, whereas there's no absolute notion of heat. In particular we can find S by integrating along the E' axis to E' = E, then in the V' direction from (E',V')=(E,0) to (E',V')=(E,V).

Mathematically this looks like

\displaystyle S(E,V) = \int_{(0,0)}^{(E,V)} dS = \int_{(0,0)}^{(E,0)}(V'/E')^{1/2}dE' + \int_{(E,0)}^{(E,V)}(E'/V')^{1/2}dV'

The first integral now gives 0 since V=0 along the E axis. The second term gives the correct answer S(E,V) = 2(EV)^{1/2} as required.
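
Again purely as a check (and certainly not needed for the physics), here's the same two-leg path integral done with sympy; the symbol names are my own, with Vp standing in for V'.

```python
# Check of the two-leg path integral with sympy (my own sketch; Vp stands for V').
import sympy as sp

E, V, Vp = sp.symbols('E V Vp', positive=True)

# Leg 1: along the E' axis we have V' = 0, so (V'/E')^(1/2) dE' contributes nothing.
leg1 = 0

# Leg 2: from (E, 0) to (E, V), integrating (E/V')^(1/2) dV'.
leg2 = sp.integrate(sp.sqrt(E / Vp), (Vp, 0, V))

print(sp.simplify(leg1 + leg2))  # 2*sqrt(E)*sqrt(V), i.e. S = 2*(EV)^(1/2)
```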

In case you want a third way to solve this problem correctly, check out this webpage, which provides another means of integrating differentials correctly!

So there you have it – your health warning for today. Take care when integrating your differentials!

Counting Microstates with Lateral Thinking

I’ve spent some time this morning stuck on the following undergraduate problem

Consider a quantum system with N distinguishable particles, each of which can have energy n\epsilon for a non-negative integer n. Show that the number of microstates consistent with a macrostate of energy E is given by the binomial coefficient

\displaystyle \binom{E/\epsilon + N - 1}{N - 1}

This is a classic example of a problem which needs some lateral thinking! It’s pretty trivial when you get the right idea, but I think it’s not entirely obvious at first. I’ll share with you my (embarrassingly slow) reasoning – let me know whether you agree with my philosophy in the comments.

To start with, I tried some examples – attempting to share a total energy of 5\epsilon among 3 particles, for instance. My approach was to work out how I could partition 5 as a sum of smaller numbers, then evaluate the number of possible configurations consistent with each partition.

More concretely, I had a configuration where the particles had energies \{0,1,4\} (in units of \epsilon). There are 3! = 3\times 2 = 6 such arrangements, because the particles can be distinguished (think of them as different coloured balls, if you like).

That’s all very well, but generalizing this idea is tricky. The problem is that the total energy constraint \sum E_i = E makes it hard to enumerate all possible configurations. So I sat, stumped, for a good few minutes.

But thankfully, my failure contained a vital clue. My difficulties lay with that irritating total energy constraint. What if I could remove it from the problem altogether?

To do this requires a bit of lateral thinking. We’ve been trying to fit particles into energy levels. But you can turn the problem around. Equivalently we can try to distribute the E/\epsilon units of energy among the particles. This effectively trivializes the troublesome constraint.

We’re not quite out of the woods yet. We need to work out how to distribute the energy blocks into the particle buckets. Here a second piece of lateral thinking helps us out. Rather than throwing the energy into buckets, we can think of partitioning it into sections. It’s just like being at the supermarket till – different customers (particles) separate their shopping (energy) with plastic dividers.

So how many ways can we divvy up the shopping on the conveyor belt? Well, there are N customers so we’ll need N-1 dividers. We’ve also got E/\epsilon items ready to be bought. This means that you have

\displaystyle (E/\epsilon + N - 1)!

possible arrangements of the items and dividers. But hang on, every unit of energy looks exactly the same. It’s as if every customer has bought exactly the same product! And clearly it doesn’t matter if you exchange the dividers – the overall partition is unchanged. Taking this into account, the correct number of microstates is

\displaystyle \frac{(E/\epsilon + N - 1)!}{(E/\epsilon)!(N - 1)!}

This is exactly the binomial coefficient in the question above!
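
In case you don't quite trust the dividers-and-shopping argument, here's a quick brute-force check I knocked together (my own addition, not part of the argument above): enumerate every assignment of energies to the distinguishable particles for small N and E, and compare the count with the binomial coefficient.

```python
# Brute-force check of the microstate count for small systems (a sketch, my own addition).
from itertools import product
from math import comb

def count_microstates(N, units):
    """Count the ways N distinguishable particles, each with energy n*epsilon,
    can share a total energy of units*epsilon, by direct enumeration."""
    return sum(1 for ns in product(range(units + 1), repeat=N) if sum(ns) == units)

for N in range(1, 5):
    for units in range(8):  # units = E / epsilon
        assert count_microstates(N, units) == comb(units + N - 1, N - 1)

print("Direct enumeration matches (E/epsilon + N - 1) choose (N - 1).")
```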

Although this problem was pretty simple, there are two important morals. First, always examine a problem from every angle. Second, never completely discard your failed attempts. Chances are they hold vital clues which will point you in the right direction!