Interpretation of the partition function

During the exercise class we’ve encountered quite a few questions about the partition function and how to use it to compute expectation values. I admit: when studying statistical mechanics a while back, I didn’t get the point of the partition function, and it all seemed like voodoo. Only much later did I learn how to relate the partition function to expectation values of various quantities through probability theory.

We begin by saying that we have a system that may be in many different states \{i\}, with each state having an energy E_i. For example, in the Einstein model these are the harmonic oscillator states, with E_n = \hbar \omega_0(n + 1/2).

Because of Boltzmann statistics, the probability p_i that the system occupies an individual state i is proportional to the Boltzmann factor: p_i \propto \exp(-\beta E_i). Because probabilities must add up to 1, we of course must normalize them:

p_i = \frac{\exp(-\beta E_i)}{\sum_j \exp(-\beta E_j)}.

The expectation value of any quantity X that depends on the system state is the one familiar from probability theory:

\langle X \rangle = \sum_i X_i p_i = \frac{\sum_i X_i\exp(-\beta E_i)}{\sum_i \exp(-\beta E_i)}.
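As a sanity check, this probability-theory average can be evaluated numerically by truncating the infinite sum at a large cutoff (a sketch using the Einstein-model energies with \hbar\omega_0 = 1; the values of \beta and the cutoff N are illustrative):

```python
import numpy as np

beta = 2.0   # inverse temperature (illustrative)
N = 200      # truncation of the infinite sum; the terms decay geometrically

n = np.arange(N)
E = n + 0.5                # E_n = hbar*omega_0*(n + 1/2), with hbar*omega_0 = 1
w = np.exp(-beta * E)      # unnormalized Boltzmann factors
p = w / w.sum()            # normalized probabilities p_i

avg_E = np.sum(E * p)      # <E> as a plain probability-theory average
print(avg_E)
```

Here X_i is taken to be the energy E_i itself; any other state-dependent quantity would be averaged the same way.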

Now in principle, this is all there is to computing averages of thermodynamic quantities: carry out the infinite sum over i and you are done. It turns out, however, that there is a more convenient way to express these expectation values. For that we can observe two facts about derivatives:

  1. \frac{d}{dx} \log f(x) = f'(x)/f(x)
  2. \frac{d}{dx} \exp(a x) = a \exp(a x)

Together these invite us to introduce the partition function Z = \sum_i \exp(-\beta E_i). With this at hand, we notice that

\langle E \rangle = -d \log Z / d\beta.

The trick is that the logarithm gives us the 1/Z denominator, and the derivative with respect to \beta brings down the E_i prefactor in front of each term!
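This identity is easy to check numerically, e.g. by comparing the direct weighted average against a central finite difference of \log Z in \beta (a sketch; the three energy levels below are made up for illustration):

```python
import numpy as np

E = np.array([0.0, 1.0, 2.5])   # made-up energy levels of a small system
beta = 1.3                      # illustrative inverse temperature

def logZ(b):
    """log of the partition function Z(b) = sum_i exp(-b * E_i)."""
    return np.log(np.sum(np.exp(-b * E)))

# Direct probability-theory average
p = np.exp(-beta * E)
p /= p.sum()
avg_E_direct = np.sum(E * p)

# -d log Z / d beta via central finite difference
h = 1e-6
avg_E_deriv = -(logZ(beta + h) - logZ(beta - h)) / (2 * h)

print(avg_E_direct, avg_E_deriv)  # the two agree up to finite-difference error
```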
Other statistical-mechanics expectation values arise in a similar way. Let’s say we want to compute an arbitrary quantity X. We then extend the partition function to be

Z = \sum_i \exp(-\beta E_i - X_i \lambda),

and notice that by the same idea

\langle X \rangle = \left.-\frac{d \log Z}{d \lambda}\right|_{\lambda = 0}.
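The same finite-difference check works for this source-term trick: add the -\lambda X_i term to the exponent and differentiate \log Z at \lambda = 0 (a sketch; the energies E_i and the quantity X_i are made up for illustration):

```python
import numpy as np

E = np.array([0.0, 1.0, 2.5])    # made-up energy levels
X = np.array([1.0, -2.0, 0.5])   # made-up state-dependent quantity X_i
beta = 1.3                       # illustrative inverse temperature

def logZ(lam):
    """log of the extended partition function with source term -lam * X_i."""
    return np.log(np.sum(np.exp(-beta * E - lam * X)))

# Direct probability-theory average of X
p = np.exp(-beta * E)
p /= p.sum()
avg_X_direct = np.sum(X * p)

# -d log Z / d lambda at lambda = 0, via central finite difference
h = 1e-6
avg_X_deriv = -(logZ(h) - logZ(-h)) / (2 * h)

print(avg_X_direct, avg_X_deriv)
```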

And that’s really the only meaning behind the partition function!


P.S. In our specific problem, where E_i = \hbar \omega_0 (i + 1/2), we should notice that e^{-\beta E_i} = e^{-\hbar \omega_0 \beta / 2} \times \left(e^{-\hbar \omega_0 \beta}\right)^i, and compute the sum as a geometric series. Hope that helps.
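Spelled out, the geometric series gives the closed form Z = e^{-\beta\hbar\omega_0/2}/(1 - e^{-\beta\hbar\omega_0}), which a truncated numerical sum reproduces (a sketch with \hbar\omega_0 = 1 and an illustrative \beta):

```python
import numpy as np

beta = 0.7                 # illustrative; hbar*omega_0 = 1 sets the energy scale
n = np.arange(500)         # truncation; the terms fall off geometrically

# Direct (truncated) sum of Boltzmann factors
Z_sum = np.sum(np.exp(-beta * (n + 0.5)))

# Closed form from the geometric series sum_i q^i = 1/(1-q), with q = exp(-beta)
Z_closed = np.exp(-beta / 2) / (1 - np.exp(-beta))

print(Z_sum, Z_closed)
```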


P.P.S. Related to the topic of partition functions, but very much unrelated to the course, there’s a :exploding_head: advanced technique for computing statistical properties of systems—the replica trick.

In short, the idea is to compute the partition function of n independent copies (replicas) of the system, and then to take the limit n\to 0. Magically, this allows one to compute the average of \log Z :exploding_head:.

In the first expression for \langle X \rangle, are you missing a sum over i, or does this disappear (on the right-hand side)?


Fixed (was a typo).