
Micro-canonical Ensemble

This approach is suitable for obtaining the thermodynamics of a mechanically and thermally isolated system with a fixed number of particles, volume and energy. Consider such an isolated system, defined by $N$, $V$, and $E \le H(p,q) \le E+\delta E$.

Definitions (phase-space volumes):

$\displaystyle \Gamma(E)$ $\textstyle \equiv$ $\displaystyle \int \limits_{E< H(p,q) < E + \delta E} dp  dq  \rho (p,q)$ (9)
$\displaystyle \Sigma(E)$ $\textstyle \equiv$ $\displaystyle \int\limits_{H < E} dp  dq  \rho(p,q)$ (10)

Therefore, writing $\delta$ for $\delta E$,
\begin{displaymath}
\Gamma (E) = \Sigma (E+\delta ) - \Sigma (E),
\end{displaymath} (11)

which could be written as,
$\displaystyle \Gamma(E)$ $\textstyle =$ $\displaystyle \omega (E) \delta,$ (12)
$\displaystyle \mbox{where},\; \; \omega (E)$ $\textstyle =$ $\displaystyle \frac{\partial \Sigma (E)}{\partial E},$ (13)

if $\delta \ll E$. The thermodynamics, for this micro-canonical ensemble approach, is obtained from the following equivalent definitions of the entropy,
$\displaystyle S$ $\textstyle =$ $\displaystyle k \ln \Gamma(E)$ (14)
$\displaystyle S$ $\textstyle =$ $\displaystyle k \ln \omega(E)$ (15)
$\displaystyle S$ $\textstyle =$ $\displaystyle k \ln \Sigma(E).$ (16)

which differ only by additive constants, negligible for large systems. This recipe is meaningful only if the entropy defined this way can be identified with the entropy defined by thermodynamics. Therefore, we must prove that,
  1. $S$ is extensive, and
  2. $S$ is in accordance with the second law of thermodynamics.
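The near-equivalence of the three definitions, eqs. (14)-(16), can be illustrated numerically. The sketch below assumes a model phase-space volume $\Sigma(E) = E^{3N/2}$ (the ideal-gas energy dependence with constants dropped) and sets $k = 1$; all numbers are illustrative.

```python
import math

# Model phase-space volume Sigma(E) = E**(3N/2): the ideal-gas energy
# dependence with constants dropped (an assumption of this sketch, k = 1).
N = 10 ** 6
a = 1.5 * N
E, delta = 2.0, 1e-6          # delta << E

ln_Sigma = a * math.log(E)                        # S = k ln Sigma(E)
ln_omega = math.log(a) + (a - 1) * math.log(E)    # omega = dSigma/dE = a E**(a-1)
ln_Gamma = ln_omega + math.log(delta)             # Gamma = omega * delta

# the three entropies differ by O(ln N) terms, negligible against S ~ N
print(ln_Sigma, ln_omega, ln_Gamma)
```

The differences between the three values are of order $\ln N$, while each entropy itself is of order $N$.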
A. - Consider an isolated system defined by ($N, V, H(p,q)$). Now consider two imaginary subsystems of this system defined as,

\begin{displaymath}
\left.
\begin{array}{c}
N_1, V_1, H_1 (p_1, q_1)\\
N_2, V_2, H_2 (p_2, q_2)
\end{array}
\right\}
\quad N = N_1 + N_2, \quad V = V_1 + V_2, \quad
\langle H \rangle = \langle H_1 \rangle + \langle H_2 \rangle. \end{displaymath}

Let us assume that the interaction energy between the subsystems is very much smaller than either $H_1$ or $H_2$. Then,
\begin{displaymath}
H = H_1 (p_1, q_1) + H_2 (p_2, q_2).
\end{displaymath} (17)

This is true if - i) the range of the inter-particle interaction is small, and ii) the surface-to-volume ratio of each subsystem is small. Now, the entropies of the individual subsystems are given by,
$\displaystyle S_1$ $\textstyle =$ $\displaystyle k \ln \Gamma_1 (E_1) \quad \mbox{in the subspace } (p_1, q_1),$ (18)
$\displaystyle S_2$ $\textstyle =$ $\displaystyle k \ln \Gamma_2 (E_2) \quad \mbox{in the subspace } (p_2, q_2).$ (19)

In the composite space, spanning the entire system, we must have $N = N_1 + N_2$ and $E \le E_1 + E_2 \le E + 2 \delta$, such that the phase-space is spanned by $(p,q) \equiv (\{p_1,q_1\}, \{p_2, q_2\})$. Therefore, the phase-space volume of the whole system is given by,
\begin{displaymath}
\Gamma (E_1 + E_2) = \Gamma_1 (E_1) \Gamma_2 (E_2).
\end{displaymath} (20)

This is very much like the product space of two independent variables (for example, the area element in the $x$-$y$ plane, as shown in figure 2).
Figure: Composite Space
\begin{figure}\begin{center}{\mbox{\epsfig{file=cs1.eps,width=100pt}}}\end{center}\end{figure}
Then, if $\delta$ be the unit of energy, we have,
\begin{displaymath}
\Gamma (E) = \sum_{i= 1}^ {E/ \delta } \Gamma_1 (E_i)  \Gamma_2 (E - E_i),
\end{displaymath} (21)

where the lower bound of $E_i$ is zero. Therefore, the entropy of the whole system is given by,
\begin{displaymath}
S(E,V) = k \ln \sum^{E/ \delta}_{i=1}  \Gamma_1(E_i)   \Gamma_2(E - E_i).
\end{displaymath} (22)

Suppose $\Gamma_1(\bar E_1)  \Gamma_2(\bar E_2)$ is the largest term in the above series, with $E = \bar E_1 + \bar E_2$. Then,
    $\displaystyle \Gamma_1(\bar E_1)   \Gamma_2(\bar E_2)
\le \Gamma(E) \le \frac{E}{\delta}  \Gamma_1(\bar E_1)   \Gamma_2(\bar E_2)$  
    $\displaystyle \Rightarrow
k \ln \left(\Gamma_1(\bar E_1)   \Gamma_2(\bar E_2)\right)
\le k \ln \Gamma(E)
\le k \ln \left(\Gamma_1(\bar E_1)   \Gamma_2(\bar E_2)\right) + k \ln \frac{E}{\delta}.$ (23)

As, $N_1, N_2 \rightarrow \infty$, we have,

\begin{displaymath}
\left.
\begin{array}{cc}
\ln \Gamma_1 \rightarrow \infty & \mbox{as } N_1 \rightarrow \infty, \\
\ln \Gamma_2 \rightarrow \infty & \mbox{as } N_2 \rightarrow \infty,
\end{array}
\right\}
\quad \Rightarrow \quad
S(E, V) = S_1(\bar E_1, V_1) + S_2(\bar E_2, V_2) + O(\ln N).
\end{displaymath}

Since $\ln N$ grows much more slowly than $N$, the last term can be neglected as $N \rightarrow \infty$, and this proves the extensive property of $S$.
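The bound of eq. (23), and the dominance of the largest term, can be illustrated numerically; the power-law densities of states below stand in for the rapidly growing $\Gamma_{1,2}$ of macroscopic subsystems and are an assumption of this sketch.

```python
import math

# Power-law stand-ins for the subsystem phase-space volumes (assumption):
# Gamma1(E) = E**a1, Gamma2(E) = E**a2, with a1, a2 playing the role of ~N.
a1, a2 = 60.0, 90.0
E, delta = 1.0, 1e-3          # total energy and the energy "unit"
n = int(E / delta)

terms = [(i * delta) ** a1 * (E - i * delta) ** a2 for i in range(1, n)]
ln_sum = math.log(sum(terms))                # k ln Gamma(E), with k = 1
ln_max = math.log(max(terms))                # largest term of eq. (21)

# eq. (23): ln_max <= ln_sum <= ln_max + ln(E/delta)
print(ln_sum - ln_max, math.log(E / delta))
```

The gap between $\ln$ of the sum and $\ln$ of the largest term stays below $\ln(E/\delta)$, confirming that the logarithm of the whole convolution is controlled by its maximal term.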

Moreover, the energies of the subsystems take the definite values $\bar E_1$ and $\bar E_2$, namely the values that maximize $\Gamma_1(E_1)   \Gamma_2(E_2)$ subject to $E_1 + E_2 = E$. Therefore,

\begin{displaymath}
\delta [\Gamma_1(\bar E_1) \Gamma_2(\bar E_2)]= 0 , \quad \mbox{with}\quad
\delta E_1 + \delta E_2 = 0.
\end{displaymath} (24)

Therefore, it can be seen that,
\begin{displaymath}
\delta \ln [\Gamma_1(\bar E_1) \Gamma_2(\bar E_2)]= 0,
\end{displaymath} (25)

which implies that,
\begin{displaymath}
\left. \frac{\partial S_1(E_1)}{\partial E_1} \right\vert _{E_1 = \bar E_1} =
\left. \frac{\partial S_2(E_2)}{\partial E_2} \right\vert _{E_2 = \bar E_2},
\end{displaymath} (26)

or, \fbox{$T_1^{-1} = T_2 ^{-1}$}, since $\frac{1}{T} = \frac{ \partial S}{\partial E}$. Therefore, we conclude that the two subsystems must have the same temperature. If this temperature scale is chosen to be the Kelvin scale, the constant $k$ is identified with the Boltzmann constant: \fbox{$ k = k_B$}. Evidently, in an isolated system, $T$ is the parameter which governs the equilibrium between one part of the system and another.
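The equality of temperatures can also be seen numerically: the sketch below locates the maximizing $(\bar E_1, \bar E_2)$ for assumed power-law phase-space volumes $\Gamma_i(E_i) = E_i^{a_i}$ (so that $\partial S_i/\partial E_i \propto a_i/E_i$ with $k = 1$) and checks that the two derivatives agree at the maximum.

```python
import math

# Assumed power-law phase-space volumes Gamma_i(E_i) = E_i**a_i, so that
# S_i = a_i ln E_i and dS_i/dE_i = a_i / E_i  (k = 1 throughout).
a1, a2 = 50.0, 150.0
E = 1.0
n = 100000                     # grid resolution for E1

# locate the maximum of ln[Gamma1(E1) * Gamma2(E - E1)] on the grid
best = max(range(1, n),
           key=lambda i: a1 * math.log(i * E / n) + a2 * math.log(E - i * E / n))
E1bar = best * E / n
E2bar = E - E1bar

beta1 = a1 / E1bar             # dS1/dE1 at the maximum
beta2 = a2 / E2bar             # dS2/dE2 at the maximum
print(beta1, beta2)            # nearly equal: the subsystems share a temperature
```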

B. - The second law of thermodynamics states that if an isolated system undergoes a change of thermodynamic state such that the initial and the final states are equilibrium states, then the entropy of the final state is not smaller than the entropy of the initial state.

Consider an isolated system defined by $(N, E, V)$. Since the system is isolated, $N$ and $E$ are fixed, and $V$ can never decrease without external intervention. Therefore, since $V$ can only increase for an isolated system, the entropy, given by,

\begin{displaymath}
S = k_B \ln \int _{H<E} \rho  dp  dq
\end{displaymath} (27)

also can only increase (larger volume $\rightarrow$ larger phase-space volume).

Thermodynamics using Micro-canonical Ensemble - Consider quasi-static thermodynamic transformations, in which $E$ or $V$ vary slowly. The system can then be treated with the micro-canonical formulation at each instant of time (over intervals small compared to the time-scale of the change). The recipe is,

$\displaystyle \Gamma(E)$   $\displaystyle \quad \mbox{computed from} \quad H(p,q),$  
$\displaystyle S$ $\textstyle =$ $\displaystyle k_B \ln \Gamma(E),$ (28)
$\displaystyle U(S,V)$ $\textstyle =$ $\displaystyle \langle H(p,q) \rangle.$ (29)

The other thermodynamic variables like temperature or pressure are obtained the usual way, like, $T = (\frac{\partial U}{\partial S })_V$, $P = - (\frac{\partial U}{\partial V})_S$ and so on.

The Equipartition Theorem -
Ensemble average of $x_i \frac{\partial H}{\partial x _j}$ -

$\displaystyle \langle x_i \frac{\partial H}{\partial x_j}\rangle$ $\textstyle =$ $\displaystyle \frac{1}{\Gamma (E)} \int \limits_{E< H< E+\delta}
dp  dq  x_i  \frac{\partial H}{\partial x_j}$  
  $\textstyle =$ $\displaystyle \frac{\delta}{\Gamma (E)} \frac{\partial}{\partial E} \int _{H<E}
dp  dq  x_i  \frac{\partial H}{\partial x_j}$  
  $\textstyle =$ $\displaystyle \frac{\delta}{\Gamma(E)}  \frac{\partial}{\partial E} \int_{H<E}
dp  dq  x_i  \frac{\partial (H-E)}{\partial x_j}$  
  $\textstyle =$ $\displaystyle \frac{\delta}{\Gamma (E)}  \frac{\partial }{\partial E}
\left[ \oint_{H=E} x_i  (H - E)  dS_j - \int_{H<E} dp  dq  \delta_{ij}  (H - E) \right]$  
  $\textstyle =$ $\displaystyle \frac{\delta}{\Gamma(E)}  \frac{\partial}{\partial E} \int _{H< E}
dp  dq  \delta_{ij}  (E-H)$  
  $\textstyle =$ $\displaystyle \frac{\delta_{ij}}{\omega (E)} \int_{H<E} dp  dq$  
  $\textstyle =$ $\displaystyle \frac{\delta_{ij}}{\omega (E)}  \Sigma(E)
= \delta_{ij}  \frac{\Sigma(E)}{\partial \Sigma / \partial E}
= \delta_{ij} \left( \frac{\partial \ln \Sigma}{\partial E} \right)^{-1}
= \delta_{ij}  k_{\rm B} \left( \frac{\partial S}{\partial E} \right)^{-1}
= \delta_{ij}  k_{\rm B}T$ (30)

Here the surface term vanishes because $H - E = 0$ on the boundary $H = E$, and the last steps use $\Gamma(E) = \omega(E)  \delta$ and $S = k_{\rm B} \ln \Sigma$.

If $x_i = q_i$, then $\sum \limits^{3N}_{i=1} \langle q_i  \dot p_i \rangle = - 3N  k_{\rm B}T$, since $\dot p_i = - \partial H / \partial q_i$. This is the virial theorem.
For any $H= \sum \limits_i A_i p_i^{2} + \sum \limits _i B_i  q_i^2 $, we have,
\begin{displaymath}
\sum \limits_i \left( p_i \frac{\partial H}{\partial p_i}
+ q_i \frac{\partial H}{\partial q_i } \right) = 2 H.
\end{displaymath} (31)

Therefore, by the virial theorem, $\langle H \rangle = \frac{1}{2} f k_{\rm B}T$, where $f$ is the number of (quadratic) degrees of freedom of the system. In general, if $H \propto p^{n}$, then the equipartition of energy gives $\langle H \rangle = \frac{1}{n} f k_{\rm B} T$.
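Equipartition can be checked with a small Monte-Carlo calculation. The sketch below samples momenta in the canonical ensemble (Gaussian momenta), which for large systems yields the same averages as the micro-canonical derivation above; the units $m = k_{\rm B} = T = 1$ are an assumption of the sketch.

```python
import math, random

# Canonical-ensemble Monte-Carlo check of equipartition, <p^2/2m> = kT/2 per
# degree of freedom (units with m = k_B = T = 1 are an assumption).
random.seed(0)
m, kT = 1.0, 1.0
f = 3                          # quadratic degrees of freedom (one free particle)
nsamples = 200000

acc = 0.0
for _ in range(nsamples):
    # Cartesian momenta are Gaussian with variance m*kT in the canonical ensemble
    p = [random.gauss(0.0, math.sqrt(m * kT)) for _ in range(f)]
    acc += sum(pi * pi for pi in p) / (2.0 * m)
avg_H = acc / nsamples

print(avg_H)                   # close to (f/2) kT = 1.5
```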

Specific Heat - $C_V = \frac{\partial U}{\partial T} = \frac{1}{2} fk_{\rm B}$. Therefore, the specific heat is proportional to the number of degrees of freedom. Since a classical system has an infinite number of degrees of freedom, the specific heat calculated this way would diverge. The resolution lies in quantum physics, which tells us that only those degrees of freedom are relevant which are actually excited in a given process.

Quantum Correction to $\Gamma$ - The uncertainty principle of Quantum Mechanics tells us that a canonically conjugate pair of coordinate and momentum cannot be measured simultaneously with arbitrary precision. Hence, each point $(p,q)$ corresponding to a particle in phase-space is not really a point but a cell of volume $h^3$. Therefore, for a system of $N$ particles, the elementary volume of phase-space is $h^{3N}$. Hence, for the correct evaluation of $\Gamma(E)$, which is nothing but the total number of micro-states accessible to the system, the phase-space integral should be divided by $h^{3N}$.
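The $h^{3N}$ rule can be checked against exact quantum state counting in a simple case: for one particle in a one-dimensional box, the classical phase-space volume with $H < E$, divided by $h$, matches the number of quantum levels below $E$. The numbers below (electron mass, box size, energy) are illustrative.

```python
import math

# One particle in a 1-D box of length L: the classical phase-space volume
# with H < E, divided by h, reproduces the number of quantum states with
# E_n = n^2 h^2 / (8 m L^2) below E.  Values are illustrative.
h = 6.626e-34
m = 9.11e-31                      # electron mass, for instance
L = 1e-6
E = 1e-20

# classical: {(q, p): 0 <= q <= L, |p| <= sqrt(2mE)} has volume 2*L*sqrt(2mE)
classical = 2 * L * math.sqrt(2 * m * E) / h

# quantum: E_n < E  holds for  n < sqrt(8 m L^2 E) / h
quantum = math.floor(math.sqrt(8 * m * L * L * E) / h)

print(classical, quantum)         # agree up to integer rounding
```

Since $2L\sqrt{2mE} = \sqrt{8mL^2E}$, the two counts coincide exactly up to the integer rounding of the quantum level number.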

Gibbs Paradox - The Hamiltonian of a classical ideal gas of $N$ particles is $H = \frac{1}{2m} \sum \limits_{i=1}^N p_i^2$. Then,

$\displaystyle \Sigma(E)$ $\textstyle =$ $\displaystyle \frac{1}{h^{3N}} \int \limits_{H \le E} dp  dq$  
  $\textstyle =$ $\displaystyle C_{3N} \left(\frac{V}{h^3}(2mE)^{3/2}\right)^N,$ (32)

where, $C_n = \pi^{n/2}/(n/2)!$ is the volume of the unit $n$-sphere. Therefore, the entropy of the ideal gas is,
\begin{displaymath}
S(E,V) = Nk_{\rm B} \ln \left(V(\frac{4\pi m E}{3 h^2 N})^{3/2} \right) + 3/2 N k_{\rm B}.
\end{displaymath} (33)

This can be written as,
$\displaystyle S$ $\textstyle =$ $\displaystyle Nk_{\rm B} \ln (V u^{3/2}) + Ns_0$ (34)
$\displaystyle u$ $\textstyle =$ $\displaystyle E/N = 3/2  k_{\rm B}T,$ (35)
$\displaystyle s_0$ $\textstyle =$ $\displaystyle 3k_{\rm B}/2 (1 + \ln \frac{4\pi m}{3h^2}).$ (36)
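As an illustration of the micro-canonical recipe above, eq. (33) can be inverted to give $U(S,V)$ explicitly for the ideal gas,
\begin{displaymath}
U(S,V) = \frac{3 h^2 N}{4 \pi m}  V^{-2/3}  \exp \left( \frac{2S}{3Nk_{\rm B}} - 1 \right),
\end{displaymath}
so that
\begin{displaymath}
T = \left( \frac{\partial U}{\partial S} \right)_V = \frac{2U}{3Nk_{\rm B}}
\quad \Rightarrow \quad U = \frac{3}{2} N k_{\rm B} T,
\qquad
P = - \left( \frac{\partial U}{\partial V} \right)_S = \frac{2U}{3V} = \frac{Nk_{\rm B}T}{V},
\end{displaymath}
recovering the ideal gas law $PV = N k_{\rm B} T$.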

Consider two ideal gases, with $N_1$ and $N_2$ particles respectively, kept in two separate volumes $V_1$ and $V_2$ at the same temperature. The change in the entropy of the combined system after the gases are allowed to mix in a volume $V = V_1 + V_2$ is given by,
\begin{displaymath}
\frac{\delta S}{k_{\rm B}} = N_1 \ln \frac{V}{V_1} + N_2 \ln \frac{V}{V_2} \ge 0.
\end{displaymath} (37)

If the two gases are different, this result agrees with experiment. However, if the gases are the same, there should not be any change in the entropy. The paradox was resolved by Gibbs by introducing a factor of $N!$ by which $\Gamma(E)$ should be divided in order to produce the correct result. This comes from the fact that, quantum mechanically, the particles are indistinguishable and therefore the $N!$ permuted microscopic configurations are actually the same state. This is known as the correct Boltzmann counting.
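The role of the $N!$ factor can be made concrete with a short calculation: keeping only the volume-dependent part of the entropy at fixed temperature, mixing two samples of the same gas at equal density gives a spurious positive $\Delta S$ without the correction, and zero with it. The particle numbers and volumes below are illustrative.

```python
import math

# Entropy of mixing per k_B, keeping only the V-dependent part of eq. (34)
# at fixed temperature.  Numbers (equal densities N/V) are illustrative.
N1, V1 = 1000.0, 1.0
N2, V2 = 3000.0, 3.0          # same density N/V as the first sample
N, V = N1 + N2, V1 + V2

# without the N! correction: S/k_B ~ N ln V + (const)*N
dS_wrong = N1 * math.log(V / V1) + N2 * math.log(V / V2)

# with the Gibbs N! correction (Stirling): S/k_B ~ N ln(V/N) + (const)*N
dS_right = (N * math.log(V / N)
            - N1 * math.log(V1 / N1) - N2 * math.log(V2 / N2))

print(dS_wrong, dS_right)     # positive vs. zero for identical gases
```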



Sushan Konar 2004-08-19