Statistical thermodynamics

Statistical mechanics or statistical thermodynamics[note 1] is a branch of physics that applies probability theory, which contains mathematical tools for dealing with large populations, to the study of the thermodynamic behavior of systems composed of a large number of particles. Statistical mechanics provides a framework for relating the microscopic properties of individual atoms and molecules to the macroscopic bulk properties of materials observed in everyday life, thereby explaining thermodynamics as a natural result of statistics and of classical and quantum mechanics at the microscopic level.

Statistical mechanics provides a molecular-level interpretation of macroscopic thermodynamic quantities such as work, heat, free energy, and entropy. It enables the thermodynamic properties of bulk materials to be related to the spectroscopic data of individual molecules. This ability to make macroscopic predictions based on microscopic properties is the main advantage of statistical mechanics over classical thermodynamics. Both theories are governed by the second law of thermodynamics through the medium of entropy. However, entropy in thermodynamics can only be known empirically, whereas in statistical mechanics, it is a function of the probability distribution of the system over its microstates.

Statistical mechanics was initiated in the 1870s with the work of the Austrian physicist Ludwig Boltzmann, much of which was collectively published in his 1896 Lectures on Gas Theory.[1] Boltzmann's original papers on the statistical interpretation of thermodynamics, the H-theorem, transport theory, thermal equilibrium, the equation of state of gases, and similar subjects occupy about 2,000 pages in the proceedings of the Vienna Academy and other societies. The term "statistical thermodynamics" was proposed for use by the American thermodynamicist and physical chemist J. Willard Gibbs in 1902. According to Gibbs, the term "statistical", in the context of mechanics, i.e. statistical mechanics, was first used by the Scottish physicist James Clerk Maxwell in 1871. "Probabilistic mechanics" might today seem a more appropriate term, but "statistical mechanics" is firmly entrenched.[2]

Overview

The essential problem in statistical thermodynamics is to calculate the distribution of a given amount of energy E over N identical systems.[3] The goal of statistical thermodynamics is to understand and to interpret the measurable macroscopic properties of materials in terms of the properties of their constituent particles and the interactions between them. This is done by connecting thermodynamic functions to quantum-mechanical equations. Two central quantities in statistical thermodynamics are the Boltzmann factor and the partition function.

Fundamentals


Central topics covered in statistical thermodynamics include microstates and their counting, statistical ensembles, and partition functions, all discussed below. Most importantly, the formal definition of the entropy of a thermodynamic system from a statistical perspective is called the statistical entropy, and is defined as:

S = k_B \ln \Omega

where kB is the Boltzmann constant, 1.380649×10−23 J K−1, and Ω is the number of microstates corresponding to the observed thermodynamic macrostate.

This equation is valid only if each microstate is equally accessible (each microstate has an equal probability of occurring).
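As a small numerical illustration (a sketch, not from the source; the choice of system is hypothetical), consider N independent two-state units, for which Ω = 2^N and hence S = N k_B ln 2:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# N independent two-state units have Omega = 2**N equally accessible
# microstates, so S = k_B ln(Omega) = N k_B ln(2).
N = 100
omega = 2 ** N
print(K_B * math.log(omega))    # ~9.57e-22 J/K
print(N * K_B * math.log(2.0))  # the same value, computed directly
```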

Boltzmann distribution

If the system is large, the Boltzmann distribution may be used (the Boltzmann distribution is an approximate result):

n_i \propto \exp\left(-\frac {U_i}{k_B T}\right), \,

where ni stands for the number of particles occupying level i, or the number of feasible microstates corresponding to macrostate i; Ui stands for the energy of level i; T stands for the temperature; and kB is the Boltzmann constant.

If N is the total number of particles or states, the distribution of probability densities follows:

\rho _i \equiv \frac {n_i}{N} = \frac {\exp\left(-\frac {U_i}{k_B T}\right)} { \sum_j \exp\left(-\frac {U_j}{k_B T}\right)},

where the sum in the denominator is over all levels.
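As a concrete sketch (the energy levels below are assumed values, not from the source), the probability densities ρi can be computed directly from this formula:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probabilities(energies_joule, temperature_kelvin):
    """Return rho_i = exp(-U_i/kT) / sum_j exp(-U_j/kT) for each level."""
    beta = 1.0 / (K_B * temperature_kelvin)
    # Shift by the minimum energy before exponentiating, for numerical
    # stability; the shift cancels between numerator and denominator.
    u0 = min(energies_joule)
    weights = [math.exp(-beta * (u - u0)) for u in energies_joule]
    z = sum(weights)
    return [w / z for w in weights]

# Hypothetical three-level system with spacings of order kT at 300 K.
levels = [0.0, 2.0e-21, 5.0e-21]  # energies in joules (assumed)
print(boltzmann_probabilities(levels, 300.0))  # ~[0.52, 0.32, 0.16]
```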

History

In 1738, Swiss physicist and mathematician Daniel Bernoulli published Hydrodynamica which laid the basis for the kinetic theory of gases. In this work, Bernoulli posited the argument, still used to this day, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface causes the gas pressure that we feel, and that what we experience as heat is simply the kinetic energy of their motion.

In 1859, after reading a paper on the diffusion of molecules by Rudolf Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics.[4] Five years later, in 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell’s paper and was so inspired by it that he spent much of his life developing the subject further.

Hence, the foundations of statistical thermodynamics were laid down in the late 1800s by those such as Maxwell, Boltzmann, Max Planck, Clausius, and Josiah Willard Gibbs who began to apply statistical and quantum atomic theory to ideal gas bodies. Predominantly, however, it was Maxwell and Boltzmann, working independently, who reached similar conclusions as to the statistical nature of gaseous bodies. Yet, one must consider Boltzmann to be the "father" of statistical thermodynamics with his 1875 derivation of the relationship between entropy S and multiplicity Ω, the number of microscopic arrangements (microstates) producing the same macroscopic state (macrostate) for a particular system.[5]

Fundamental postulate

The fundamental postulate in statistical mechanics (also known as the equal a priori probability postulate) is the following:

Given an isolated system in equilibrium, it is found with equal probability in each of its accessible microstates.

This postulate is a fundamental assumption in statistical mechanics: it states that a system in equilibrium does not have any preference for any of its available microstates. Given Ω microstates at a particular energy, the probability of finding the system in a particular microstate is p = 1/Ω.

This postulate is necessary because it allows one to conclude that for a system at equilibrium, the thermodynamic state (macrostate) that could result from the largest number of microstates is also the most probable macrostate of the system.

The postulate is justified in part, for classical systems, by Liouville's theorem (Hamiltonian), which shows that if the distribution of system points through accessible phase space is uniform at some time, it remains so at later times.

Similar justification for a discrete system is provided by the mechanism of detailed balance.

This allows for the definition of the information function (in the context of information theory):

I = - \sum_i \rho_i \ln\rho_i = \langle -\ln \rho \rangle.

When all the probabilities ρi are equal, I is maximal, and we have minimal information about the system. When our information is maximal (i.e., one ρi is equal to one and the rest are zero, so that we know which state the system is in), the function is minimal.

This information function is the same as the reduced entropic function in thermodynamics.
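A short sketch (the probability vectors are assumed, purely for illustration) makes the two extremes concrete:

```python
import math

def information(probabilities):
    """I = -sum_i rho_i ln(rho_i), with the convention 0*ln(0) = 0."""
    return -sum(p * math.log(p) for p in probabilities if p > 0.0)

uniform = [0.25, 0.25, 0.25, 0.25]  # equal probabilities: I is maximal
certain = [1.0, 0.0, 0.0, 0.0]      # the state is known: I is minimal
print(information(uniform))  # ln(4) ~ 1.386
print(information(certain))  # 0.0
```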

Mark Srednicki has argued that for quantum mechanical systems, the fundamental postulate can be derived, in the semi-classical limit, assuming only that Berry's conjecture (named after Michael Berry) applies to the system in question.[6][7] Berry's conjecture is believed to hold only for systems with a chaotic classical limit, and roughly says that the energy eigenstates are distributed as Gaussian random variables. Since all realistic systems with more than a handful of degrees of freedom are expected to be chaotic, this puts the fundamental postulate on firm footing. Berry's conjecture has also been shown to be equivalent to an information theoretic principle of least bias.[8] The generalization of this idea beyond the semi-classical limit is the basis for the Eigenstate thermalization hypothesis,[9] which purports to explain when and why an isolated quantum mechanical system can be accurately described using equilibrium statistical mechanics.

Statistical ensembles

The modern formulation of statistical mechanics is based on the description of the physical system by an ensemble that represents all possible configurations of the system and the probability of realizing each configuration.

Each ensemble is associated with a partition function that, with mathematical manipulation, can be used to extract values of thermodynamic properties of the system. Depending on the relationship of the system to the rest of the universe, one of three general types of ensembles may apply, in order of increasing complexity:

  • Microcanonical ensemble: describes a completely isolated system, having constant energy, as it does not exchange energy or mass with the rest of the universe.
  • Canonical ensemble: describes a system in thermal equilibrium with its environment. It may only exchange energy in the form of heat with the outside.
  • Grand canonical ensemble: describes an open system, which may exchange both energy and mass with the outside.
Summary of ensembles:

  • Microcanonical ensemble: fixed variables E, N, V; microscopic feature: the number of microstates Ω; macroscopic function: S = k_B \ln \Omega.
  • Canonical ensemble: fixed variables T, N, V; microscopic feature: the canonical partition function Z = \sum_k e^{-\beta E_k}; macroscopic function: F = -k_B T \ln Z.
  • Grand canonical ensemble: fixed variables T, μ, V; microscopic feature: the grand canonical partition function \Xi = \sum_k e^{-\beta (E_k - \mu N_k)}; macroscopic function: F - \mu N = -k_B T \ln \Xi.

Microcanonical ensemble

In the microcanonical ensemble, the number of particles, the system's volume, and the system's energy (N, V and E) are fixed. Since the second law of thermodynamics applies to isolated systems, the first case investigated is the microcanonical ensemble, which describes an isolated system.

The entropy of such a system can only increase, so that the maximum of its entropy corresponds to an equilibrium state for the system.

Because an isolated system keeps a constant energy, the total energy of the system does not fluctuate. Thus, the system can access only those of its microstates that correspond to a given value E of the energy. The internal energy of the system is then strictly equal to its energy.

Let Ω(E) be the number of microstates corresponding to this value of the system's energy. The macroscopic state of maximal entropy for the system is the one in which all microstates are equally likely to occur, with probability 1/Ω(E), during the system's fluctuations, and we have for the system's entropy:

S=-k_B\sum_{i=1}^{\Omega (E)} \left[ {1\over{\Omega (E)}} \ln{1\over{\Omega (E)}} \right ] =k_B\ln \left(\Omega (E) \right)

Canonical ensemble

Main article: Canonical ensemble

In the canonical ensemble, the number of particles, the system's volume, and the system's temperature (N, V and T) are fixed. Invoking the concept of the canonical ensemble, it is possible to derive the probability Pi that a macroscopic system in thermal equilibrium with its environment will be in a given microstate with energy Ei, according to the Boltzmann distribution:

P_i = {e^{-\beta E_i}\over{\sum_{j=1}^{j_{\rm max}}e^{-\beta E_j}}}

using the useful definition \beta={1\over{k_B T}}, known as the thermodynamic beta or inverse temperature.

The temperature T arises from the fact that the system is in thermal equilibrium with its environment. The probabilities of the various microstates must add to one, and the normalization factor in the denominator is the canonical partition function:

Z = \sum_{i=1}^{i_{\rm max}} e^{-\beta E_i}

where Ei is the energy of the ith microstate of the system. The partition function is a measure of the number of states accessible to the system at a given temperature. The article canonical ensemble contains a derivation of Boltzmann's factor and the form of the partition function from first principles.

To sum up, the probability of finding a system at temperature T in a particular state with energy Ei is

P_i = \frac{e^{-\beta E_i}}{Z}.

The partition function thus plays the role of a normalizing weight factor for the ensemble.

Thermodynamic connection

The partition function can be used to find the expected (average) value of any microscopic property of the system, which can then be related to macroscopic variables. For instance, the expected value of the microscopic energy E is interpreted as the microscopic definition of the thermodynamic variable internal energy U, and can be obtained by differentiating the partition function with respect to the inverse temperature β. Indeed,

\langle E\rangle={\sum_i E_i e^{-\beta E_i}\over Z}=-{1 \over Z} {dZ \over d\beta}

implies, together with the interpretation of \langle E\rangle as U, the following microscopic definition of internal energy:

U\colon = -{d\ln Z\over d \beta}.
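This relation is easy to check numerically. The sketch below (an assumed three-level spectrum, in units where k_B = 1) compares a central finite difference of ln Z with the ensemble average ⟨E⟩:

```python
import math

def ln_Z(beta, energies):
    """Logarithm of the canonical partition function Z = sum_i exp(-beta*E_i)."""
    return math.log(sum(math.exp(-beta * e) for e in energies))

def mean_energy(beta, energies):
    """Ensemble average <E> = sum_i E_i exp(-beta*E_i) / Z."""
    z = sum(math.exp(-beta * e) for e in energies)
    return sum(e * math.exp(-beta * e) for e in energies) / z

energies = [0.0, 1.0, 2.5]  # assumed microstate energies (k_B = 1)
beta = 0.7
h = 1e-6  # step for the central finite difference
u = -(ln_Z(beta + h, energies) - ln_Z(beta - h, energies)) / (2.0 * h)
print(u)                            # ~0.5574
print(mean_energy(beta, energies))  # agrees to ~1e-9
```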

The entropy can be calculated by (see Shannon entropy)

{S\over k} = - \sum_i p_i \ln p_i = \sum_i {e^{-\beta E_i}\over Z}(\beta E_i+\ln Z) = \ln Z + \beta U

which implies that

-\frac{\ln(Z)}{\beta} = U - TS = F

is the Helmholtz free energy of the system or in other words,

Z=e^{-\beta F}.\,

Having microscopic expressions for the basic thermodynamic potentials U (internal energy), S (entropy) and F (free energy) is sufficient to derive expressions for other thermodynamic quantities. The basic strategy is as follows. There may be an intensive or extensive quantity that enters explicitly in the expression for the microscopic energy Ei, for instance magnetic field (intensive) or volume (extensive). Then, the conjugate thermodynamic variables are derivatives of the internal energy. The macroscopic magnetization (extensive) is the derivative of U with respect to the (intensive) magnetic field, and the pressure (intensive) is the derivative of U with respect to volume (extensive).

The treatment in this section assumes no exchange of matter (i.e. fixed mass and fixed particle numbers). However, the volume of the system is variable, which means the density is also variable.

This probability can be used to find the average value, which corresponds to the macroscopic value, of any property, J, that depends on the energetic state of the system by using the formula:

\langle J \rangle = \sum_i p_i J_i = \sum_i J_i \frac{e^{-\beta E_i}}{Z}

where \langle J \rangle is the average value of property J. This equation can be applied to the internal energy, U:

U = \sum_i E_i \frac{e^{-\beta E_i}}{Z}.

Subsequently, these equations can be combined with known thermodynamic relationships between U and V to arrive at an expression for pressure in terms of only temperature, volume and the partition function. Similar relationships in terms of the partition function can be derived for other thermodynamic properties, as shown in the following table:[note 2]

Helmholtz free energy: F = - {\ln Z\over \beta}
Internal energy: U = -\left( \frac{\partial\ln Z}{\partial\beta} \right)_{N,V}
Pressure: P = -\left({\partial F\over \partial V}\right)_{N,T}= {1\over \beta} \left( \frac{\partial \ln Z}{\partial V} \right)_{N,T}
Entropy: S = k (\ln Z + \beta U)\,
Gibbs free energy: G = F+PV=-{\ln Z\over \beta} + {V\over \beta} \left( \frac{\partial \ln Z}{\partial V}\right)_{N,T}
Enthalpy: H = U + PV\,
Constant volume heat capacity: C_V = \left( \frac{\partial U}{\partial T} \right)_{N,V}
Constant pressure heat capacity: C_P = \left( \frac{\partial H}{\partial T} \right)_{N,P}
Chemical potential: \mu_i = -{1\over \beta} \left( \frac{\partial \ln Z}{\partial N_i} \right)_{T,V,N_{j\neq i}}

To clarify, this is still the canonical ensemble, not a grand canonical ensemble.
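As a numerical check on the canonical-ensemble rows above (a sketch, with k_B set to 1 and an assumed two-level spectrum), the formulas can be evaluated directly; note that F = U - TS then holds by construction:

```python
import math

K_B = 1.0  # work in units where k_B = 1

def thermo_from_Z(energies, T, dT=1e-5):
    """Evaluate F, U, S and C_V for a discrete spectrum from the table's formulas."""
    def ln_Z(temp):
        beta = 1.0 / (K_B * temp)
        return math.log(sum(math.exp(-beta * e) for e in energies))
    def U(temp):
        beta = 1.0 / (K_B * temp)
        z = sum(math.exp(-beta * e) for e in energies)
        return sum(e * math.exp(-beta * e) for e in energies) / z
    beta = 1.0 / (K_B * T)
    F = -ln_Z(T) / beta                      # F = -ln Z / beta
    u = U(T)
    S = K_B * (ln_Z(T) + beta * u)           # S = k (ln Z + beta U)
    Cv = (U(T + dT) - U(T - dT)) / (2 * dT)  # C_V = (dU/dT)_{N,V}
    return F, u, S, Cv

# Hypothetical two-level system with unit gap, evaluated at T = 1:
# F ~ -0.313, U ~ 0.269, S ~ 0.582, C_V ~ 0.197.
print(thermo_from_Z([0.0, 1.0], T=1.0))
```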

It is often useful to consider the energy of a given molecule to be distributed among a number of modes. For example, translational energy refers to that portion of energy associated with the motion of the center of mass of the molecule. Configurational energy refers to that portion of energy associated with the various attractive and repulsive forces between molecules in a system. The other modes are all considered to be internal to each molecule. They include rotational, vibrational, electronic and nuclear modes. If we assume that each mode is independent (a questionable assumption) the total energy can be expressed as the sum of each of the components:

E = E_t + E_c + E_n + E_e + E_r + E_v,\,

where the subscripts t, c, n, e, r, and v correspond to translational, configurational, nuclear, electronic, rotational and vibrational modes, respectively. The relationship in this equation can be substituted into the very first equation to give:

Z = \sum_i e^{-\beta(E_{ti} + E_{ci} + E_{ni} + E_{ei} + E_{ri} + E_{vi})} = \sum_i e^{-\beta E_{ti}} e^{-\beta E_{ci}} e^{-\beta E_{ni}} e^{-\beta E_{ei}} e^{-\beta E_{ri}} e^{-\beta E_{vi}}.

If we can assume all these modes are completely uncoupled and uncorrelated, so that all these factors are statistically independent, then

Z = Z_t Z_c Z_n Z_e Z_r Z_v.\,

Thus a partition function can be defined for each mode. Simple expressions have been derived relating each of the various modes to various measurable molecular properties, such as the characteristic rotational or vibrational frequencies.

Expressions for the various molecular partition functions are shown in the following table.

Nuclear: Z_n = 1 \qquad (T < 10^8 \,\mathrm{K})
Electronic: Z_e = W_0 e^{D_e / kT} + W_1 e^{-\theta_{e1}/T} + \cdots
Vibrational: Z_v = \prod_j \frac{e^{-\theta_{vj} / 2T}}{1-e^{-\theta_{vj} / T}}
Rotational (linear): Z_r = \frac{T}{\sigma\theta_r}
Rotational (non-linear): Z_r = \frac{\sqrt{\pi}}{\sigma}\sqrt{\frac{T^3}{\theta_A \theta_B \theta_C}}
Translational: Z_t = \frac{(2 \pi mkT)^{3/2}}{h^3}
Configurational (ideal gas): Z_c = V
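To make the table concrete, here is a sketch evaluating three of these factors with illustrative, approximate constants for N2 (θ_v ≈ 3374 K, θ_r ≈ 2.88 K, σ = 2, m ≈ 28 u; these values are assumptions, not from the source):

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J*s
AMU = 1.66053907e-27  # atomic mass unit, kg

def z_translational_per_volume(mass_kg, T):
    """Z_t = (2 pi m k T)^(3/2) / h^3, per unit volume (units of m^-3)."""
    return (2.0 * math.pi * mass_kg * K_B * T) ** 1.5 / H ** 3

def z_rotational_linear(T, theta_r, sigma):
    """High-temperature form Z_r = T / (sigma * theta_r) for a linear molecule."""
    return T / (sigma * theta_r)

def z_vibrational(T, theta_v):
    """Single-mode Z_v = exp(-theta_v/2T) / (1 - exp(-theta_v/T))."""
    return math.exp(-theta_v / (2.0 * T)) / (1.0 - math.exp(-theta_v / T))

T = 300.0
print(z_translational_per_volume(28.0 * AMU, T))  # ~1.4e32 m^-3
print(z_rotational_linear(T, 2.88, 2))            # ~52
print(z_vibrational(T, 3374.0))                   # ~3.6e-3 (mostly ground state)
```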

These equations can be combined with those in the first table to determine the contribution of a particular energy mode to a thermodynamic property. For example, the "rotational pressure" could be determined in this manner. The total pressure could be found by summing the pressure contributions from all of the individual modes, i.e.:

P = P_t + P_c + P_n + P_e + P_r + P_v.\,

Grand canonical ensemble

In the grand canonical ensemble, the system's volume, temperature and chemical potential (V, T and μ) are fixed. If the system under study is an open system in which matter can be exchanged with the environment, so that particle number is not conserved, we must introduce chemical potentials, μj, j = 1,...,n, and replace the canonical partition function with the grand canonical partition function:

\Xi(V,T,\mu) = \sum_i \exp\left[\beta \left(\sum_{j=1}^n \mu_j N_{ij}-E_i\right )\right]

where Nij is the number of particles of the jth species in the ith configuration. Sometimes we also have other variables to add to the partition function, one corresponding to each conserved quantity. Most of them, however, can be safely interpreted as chemical potentials. In most condensed matter systems, things are nonrelativistic and mass is conserved. Most condensed matter systems of interest also conserve particle number approximately (metastably), and the mass (nonrelativistically) is none other than the sum of the number of each type of particle times its mass. Mass is tied to density, which is the conjugate variable to pressure. For the rest of this article, we will ignore this complication and pretend chemical potentials don't matter.

Let's rework everything using a grand canonical ensemble this time. The volume is left fixed and does not enter this treatment. As before, j is the index over particle species and i is the index over microstates:

U = \sum_i E_i \frac{\exp(-\beta (E_i-\sum_j \mu_j N_{ij}))}{\Xi}
N_j = \sum_i N_{ij} \frac{\exp(-\beta (E_i-\sum_j \mu_j N_{ij}))}{\Xi}.
Grand potential: \Phi_{G} = - {\ln \Xi\over \beta}
Internal energy: U = -\left( \frac{\partial\ln \Xi}{\partial\beta} \right)_{\mu}+\sum_i{\mu_i\over\beta}\left({\partial \ln \Xi\over \partial \mu_i}\right )_{\beta}
Particle number: N_i={1\over\beta}\left({\partial \ln \Xi\over \partial \mu_i}\right)_\beta
Entropy: S = k (\ln \Xi + \beta U- \beta \sum_i \mu_i N_i)\,
Helmholtz free energy: F = \Phi_{G}+\sum_i \mu_i N_i=-{\ln \Xi\over \beta} +\sum_i{\mu_i\over \beta} \left( \frac{\partial \ln \Xi}{\partial \mu_i}\right)_{\beta}
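As a minimal illustration of these formulas (a sketch, not from the source), consider a single orbital that can hold 0 or 1 particle, so the grand canonical sum has only two terms:

```python
import math

def grand_canonical_single_orbital(epsilon, mu, beta):
    """Xi, <N> and <E> for one level holding 0 or 1 particle (a fermionic orbital).

    The configurations are (N=0, E=0) and (N=1, E=epsilon), so
    Xi = 1 + exp(-beta * (epsilon - mu)).
    """
    w = math.exp(-beta * (epsilon - mu))
    xi = 1.0 + w
    n_avg = w / xi  # the Fermi-Dirac occupation 1 / (exp(beta*(eps-mu)) + 1)
    e_avg = epsilon * w / xi
    return xi, n_avg, e_avg

# Assumed values, in units where k_B = 1.
print(grand_canonical_single_orbital(epsilon=1.0, mu=0.5, beta=2.0))
# <N> = 1 / (e^1 + 1) ~ 0.269
```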

Equivalence between descriptions at the thermodynamic limit

All of the above descriptions differ in the way they allow the given system to fluctuate between its configurations.

In the microcanonical ensemble, the system exchanges no energy with the outside world, and is therefore not subject to energy fluctuations; in the canonical ensemble, the system is free to exchange energy with the outside in the form of heat.

In the thermodynamic limit, which is the limit of large systems, fluctuations become negligible, so that all these descriptions converge to the same description. In other words, the macroscopic behavior of a system does not depend on the particular ensemble used for its description.

Given these considerations, the best ensemble to choose for the calculation of the properties of a macroscopic system is that ensemble which allows the result to be derived most easily.

Classical thermodynamics vs. statistical thermodynamics

As an example, from a classical thermodynamics point of view one might ask what it is about a thermodynamic system of gas molecules, such as ammonia NH3, that determines the free energy characteristic of that compound. Classical thermodynamics does not provide the answer. If, for example, we were given spectroscopic data for this body of gas molecules, such as bond length, bond angle, bond rotation, and flexibility of the bonds in NH3, we should see that the free energy could not be other than it is. To prove this true, we need to bridge the gap between the microscopic realm of atoms and molecules and the macroscopic realm of classical thermodynamics. From physics, statistical mechanics provides such a bridge by teaching us how to conceive of a thermodynamic system as an assembly of units. More specifically, it demonstrates how the thermodynamic parameters of a system, such as temperature and pressure, are interpretable in terms of the parameters descriptive of its constituent atoms and molecules.[10]

In a bounded system, the crucial characteristic of these microscopic units is that their energies are quantized. That is, where the energies accessible to a macroscopic system form a virtual continuum of possibilities, the energies open to any of its submicroscopic components are limited to a discontinuous set of alternatives associated with integral values of some quantum number.

See also

A Table of Statistical Mechanics Articles

  • Common: partition function, statistical properties, microcanonical ensemble, canonical ensemble, grand canonical ensemble
  • Maxwell-Boltzmann: Maxwell-Boltzmann statistics, Maxwell-Boltzmann distribution, Boltzmann distribution, Gibbs paradox; gas: ideal gas
  • Bose-Einstein (particle: boson): Bose-Einstein statistics; gases: Bose gas, Debye model, Bose-Einstein condensate, Planck's law of black body radiation
  • Fermi-Dirac (particle: fermion): Fermi-Dirac statistics, Thomas-Fermi approximation; gases: Fermi gas, fermion condensate
  • Quantum gas models: gas in a box, gas in a harmonic trap
  • Chemical equilibrium: classical chemical equilibrium

Notes

References

External links

  • Stanford Encyclopedia of Philosophy.
  • Sklogwiki - Thermodynamics, statistical mechanics, and the computer simulation of materials. SklogWiki is particularly orientated towards liquids and soft condensed matter.
  • Statistical Thermodynamics - Historical Timeline
  • Thermodynamics and Statistical Mechanics by Richard Fitzpatrick
  • Lecture Notes in Statistical Mechanics and Mesoscopics by Doron Cohen
