Third law of thermodynamics

The third law of thermodynamics is sometimes stated as follows, regarding the properties of systems in equilibrium at absolute zero temperature:

The entropy of a perfect crystal, at absolute zero (zero kelvins), is exactly equal to zero.

At zero kelvin the system must be in a state with the minimum possible energy, and this statement of the third law holds true if the perfect crystal has only one minimum-energy state. Entropy is related to the number of possible microstates, and for a system containing a certain collection of particles, quantum mechanics indicates that there is only one unique state (called the ground state) with minimum energy.[1] If the system does not have a well-defined order (if its order is glassy, for example), then in practice some finite entropy will remain even as the system is brought to very low temperatures, because the system becomes locked into a configuration with non-minimal energy. The constant value is called the residual entropy of the system.[2]

The Nernst–Simon statement of the third law of thermodynamics is in regard to thermodynamic processes, and whether it is possible to achieve absolute zero in practice:

The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as temperature approaches 0 K, where condensed system refers to liquids and solids.

A simpler formulation of the Nernst–Simon statement might be:

It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.

Physically, the Nernst–Simon statement implies that it is impossible for any procedure to bring a system to the absolute zero of temperature in a finite number of steps.[3]


The third law was developed by the chemist Walther Nernst during the years 1906–12, and is therefore often referred to as Nernst's theorem or Nernst's postulate. The third law of thermodynamics states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.

In 1912 Nernst stated the law thus: "It is impossible for any procedure to lead to the isotherm T = 0 in a finite number of steps."[4]

An alternative version of the third law of thermodynamics as stated by Gilbert N. Lewis and Merle Randall in 1923:

If the entropy of each element in some (perfect) crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.

This version states not only that ΔS will reach zero at 0 K, but that S itself will also reach zero, as long as the crystal has a ground state with only one configuration. Some crystals form defects which cause a residual entropy. This residual entropy disappears when the kinetic barriers to transitioning to one ground state are overcome.[5]

With the development of statistical mechanics, the third law of thermodynamics (like the other laws) changed from a fundamental law (justified by experiments) to a derived law (derived from even more basic laws). The basic law from which it is primarily derived is the statistical-mechanics definition of entropy for a large system:

S - S_0 = k_B \ln \Omega

where S is entropy, kB is the Boltzmann constant, and \Omega is the number of microstates consistent with the macroscopic configuration. The counting of states is from the reference state of absolute zero, whose entropy is S0.


In simple terms, the third law states that the entropy of a perfect crystal of a pure substance approaches zero as the absolute temperature approaches zero. The alignment of a perfect crystal leaves no ambiguity as to the position of the components of the system, and the orientation of each part of the crystal is identical. As the energy of the crystal is reduced, the vibrations of the individual atoms are reduced to nothing, at which point no part of the crystal is unique. This law provides an absolute reference point for the determination of entropy at any other temperature. Any increase in the entropy of a system, determined relative to this zero point, is the absolute entropy of that system. Mathematically, the absolute entropy of any system at zero temperature is the natural logarithm of the number of ground states times Boltzmann's constant kB = 1.38×10⁻²³ J K⁻¹.

The entropy of a perfect crystal lattice as defined by Nernst's theorem is zero provided that its ground state is unique, because ln(1) = 0. If the system is composed of one billion atoms, all alike, lying within the matrix of a perfect crystal, the number of permutations of one billion identical things taken one billion at a time is Ω = 1. Hence:

S - S_0 = k_B \ln\Omega = k_B\ln{1} = 0

The difference is zero; hence the initial entropy S0 can be any selected value, so long as all other such calculations include that value as the initial entropy. As a result, the initial entropy value S0 = 0 is selected for convenience.

S - S_0 = S - 0 = 0
S = 0

By way of example, suppose a system consists of 1 cm³ of matter with a mass of 1 g and a molar mass of 20 g/mol. The system consists of 3×10²² identical atoms at 0 K. If one atom absorbs a photon of wavelength 1 cm, that atom becomes unique, and the number of permutations of one unique atom among the 3×10²² is N = 3×10²². The entropy, energy, and temperature of the system rise and can be calculated. The entropy change is:

\Delta S = S - S_{0} = k_{B} \ln{\Omega}

From the second law of thermodynamics:

\Delta S = S - S_0 = \frac{\delta Q}{T}


\Delta S = S - S_0 = k_B \ln \Omega = \frac{\delta Q}{T}

Calculating entropy change:

S - 0 = k_B \ln N = 1.38 \times 10^{-23} \times \ln\left(3 \times 10^{22}\right) = 70 \times 10^{-23} \,\mathrm{J/K}

The energy change of the system as a result of absorbing the single photon whose energy is ε:

\delta Q = \epsilon = \frac{hc}{\lambda} = \frac{6.62 \times 10^{-34}\,\mathrm{J \cdot s} \times 3 \times 10^{8}\,\mathrm{m/s}}{0.01\,\mathrm{m}} = 2 \times 10^{-23}\,\mathrm{J}

The temperature of the system rises by:

T = \frac{\epsilon}{\Delta S} = \frac{2 \times 10^{-23}J}{70 \times 10^{-23}J/K} = \frac{1}{35} K

This can be interpreted as the average temperature of the system over the range 0 < S < 70×10⁻²³ J/K.[6] Although a single atom was assumed to absorb the photon, the temperature and entropy change characterize the entire system.
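The arithmetic of this example can be verified with a short script. This is a sketch using CODATA constant values; the numbers in the text above are rounded:

```python
import math

# Physical constants (CODATA values)
k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

N = 3e22             # number of atoms in the 1 g sample
wavelength = 0.01    # photon wavelength, m (1 cm)

# Entropy change: one excited atom among N gives Omega = N microstates
delta_S = k_B * math.log(N)    # ~ 70e-23 J/K

# Energy absorbed: a single photon of wavelength 1 cm
delta_Q = h * c / wavelength   # ~ 2e-23 J

# Average temperature over the entropy change (second law, Delta S = dQ/T)
T = delta_Q / delta_S          # ~ 1/35 K

print(f"Delta S = {delta_S:.3g} J/K")
print(f"delta Q = {delta_Q:.3g} J")
print(f"T       = {T:.3g} K")
```

With unrounded constants the temperature comes out near 0.028 K, consistent with the 1/35 K quoted above.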

An example of a system which does not have a unique ground state is one whose net spin is a half-integer, for which time-reversal symmetry gives two degenerate ground states. For such systems, the entropy at zero temperature is at least kB ln(2) (which is negligible on a macroscopic scale). Some crystalline systems exhibit geometrical frustration, where the structure of the crystal lattice prevents the emergence of a unique ground state. Ground-state helium (unless under pressure) remains liquid.
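To see why kB ln 2 is negligible on a macroscopic scale, it can be compared with the gas constant R = NA kB, which sets the scale of ordinary molar entropies. The comparison scale is a choice made here for illustration:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro constant, 1/mol

# A twofold-degenerate ground state of the whole system contributes
# a residual entropy of k_B ln 2 in total.
S_residual = k_B * math.log(2)

# Ordinary molar entropies are of order R = N_A * k_B.
R = N_A * k_B
print(f"k_B ln 2 = {S_residual:.3g} J/K")
print(f"R        = {R:.3g} J/(K mol)")
print(f"ratio    = {S_residual / R:.3g}")  # ~1e-24: utterly negligible
```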

In addition, glasses and solid solutions retain large entropy at 0 K, because they are large collections of nearly degenerate states, in which they become trapped out of equilibrium. Another example of a solid with many nearly-degenerate ground states, trapped out of equilibrium, is ice Ih, which has "proton disorder".

For the entropy at absolute zero to be zero, the magnetic moments of a perfectly ordered crystal must themselves be perfectly ordered; from an entropic perspective, this can be considered to be part of the definition of a "perfect crystal". Only ferromagnetic, antiferromagnetic, and diamagnetic materials can satisfy this condition. Materials that remain paramagnetic at 0 K, by contrast, may have many nearly-degenerate ground states (for example, in a spin glass), or may retain dynamic disorder (a quantum spin liquid).

Mathematical formulation

Consider a closed system in internal equilibrium. Since the system is in equilibrium there are no irreversible processes, so the entropy production is zero. During the heat supply, temperature gradients are generated in the material, but the associated entropy production can be kept low enough if the heat is supplied slowly. The increase in entropy due to the added heat δQ is then given by the second part of the second law of thermodynamics, which states that the entropy change of a system is given by
\Delta S = \frac{\delta Q}{T}. (1)
The temperature rise dT due to the heat δQ is determined by the heat capacity C(T,X) according to
\delta Q=C(T,X) \mathrm{d}T. (2)
The parameter X is a symbolic notation for all parameters (such as pressure, magnetic field, liquid/solid fraction, etc.) which are kept constant during the heat supply. E.g. if the volume is constant we get the heat capacity at constant volume CV. In the case of a phase transition from liquid to solid, or from gas to liquid the parameter X can be one of the two components. Combining relations (1) and (2) gives
\mathrm{d}S = \frac{C(T,X) \mathrm{d}T}{T}. (3)
Integration of Eq.(3) from a reference temperature T0 to an arbitrary temperature T gives the entropy at temperature T
S(T,X) = S(T_0,X) + \int_{T_0}^{T} \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (4)

We now come to the mathematical formulation of the third law. There are three steps:

1. In the limit T0→0 the integral in Eq.(4) is finite, so that we may take T0 = 0 and write
S(T,X)=S(0,X) + \int_0^T \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (5)
2. The value of S(0,X) is independent of X. In mathematical form
S(0,X) = S(0). (6)
So Eq.(5) can be further simplified to
S(T,X)=S(0) + \int_0^T \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (7)
Equation (6) can also be formulated as
\lim_{T \rightarrow 0}\left( \frac{\partial S(T,X)}{\partial X}\right)_T = 0. (8)

In words: at absolute zero all isothermal processes are isentropic. Eq.(8) is the mathematical formulation of the third law.

3. As one is free to choose the zero of the entropy, it is convenient to take
S(0)=0 (9)
so that Eq.(7) reduces to the final form
S(T,X) = \int_0^T \frac {C(T^\prime,X)}{T^\prime}\mathrm{d}T^\prime. (10)

The physical meaning of Eq.(9) is deeper than just a convenient selection of the zero of the entropy. It is due to the perfect order at zero kelvin as explained before.

Consequences of the third law

Fig.1 Left side: Absolute zero can be reached in a finite number of steps if S(0,X1)≠S(0, X2). Right: An infinite number of steps is needed since S(0,X1)= S(0,X2).

Can absolute zero be obtained?

The third law is equivalent to the statement that

"It is impossible by any procedure, no matter how idealized, to reduce the temperature of any system to zero temperature in a finite number of finite operations".[7]

The reason that T=0 cannot be reached according to the third law is explained as follows: Suppose that the temperature of a substance can be reduced in an isentropic process by changing the parameter X from X2 to X1. One can think of a multistage nuclear demagnetization setup where a magnetic field is switched on and off in a controlled way.[8] If there were an entropy difference at absolute zero, T=0 could be reached in a finite number of steps. However, at T=0 there is no entropy difference so an infinite number of steps would be needed. The process is illustrated in Fig.1.
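The same argument can be sketched numerically with a toy model. The linear entropy curves below are invented for illustration; what matters is that both satisfy the third law, S(0, X1) = S(0, X2) = 0, so each cooling cycle only shrinks the temperature geometrically:

```python
# Toy model of multistage isentropic cooling (e.g. nuclear demagnetization).
# Assumed entropy curves: S(T, X) = a * T, with a different slope for each
# parameter value X; both vanish at T = 0 as the third law requires.
a1, a2 = 2.0, 1.0    # entropy slopes for the two parameter values X1, X2

T = 1.0              # starting temperature, arbitrary units
for step in range(1, 11):
    # Isothermal step: switch X1 -> X2 at fixed T, dumping entropy (a1-a2)*T.
    # Isentropic step: switch back X2 -> X1 at fixed S, so a1*T_new = a2*T.
    T = (a2 / a1) * T
    print(f"after step {step}: T = {T:.6f}")

# T halves each cycle but never reaches zero: absolute zero would require
# infinitely many steps, which is the unattainability statement.
```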

Specific heat

A non-quantitative description of his third law, which Nernst gave at the very beginning, was simply that the specific heat can always be made zero by cooling the material down far enough.[9] A modern, quantitative analysis follows. Suppose that the heat capacity of a sample in the low-temperature region can be approximated by C(T,X) = C0T^α; then
\int_{T_0}^T \frac {C(T^\prime,X)}{T^\prime}dT^\prime = \frac {C_0}{ \alpha}(T^{ \alpha}-T_0^{ \alpha}). (11)
The integral is finite for T0→0 if α>0. So the heat capacity of all substances must go to zero at absolute zero
\lim_{T \rightarrow 0}C(T,X)=0. (12)
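As a sanity check, the integral in Eq.(11) can be evaluated numerically and compared with the closed form. C0, α, and the temperatures are arbitrary test values (α = 3 mimics the Debye law for phonons):

```python
# Numerical check of Eq.(11): for C(T) = C0 * T**alpha with alpha > 0,
# the entropy integral from T0 to T stays finite as T0 -> 0.
C0, alpha = 1.0, 3.0
T0, T = 1e-6, 2.0

# Midpoint-rule quadrature of the integral of C(T')/T' dT' over [T0, T]
n = 100_000
width = (T - T0) / n
integral = 0.0
for i in range(n):
    t = T0 + (i + 0.5) * width
    integral += (C0 * t**alpha) / t * width

closed_form = (C0 / alpha) * (T**alpha - T0**alpha)
print(integral, closed_form)   # the two agree to high accuracy
```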
The molar specific heat at constant volume of a monatomic classical ideal gas, such as helium at room temperature, is given by CV=(3/2)R with R the molar ideal gas constant. Substitution in Eq.(4) gives
S(T,V) = S(T_0,V) + \frac{3}{2}R \ln \frac{T}{T_0}. (13)

In the limit T0→0 this expression diverges. Clearly a constant heat capacity does not satisfy Eq.(12). This means that a gas with a constant heat capacity all the way to absolute zero violates the third law of thermodynamics.
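The divergence can be made concrete numerically; R and the temperatures below are illustrative choices:

```python
import math

# Eq.(13): with a constant C_V = (3/2) R, the entropy difference
# S(T) - S(T0) = (3/2) R ln(T / T0) grows without bound as T0 -> 0,
# which is how a classical ideal gas violates the third law.
R = 8.314462618   # molar gas constant, J/(K mol)
T = 300.0         # fixed upper temperature, K

for T0 in (1.0, 1e-3, 1e-6, 1e-9):
    dS = 1.5 * R * math.log(T / T0)
    print(f"T0 = {T0:.0e} K  ->  S(T) - S(T0) = {dS:.1f} J/(K mol)")
```

Each thousandfold reduction of T0 adds the same fixed increment to the entropy difference, so no finite value bounds it.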

The conflict is solved as follows: At a certain temperature the quantum nature of matter starts to dominate the behavior. Fermi particles follow Fermi–Dirac statistics and Bose particles follow Bose–Einstein statistics. In both cases the heat capacity at low temperatures is no longer temperature independent, even for ideal gases. For Fermi gases
C_V = \frac{ \pi^2}{2}R \frac{T}{T_F} (14)
with the Fermi temperature TF given by
T_F = \frac{1}{8 \pi^2}\frac{N_A^2h^2}{MR}\left( \frac{3\pi^2N_A}{V_m}\right)^{2/3}. (15)

Here NA is Avogadro's number, Vm the molar volume, and M the molar mass.

For Bose gases
C_V \approx 1.93\,R\left( \frac{T}{T_B}\right)^{3/2} (16)
with TB given by
T_B \approx \frac{1}{11.9}\frac{N_A^2h^2}{MR}\left( \frac{N_A}{V_m}\right)^{2/3}. (17)

The specific heats given by Eq.(14) and (16) both satisfy Eq.(12).

Vapor pressure

The only liquids near absolute zero are ³He and ⁴He. Their heat of evaporation has a limiting value given by
L=L_0+C_pT (18)

with L0 and Cp constant. If we consider a container partly filled with liquid and partly with gas, the entropy of the liquid–gas mixture is

S(T,x) = S_l(T) + x\left(\frac{L_0}{T} + C_p\right) (19)

where Sl(T) is the entropy of the liquid and x is the gas fraction. Clearly the entropy change during the liquid–gas transition (x from 0 to 1) diverges in the limit of T→0. This violates Eq.(8). Nature solves this paradox as follows: at temperatures below about 50 mK the vapor pressure is so low that the gas density is lower than the best vacuum in the universe. In other words: below 50 mK there is simply no gas above the liquid.

Latent heat of melting

The melting curves of ³He and ⁴He both extend down to absolute zero at finite pressure. At the melting pressure, liquid and solid are in equilibrium. The third law demands that the entropies of the solid and the liquid are equal at T = 0. As a result, the latent heat of melting is zero, and the slope of the melting curve extrapolates to zero via the Clausius–Clapeyron equation.
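The Clausius–Clapeyron step can be written out explicitly. Along the melting curve

\frac{\mathrm{d}p}{\mathrm{d}T} = \frac{S_{liquid} - S_{solid}}{V_{liquid} - V_{solid}}

and since the third law forces the entropy difference in the numerator to zero as T→0 while the volume difference between liquid and solid remains finite, the slope of the melting curve must itself vanish at absolute zero.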

Thermal expansion coefficient

The thermal expansion coefficient is defined as

\alpha_V = \frac{1}{V_m} \left(\frac{\partial V_m}{\partial T}\right)_{p}. (20)
With the Maxwell relation
\left(\frac{\partial V_m}{\partial T}\right)_{p} = -\left(\frac{\partial S_m}{\partial p}\right)_T (21)
and Eq.(8) with X = p, it follows that
\lim_{T \rightarrow 0}\alpha_V=0. (22)

So the thermal expansion coefficient of all materials must go to zero at zero kelvin.

References

  1. ^ J. Wilks The Third Law of Thermodynamics Oxford University Press (1961).
  2. ^ Kittel and Kroemer, Thermal Physics (2nd ed.), page 49.
  3. ^ Wilks, J. (1971). The Third Law of Thermodynamics, Chapter 6 in Thermodynamics, volume 1, ed. W. Jost, of H. Eyring, D. Henderson, W. Jost, Physical Chemistry. An Advanced Treatise, Academic Press, New York, page 477.
  4. ^ Bailyn, M. (1994). A Survey of Thermodynamics, American Institute of Physics, New York, ISBN 0-88318-797-3, page 342.
  5. ^ Kozliak, Evguenii; Lambert, Frank L. (2008). "Residual Entropy, the Third Law and Latent Heat". Entropy 10 (3): 274–84.
  6. ^ Reynolds and Perkins (1977). Engineering Thermodynamics. McGraw Hill. p. 438.
  7. ^ Guggenheim, E.A. (1967). Thermodynamics. An Advanced Treatment for Chemists and Physicists, fifth revised edition, North-Holland Publishing Company, Amsterdam, page 157.
  8. ^ F. Pobell, Matter and Methods at Low Temperatures, (Springer-Verlag, Berlin, 2007)
  9. ^ Einstein and the Quantum, A. Douglas Stone, Princeton University Press, 2013.
