Boltzmann distribution

Probability distribution of energy states of a system
This article is about system energy states. For particle energy levels and velocities, see Maxwell–Boltzmann distribution.

Boltzmann factor $p_i/p_j$ (vertical axis) as a function of temperature $T$ for several energy differences $\varepsilon_i - \varepsilon_j$.

In statistical mechanics and mathematics, a Boltzmann distribution (also called the Gibbs distribution[1]) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form
$$p_i \propto \exp\left(-\frac{\varepsilon_i}{k_\text{B}T}\right),$$
where $p_i$ is the probability of the system being in state $i$, $\exp$ is the exponential function, $\varepsilon_i$ is the energy of that state, and the constant $k_\text{B}T$ of the distribution is the product of the Boltzmann constant $k_\text{B}$ and the thermodynamic temperature $T$. The symbol $\propto$ denotes proportionality (see § The distribution for the proportionality constant).

The term system here has a wide meaning; it can range from a single atom or a collection of a "sufficient number" of atoms[1] to a macroscopic system such as a natural-gas storage tank. Therefore, the Boltzmann distribution can be used to solve a wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.

The ratio of probabilities of two states is known as the Boltzmann factor and characteristically only depends on the states' energy difference:
$$\frac{p_i}{p_j} = \exp\left(\frac{\varepsilon_j - \varepsilon_i}{k_\text{B}T}\right).$$

The Boltzmann distribution is named after Ludwig Boltzmann, who first formulated it in 1868 during his studies of the statistical mechanics of gases in thermal equilibrium.[2] Boltzmann's statistical work is borne out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium".[3] The distribution was later investigated extensively, in its modern generic form, by Josiah Willard Gibbs in 1902.[4]

The Boltzmann distribution should not be confused with the Maxwell–Boltzmann distribution or Maxwell–Boltzmann statistics. The Boltzmann distribution gives the probability that a system will be in a certain state as a function of that state's energy,[5] while the Maxwell–Boltzmann distributions give the probabilities of particle speeds or energies in ideal gases. The distribution of energies in a one-dimensional gas, however, does follow the Boltzmann distribution.

The distribution


The Boltzmann distribution is a probability distribution that gives the probability of a certain state as a function of that state's energy and the temperature of the system to which the distribution is applied.[6] It is given as
$$p_i = \frac{1}{Q}\exp\left(-\frac{\varepsilon_i}{k_\text{B}T}\right) = \frac{\exp\left(-\frac{\varepsilon_i}{k_\text{B}T}\right)}{\sum_{j=1}^{M}\exp\left(-\frac{\varepsilon_j}{k_\text{B}T}\right)},$$
where

$\exp(\cdot)$ is the exponential function,
$p_i$ is the probability of state $i$,
$\varepsilon_i$ is the energy of state $i$,
$k_\text{B}$ is the Boltzmann constant,
$T$ is the absolute temperature of the system,
$M$ is the number of all states accessible to the system of interest,[6][5]
$Q$ (denoted by some authors by $Z$) is the normalization denominator, which is the canonical partition function
$$Q = \sum_{j=1}^{M}\exp\left(-\frac{\varepsilon_j}{k_\text{B}T}\right).$$
It results from the constraint that the probabilities of all accessible states must add up to 1.
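For a concrete illustration of these formulas, the short sketch below evaluates the partition function and the state probabilities for a small set of energy levels (the level energies and the temperature are arbitrary assumed values):

```python
import numpy as np

k_B = 1.380649e-23                # Boltzmann constant, J/K
T = 300.0                         # assumed temperature, K
energies = np.array([0.0, 1.0e-21, 2.0e-21, 5.0e-21])   # assumed state energies, J

# Boltzmann weights and the canonical partition function Q.
weights = np.exp(-energies / (k_B * T))
Q = weights.sum()

# Probability of each state i: p_i = exp(-eps_i / (k_B*T)) / Q.
p = weights / Q

print("Q =", Q)
print("probabilities:", p, "sum =", p.sum())   # normalization: probabilities sum to 1
```

Lower-energy states receive larger weights, and dividing by $Q$ enforces the normalization constraint stated above.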

Using Lagrange multipliers, one can prove that the Boltzmann distribution is the distribution that maximizes the entropy
$$S(p_1, p_2, \dots, p_M) = -\sum_{i=1}^{M} p_i \log_2 p_i,$$
subject to the normalization constraint $\sum_i p_i = 1$ and the constraint that $\sum_i p_i \varepsilon_i$ equals a particular mean energy value, except for two special cases. (These special cases occur when the mean value is either the minimum or the maximum of the energies $\varepsilon_i$. In these cases, the entropy-maximizing distribution is a limit of Boltzmann distributions where $T$ approaches zero from above or below, respectively.)
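This maximum-entropy characterization can also be checked numerically. The following sketch is only an illustration under assumed inputs (four dimensionless energy levels and a target mean energy between the lowest and highest levels); it maximizes the entropy under the two constraints with SciPy's constrained optimizer and compares the result with a Boltzmann distribution tuned to the same mean energy:

```python
import numpy as np
from scipy.optimize import minimize, brentq

energies = np.array([0.0, 1.0, 2.0, 5.0])   # assumed energy levels (dimensionless units)
target_mean = 1.2                            # assumed mean energy, strictly between min and max

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)               # avoid log(0)
    return np.sum(p * np.log2(p))            # minus S = -sum_i p_i log2 p_i

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},               # normalization
    {"type": "eq", "fun": lambda p: p @ energies - target_mean},  # fixed mean energy
]
p0 = np.full(energies.size, 1.0 / energies.size)
res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * energies.size,
               constraints=constraints)

# Boltzmann distribution with beta = 1/(k_B*T) chosen to reproduce the same mean energy.
def mean_energy(beta):
    w = np.exp(-beta * energies)
    return (w @ energies) / w.sum()

beta = brentq(lambda b: mean_energy(b) - target_mean, 0.0, 50.0)
boltzmann = np.exp(-beta * energies)
boltzmann /= boltzmann.sum()

print("entropy-maximizing distribution:", res.x)
print("Boltzmann at the same mean energy:", boltzmann)   # the two agree closely
```

The Lagrange multiplier conjugate to the mean-energy constraint plays the role of $1/(k_\text{B}T)$, which is why the two distributions coincide.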

The partition function can be calculated if we know the energies of the states accessible to the system of interest. For atoms, the partition function values can be found in the NIST Atomic Spectra Database.[7]

The distribution shows that states with lower energy will always have a higher probability of being occupied than the states with higher energy. It also gives the quantitative relationship between the probabilities of the two states being occupied. The ratio of probabilities for states $i$ and $j$ is given as
$$\frac{p_i}{p_j} = \exp\left(\frac{\varepsilon_j - \varepsilon_i}{k_\text{B}T}\right),$$
where

$p_i$ is the probability of state $i$,
$p_j$ is the probability of state $j$,
$\varepsilon_i$ is the energy of state $i$,
$\varepsilon_j$ is the energy of state $j$.

The corresponding ratio of populations of energy levels must also take their degeneracies into account.

The Boltzmann distribution is often used to describe the distribution of particles, such as atoms or molecules, over bound states accessible to them. For a system consisting of many particles, the probability of a particle being in state $i$ is practically the probability that picking a random particle from that system will find it in state $i$. This probability is equal to the number of particles in state $i$ divided by the total number of particles in the system, that is, the fraction of particles that occupy state $i$:
$$p_i = \frac{N_i}{N},$$
where $N_i$ is the number of particles in state $i$ and $N$ is the total number of particles in the system. The Boltzmann distribution gives these probabilities for a system in thermal equilibrium. So the equation that gives the fraction of particles in state $i$ as a function of the energy of that state is[5]
$$\frac{N_i}{N} = \frac{\exp\left(-\frac{\varepsilon_i}{k_\text{B}T}\right)}{\sum_{j=1}^{M}\exp\left(-\frac{\varepsilon_j}{k_\text{B}T}\right)}.$$

This equation is of great importance to spectroscopy. Spectroscopy observes spectral lines of atoms or molecules undergoing transitions from one state to another.[5][8] In order for this to be possible, there must be some particles in the first state to undergo the transition. Their fraction can be estimated from the Boltzmann distribution. If it is negligible, the transition is very likely not observed at the temperature for which the calculation was done. In general, a larger fraction of molecules in the first state means a higher number of transitions to the second state.[9] This gives a stronger spectral line. However, there are other factors that influence the intensity of a spectral line, such as whether it is caused by an allowed or a forbidden transition.
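As a rough numerical example (the energy gap and temperature below are assumed, illustrative values: a two-level system with a 0.25 eV spacing, on the order of a molecular vibrational quantum, at 300 K), the fraction of particles in the upper state can be estimated directly from the Boltzmann distribution:

```python
import math

k_B_eV = 8.617333262e-5      # Boltzmann constant in eV/K
T = 300.0                    # assumed temperature, K
delta_eps = 0.25             # assumed energy gap between the two states, eV

# Two-level system: relative Boltzmann weights are 1 and exp(-delta_eps / (k_B*T)).
upper_weight = math.exp(-delta_eps / (k_B_eV * T))
fraction_upper = upper_weight / (1.0 + upper_weight)

print(f"fraction in the upper state at {T:.0f} K: {fraction_upper:.2e}")
# About 6e-5: nearly all particles occupy the lower state, so transitions
# starting from the lower state dominate the observed spectrum.
```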

The softmax function commonly used in machine learning is related to the Boltzmann distribution:
$$(p_1, \ldots, p_M) = \operatorname{softmax}\left[-\frac{\varepsilon_1}{k_\text{B}T}, \ldots, -\frac{\varepsilon_M}{k_\text{B}T}\right].$$
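The relation can be verified directly; the snippet below (an illustration with assumed energies, using SciPy's softmax implementation) shows that the softmax of the scaled negative energies reproduces the Boltzmann probabilities:

```python
import numpy as np
from scipy.special import softmax

kT = 4.14e-21                                    # k_B*T at roughly 300 K, in joules (assumed)
energies = np.array([0.0, 1.0e-21, 2.0e-21])     # assumed state energies, J

p_boltzmann = np.exp(-energies / kT)
p_boltzmann /= p_boltzmann.sum()                 # normalized Boltzmann probabilities

p_softmax = softmax(-energies / kT)              # softmax of the scaled negative energies

print(np.allclose(p_boltzmann, p_softmax))       # True: the two expressions agree
```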

Generalized Boltzmann distribution


A distribution of the form
$$\Pr(\omega) \propto \exp\left[\sum_{\eta=1}^{n} \frac{X_\eta x_\eta^{(\omega)}}{k_\text{B}T} - \frac{E^{(\omega)}}{k_\text{B}T}\right]$$
is called the generalized Boltzmann distribution by some authors.[10]

The Boltzmann distribution is a special case of the generalized Boltzmann distribution. The generalized Boltzmann distribution is used in statistical mechanics to describe the canonical ensemble, grand canonical ensemble and isothermal–isobaric ensemble. The generalized Boltzmann distribution is usually derived from the principle of maximum entropy, but there are other derivations.[10][11]
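For orientation, two familiar special cases can be read off from this form (an illustrative note; the identifications are standard but not spelled out here): with no additional terms the expression reduces to the canonical distribution, while pairing the chemical potential $\mu$ with the particle number $N^{(\omega)}$ gives the grand canonical distribution,
$$\Pr(\omega) \propto \exp\left(-\frac{E^{(\omega)}}{k_\text{B}T}\right)
\qquad\text{and}\qquad
\Pr(\omega) \propto \exp\left(\frac{\mu N^{(\omega)} - E^{(\omega)}}{k_\text{B}T}\right).$$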

Among its characteristic properties, the generalized Boltzmann distribution is the only distribution for which the Gibbs–Shannon entropy equals the thermodynamic entropy.[10]

In statistical mechanics

Main articles: Canonical ensemble and Maxwell–Boltzmann statistics

The Boltzmann distribution appears in statistical mechanics when considering closed systems of fixed composition that are in thermal equilibrium (equilibrium with respect to energy exchange). The most general case is the probability distribution for the canonical ensemble. Some special cases (derivable from the canonical ensemble) show the Boltzmann distribution in different aspects:

Canonical ensemble (general case)
The canonical ensemble gives the probabilities of the various possible states of a closed system of fixed volume, in thermal equilibrium with a heat bath. The canonical ensemble has a state probability distribution with the Boltzmann form.
Statistical frequencies of subsystems' states (in a non-interacting collection)
When the system of interest is a collection of many non-interacting copies of a smaller subsystem, it is sometimes useful to find the statistical frequency of a given subsystem state among the collection. The canonical ensemble has the property of separability when applied to such a collection: as long as the non-interacting subsystems have fixed composition, then each subsystem's state is independent of the others and is also characterized by a canonical ensemble. As a result, the expected statistical frequency distribution of subsystem states has the Boltzmann form.
Maxwell–Boltzmann statistics of classical gases (systems of non-interacting particles)
In particle systems, many particles share the same space and regularly change places with each other; the single-particle state space they occupy is a shared space. Maxwell–Boltzmann statistics give the expected number of particles found in a given single-particle state in a classical gas of non-interacting particles at equilibrium. This expected number distribution has the Boltzmann form.

Although these cases have strong similarities, it is helpful to distinguish them as they generalize in different ways when the crucial assumptions are changed:

  • When a system is in thermodynamic equilibrium with respect to both energy exchange and particle exchange, the requirement of fixed composition is relaxed and a grand canonical ensemble is obtained rather than a canonical ensemble. On the other hand, if both composition and energy are fixed, then a microcanonical ensemble applies instead.
  • If the subsystems within a collection do interact with each other, then the expected frequencies of subsystem states no longer follow a Boltzmann distribution, and may not even have an analytical solution.[12] The canonical ensemble can, however, still be applied to the collective states of the entire system considered as a whole, provided the entire system is in thermal equilibrium.
  • With quantum gases of non-interacting particles in equilibrium, the number of particles found in a given single-particle state does not follow Maxwell–Boltzmann statistics, and there is no simple closed-form expression for quantum gases in the canonical ensemble. In the grand canonical ensemble, the state-filling statistics of quantum gases are described by Fermi–Dirac statistics or Bose–Einstein statistics, depending on whether the particles are fermions or bosons, respectively; a brief numerical comparison of these statistics follows this list.
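The sketch below compares the three occupation statistics for a single-particle state. It is only an illustration: the chemical potential and energy grid are assumed values in units where $k_\text{B}T = 1$, and the grand-canonical occupation formulas used are the standard ones for ideal gases.

```python
import numpy as np

kT = 1.0                          # work in units where k_B*T = 1 (assumed)
mu = -0.5                         # assumed chemical potential (below all levels, as Bose-Einstein requires)
eps = np.linspace(0.0, 4.0, 5)    # assumed single-particle energies

x = (eps - mu) / kT
n_mb = np.exp(-x)                 # Maxwell-Boltzmann: <n> = exp(-(eps - mu)/kT)
n_fd = 1.0 / (np.exp(x) + 1.0)    # Fermi-Dirac occupation
n_be = 1.0 / (np.exp(x) - 1.0)    # Bose-Einstein occupation

for e, a, b, c in zip(eps, n_mb, n_fd, n_be):
    print(f"eps={e:.1f}  MB={a:.4f}  FD={b:.4f}  BE={c:.4f}")
# For (eps - mu) >> kT all three agree (the classical limit); they differ
# strongly where the occupation approaches 1.
```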

In mathematics

Main articles: Gibbs measure, Log-linear model, and Boltzmann machine

In economics


The Boltzmann distribution can be introduced to allocate permits in emissions trading.[13][14] A new allocation method, known as the Boltzmann Fair Division, uses the Boltzmann distribution to describe the most probable, natural, and unbiased distribution of emission permits among multiple countries.[15] This framework has been further extended to address general problems of distributive justice, including cake-cutting and resource allocation, by allowing flexibility in how factors such as contribution, need, or preference are weighted. The Boltzmann fair division is recognized for providing a simple yet powerful probabilistic model that can be adapted to various social, political, and economic contexts.[15]

The Boltzmann distribution has the same form as the multinomial logit model. As a discrete choice model, this is very well known in economics since Daniel McFadden made the connection to random utility maximization.[16]
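To make the correspondence explicit (an illustrative identification, with $u_i$ denoting the deterministic utility of alternative $i$), the multinomial logit choice probabilities take the Boltzmann form once the utilities are identified with scaled negative energies,
$$P_i = \frac{\exp(u_i)}{\sum_{j=1}^{M}\exp(u_j)},
\qquad u_i \;\leftrightarrow\; -\frac{\varepsilon_i}{k_\text{B}T}.$$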

References

  1. Landau, Lev Davidovich & Lifshitz, Evgeny Mikhailovich (1980) [1976]. Statistical Physics. Course of Theoretical Physics. Vol. 5 (3rd ed.). Oxford: Pergamon Press. ISBN 0-7506-3372-7. Translated by J. B. Sykes and M. J. Kearsley. See section 28.
  2. Boltzmann, Ludwig (1868). "Studien über das Gleichgewicht der lebendigen Kraft zwischen bewegten materiellen Punkten" [Studies on the balance of living force between moving material points]. Wiener Berichte (in German). 58: 517–560.
  3. "Translation of Ludwig Boltzmann's Paper 'On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium'" (PDF). Archived from the original (PDF) on 2020-10-21. Retrieved 2017-05-11.
  4. Gibbs, Josiah Willard (1902). Elementary Principles in Statistical Mechanics. New York: Charles Scribner's Sons.
  5. Atkins, P. W. (2010). Quanta. New York: W. H. Freeman and Company.
  6. McQuarrie, A. (2000). Statistical Mechanics. Sausalito, CA: University Science Books. ISBN 1-891389-15-7.
  7. NIST Atomic Spectra Database Levels Form at nist.gov.
  8. Atkins, P. W.; de Paula, J. (2009). Physical Chemistry (9th ed.). Oxford: Oxford University Press. ISBN 978-0-19-954337-3.
  9. Skoog, D. A.; Holler, F. J.; Crouch, S. R. (2006). Principles of Instrumental Analysis. Boston, MA: Brooks/Cole. ISBN 978-0-495-12570-9.
  10. Gao, Xiang; Gallicchio, Emilio; Roitberg, Adrian (2019). "The generalized Boltzmann distribution is the only distribution in which the Gibbs–Shannon entropy equals the thermodynamic entropy". The Journal of Chemical Physics. 151 (3): 034113. arXiv:1903.02121. Bibcode:2019JChPh.151c4113G. doi:10.1063/1.5111333. PMID 31325924. S2CID 118981017.
  11. Gao, Xiang (March 2022). "The Mathematics of the Ensemble Theory". Results in Physics. 34: 105230. arXiv:2006.00485. Bibcode:2022ResPh..3405230G. doi:10.1016/j.rinp.2022.105230. S2CID 221978379.
  12. A classic example of this is magnetic ordering. Systems of non-interacting spins show paramagnetic behaviour that can be understood with a single-particle canonical ensemble (resulting in the Brillouin function). Systems of interacting spins can show much more complex behaviour such as ferromagnetism or antiferromagnetism.
  13. Park, J.-W.; Kim, C. U.; Isard, W. (2012). "Permit allocation in emissions trading using the Boltzmann distribution". Physica A. 391: 4883–4890.
  14. "The Thorny Problem Of Fair Allocation". Technology Review blog. August 17, 2011. Cites and summarizes Park, Kim, and Isard (2012).
  15. Park, J.-W.; Kim, J. U.; Ghim, C.-M.; Kim, C. U. (2022). "The Boltzmann fair division for distributive justice". Scientific Reports. 12: 16179. doi:10.1038/s41598-022-16179-1.
  16. Amemiya, Takeshi (1985). "Multinomial Logit Model". Advanced Econometrics. Oxford: Basil Blackwell. pp. 295–299. ISBN 0-631-13345-3.