Information geometry

From Wikipedia, the free encyclopedia
Technique in statistics
The set of all normal distributions forms a statistical manifold with hyperbolic geometry.

Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics.[1] It studies statistical manifolds, which are Riemannian manifolds whose points correspond to probability distributions.
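For the univariate normal family pictured above, the Fisher information metric in (μ, σ) coordinates works out to diag(1/σ², 2/σ²), which is, up to a constant factor, the metric of the hyperbolic plane. The sketch below (not from the article; the function name and the Monte Carlo approach are illustrative choices) estimates this metric as the expected outer product of the score, E[∇log p · ∇log pᵀ]:

```python
import numpy as np

def fisher_normal(mu, sigma, n=200_000, seed=0):
    """Monte Carlo estimate of the Fisher information matrix of N(mu, sigma),
    computed as the average outer product of the score (gradient of log p)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=n)
    s_mu = (x - mu) / sigma**2                    # d/dmu log p(x; mu, sigma)
    s_sigma = (x - mu)**2 / sigma**3 - 1 / sigma  # d/dsigma log p(x; mu, sigma)
    scores = np.stack([s_mu, s_sigma])            # shape (2, n)
    return scores @ scores.T / n                  # 2x2 metric estimate

# Analytic value at (mu, sigma) = (0, 2): diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5)
G = fisher_normal(0.0, 2.0)
```

The 1/σ² scaling of both diagonal entries is exactly the conformal factor of the hyperbolic upper half-plane, which is the sense in which the normal family carries hyperbolic geometry.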

Introduction


Historically, information geometry can be traced back to the work of C. R. Rao, who was the first to treat the Fisher matrix as a Riemannian metric.[2][3] The modern theory is largely due to Shun'ichi Amari, whose work has been greatly influential on the development of the field.[4]

Classically, information geometry considered a parametrized statistical model as a Riemannian manifold carrying additional structure: a pair of conjugate connections, a statistical structure, and in some cases dual flatness. Unlike ordinary smooth manifolds, equipped only with a metric tensor and the Levi-Civita connection, these take into account conjugate connections, torsion, and the Amari–Chentsov metric.[5] All of the geometric structures above find application in information theory and machine learning. For such models, there is a natural choice of Riemannian metric, known as the Fisher information metric. In the special case that the statistical model is an exponential family, the statistical manifold can be endowed with a Hessian metric (i.e., a Riemannian metric given by the potential of a convex function). In this case, the manifold naturally inherits two flat affine connections, as well as a canonical Bregman divergence. Historically, much of the work was devoted to studying the associated geometry of these examples. In the modern setting, information geometry applies to a much wider context, including non-exponential families, nonparametric statistics, and even abstract statistical manifolds not induced from a known statistical model. The results combine techniques from information theory, affine differential geometry, convex analysis, and many other fields. Among the most promising directions, information geometry finds applications in machine learning, for example in the development of information-geometric optimization methods such as mirror descent[6] and natural gradient descent.[7]
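Natural gradient descent preconditions the ordinary gradient with the inverse Fisher metric, updating θ ← θ − η G(θ)⁻¹∇L(θ), so that steps are measured in the manifold's own geometry rather than in raw parameter coordinates. A minimal sketch for the univariate normal family, using its closed-form Fisher metric G = diag(1/σ², 2/σ²) (the function name, learning rate, and maximum-likelihood objective are illustrative assumptions, not from the article):

```python
import numpy as np

def natural_gradient_fit(x, steps=100, lr=0.5):
    """Fit N(mu, sigma) to data by natural gradient descent on the average
    negative log-likelihood, preconditioning with the inverse Fisher metric
    G^{-1} = diag(sigma^2, sigma^2 / 2)."""
    mu, sigma = 0.0, 1.0
    for _ in range(steps):
        # Euclidean gradient of the mean negative log-likelihood.
        g_mu = (mu - x.mean()) / sigma**2
        g_sigma = 1 / sigma - ((x - mu) ** 2).mean() / sigma**3
        # Natural gradient step: multiply each component by G^{-1}.
        mu -= lr * sigma**2 * g_mu
        sigma -= lr * (sigma**2 / 2) * g_sigma
    return mu, sigma
```

The natural-gradient update of μ, for instance, reduces to μ ← μ − η(μ − x̄): the 1/σ² in the metric cancels the 1/σ² in the gradient, so the step size is invariant to how spread out the distribution is, which is the basic appeal of the method.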

The standard references in the field are Shun'ichi Amari and Hiroshi Nagaoka's book, Methods of Information Geometry,[8] and the more recent book by Nihat Ay and others.[9] A gentle introduction is given in the survey by Frank Nielsen.[10] In 2018, the journal Information Geometry was launched, which is devoted to the field.

Contributors


The history of information geometry is associated with the discoveries of many researchers, including C. R. Rao and Shun'ichi Amari, among others.

Applications


As an interdisciplinary field, information geometry has been used in various applications.

Here is an incomplete list:

  • Statistical inference[11]
  • Time series and linear systems
  • Filtering problem[12]
  • Quantum systems[13]
  • Neural networks[14]
  • Machine learning
  • Statistical mechanics
  • Biology
  • Statistics[15][16]
  • Mathematical finance[17]


References

  1. ^ Nielsen, Frank (2022). "The Many Faces of Information Geometry" (PDF). Notices of the AMS. 69 (1). American Mathematical Society: 36–45.
  2. ^ Rao, C. R. (1945). "Information and Accuracy Attainable in the Estimation of Statistical Parameters". Bulletin of the Calcutta Mathematical Society. 37: 81–91. Reprinted in Breakthroughs in Statistics. Springer. 1992. pp. 235–247. doi:10.1007/978-1-4612-0919-5_16. S2CID 117034671.
  3. ^ Nielsen, F. (2013). "Cramér-Rao Lower Bound and Information Geometry". In Bhatia, R.; Rajan, C. S. (eds.). Connected at Infinity II: On the Work of Indian Mathematicians. Texts and Readings in Mathematics (TRIM). Hindustan Book Agency. pp. 18–37. arXiv:1301.3578. doi:10.1007/978-93-86279-56-9_2. ISBN 978-93-80250-51-9. S2CID 16759683.
  4. ^ Amari, Shun'ichi (1983). "A foundation of information geometry". Electronics and Communications in Japan. 66 (6): 1–10. doi:10.1002/ecja.4400660602.
  5. ^ Bauer, Martin; Le Brigant, Alice; Lu, Yuxiu; Maor, Cy (2024). "The L^p-Fisher–Rao metric and Amari–Čencov α-connections". Calculus of Variations and Partial Differential Equations. 63 (2): 56. arXiv:2306.14533. doi:10.1007/s00526-024-02660-5. ISSN 1432-0835.
  6. ^ Raskutti, Garvesh; Mukherjee, Sayan (March 2015). "The Information Geometry of Mirror Descent". IEEE Transactions on Information Theory. 61 (3): 1451–1457. arXiv:1310.7780. Bibcode:2015ITIT...61.1451R. doi:10.1109/TIT.2015.2388583. ISSN 0018-9448.
  7. ^ Abdulkadirov, Ruslan; Lyakhov, Pavel; Nagornov, Nikolay (January 2022). "Accelerating Extreme Search of Multidimensional Functions Based on Natural Gradient Descent with Dirichlet Distributions". Mathematics. 10 (19): 3556. doi:10.3390/math10193556. ISSN 2227-7390.
  8. ^ Amari, Shun'ichi; Nagaoka, Hiroshi (2000). Methods of Information Geometry. Translations of Mathematical Monographs. Vol. 191. American Mathematical Society. ISBN 0-8218-0531-2.
  9. ^ Ay, Nihat; Jost, Jürgen; Lê, Hông Vân; Schwachhöfer, Lorenz (2017). Information Geometry. Ergebnisse der Mathematik und ihrer Grenzgebiete. Vol. 64. Springer. ISBN 978-3-319-56477-7.
  10. ^ Nielsen, Frank (2018). "An Elementary Introduction to Information Geometry". Entropy. 22 (10).
  11. ^ Kass, R. E.; Vos, P. W. (1997). Geometrical Foundations of Asymptotic Inference. Series in Probability and Statistics. Wiley. ISBN 0-471-82668-5.
  12. ^ Brigo, Damiano; Hanzon, Bernard; LeGland, Francois (1998). "A differential geometric approach to nonlinear filtering: the projection filter" (PDF). IEEE Transactions on Automatic Control. 43 (2): 247–252. Bibcode:1998ITAC...43..247B. doi:10.1109/9.661075.
  13. ^ van Handel, Ramon; Mabuchi, Hideo (2005). "Quantum projection filter for a highly nonlinear model in cavity QED". Journal of Optics B: Quantum and Semiclassical Optics. 7 (10): S226–S236. arXiv:quant-ph/0503222. Bibcode:2005JOptB...7S.226V. doi:10.1088/1464-4266/7/10/005. S2CID 15292186.
  14. ^ Zlochin, Mark; Baram, Yoram (2001). "Manifold Stochastic Dynamics for Bayesian Learning". Neural Computation. 13 (11): 2549–2572. doi:10.1162/089976601753196021. PMID 11674851.
  15. ^ Amari, Shun'ichi (1985). Differential-Geometrical Methods in Statistics. Lecture Notes in Statistics. Berlin: Springer-Verlag. ISBN 0-387-96056-2.
  16. ^ Murray, M.; Rice, J. (1993). Differential Geometry and Statistics. Monographs on Statistics and Applied Probability. Vol. 48. Chapman and Hall. ISBN 0-412-39860-5.
  17. ^ Marriott, Paul; Salmon, Mark, eds. (2000). Applications of Differential Geometry to Econometrics. Cambridge University Press. ISBN 0-521-65116-6.
