
Markov chain

From the Sundanese Wikipedia, the free encyclopedia

In mathematics, a Markov chain is a stochastic process that has the Markov property: given the present state, the past states carry no additional information about the future.

In the discrete-time case, the process consists of a sequence X1, X2, X3, ... of random variables. The domain of these variables is called the state space, and Xn is the state of the process at time n. If the conditional distribution of Xn+1 given all past states is a function of Xn alone,

P(X_{n+1} \mid X_0, X_1, X_2, \ldots, X_n) = P(X_{n+1} \mid X_n)

then the process is said to have the Markov property.
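
As an illustration of this property, the short Python sketch below simulates a hypothetical two-state weather chain: each new state is sampled from a distribution that depends only on the current state, never on the earlier history. The states and probabilities here are illustrative assumptions, not values from the article.

import random

# transition[current] gives P(next state | current state); values are hypothetical.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # Sample X_{n+1} from P(X_{n+1} | X_n = state); the rest of the history is never consulted.
    states, probs = zip(*transition[state].items())
    return random.choices(states, weights=probs, k=1)[0]

x = "sunny"
path = [x]
for n in range(10):
    x = step(x)
    path.append(x)
print(path)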

Markov chains are named after A.A. Markov, who produced the first results for these processes in 1906. The theory was extended to countably infinite state spaces by Kolmogorov (1936). Markov chains are related to Brownian motion and the ergodic hypothesis, two important topics in physics in the early twentieth century, but Markov's own motivation appears to have been mathematical: extending the law of large numbers to dependent events.

Properties of Markov chains


A Markov chain is characterized by the conditional distribution

P(X_{n+1} \mid X_n)

which is called the transition probability of the process. It is sometimes called the "one-step" transition probability. The transition probabilities over two, three, or more steps follow from the one-step transition probability and the Markov property:

P(X_{n+2} \mid X_n) = \int P(X_{n+2}, X_{n+1} \mid X_n)\, dX_{n+1} = \int P(X_{n+2} \mid X_{n+1})\, P(X_{n+1} \mid X_n)\, dX_{n+1}

Likewise,

P(X_{n+3} \mid X_n) = \int P(X_{n+3} \mid X_{n+2}) \int P(X_{n+2} \mid X_{n+1})\, P(X_{n+1} \mid X_n)\, dX_{n+1}\, dX_{n+2}

These formulas generalize to an arbitrary future time n+k by multiplying the transition probabilities and integrating over the intermediate states.
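
In the discrete-state case the integral over the intermediate state becomes a sum. The Python sketch below shows this for a hypothetical two-state chain: the two-step probability is obtained by summing the product of one-step probabilities over the intermediate state.

# Hypothetical two-state chain; P[i][j] = P(next = i | current = j), columns sum to 1.
states = [0, 1]
P = {
    0: {0: 0.8, 1: 0.4},
    1: {0: 0.2, 1: 0.6},
}

def two_step(i, j):
    # Marginalize over the intermediate state m (the discrete analogue of integrating over dX_{n+1}).
    return sum(P[i][m] * P[m][j] for m in states)

for j in states:
    print([round(two_step(i, j), 3) for i in states])  # column j of the two-step transition kernel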

The marginal distribution P(Xn) is the distribution of the state at time n. The initial distribution is P(X0). The evolution of the process through one time step is described by

P(X_{n+1}) = \int P(X_{n+1} \mid X_n)\, P(X_n)\, dX_n

This is a version of the Frobenius-Perron equation. There may exist one or more state distributions π such that

\pi(X) = \int P(X \mid Y)\, \pi(Y)\, dY

where Y is merely a name for the variable of integration. Such a distribution π is called a stationary distribution or steady-state distribution. A stationary distribution is an eigenfunction of the conditional-distribution function, associated with the eigenvalue 1.
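
For a finite state space the integrals above become matrix-vector products. The sketch below (Python with NumPy, using a hypothetical two-state chain and the column convention P[i, j] = P(next = i | current = j)) iterates the one-step evolution of the marginal distribution and checks the stationarity condition π = Pπ.

import numpy as np

P = np.array([[0.8, 0.4],
              [0.2, 0.6]])          # hypothetical transition matrix; columns sum to 1

p = np.array([1.0, 0.0])            # initial distribution P(X_0)
for n in range(50):
    p = P @ p                       # P(X_{n+1})[i] = sum_j P[i, j] * P(X_n = j)

print(p)                            # approaches the stationary distribution
pi = np.array([2/3, 1/3])           # candidate stationary distribution for this matrix
print(np.allclose(P @ pi, pi))      # True: pi satisfies pi = P pi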

Whether or not there is a stationary distribution, and whether or not it is unique if it does exist, are determined by certain properties of the process. Irreducible means that every state is accessible from every other state. Aperiodic means that the chain does not cycle through the states with a fixed period; for an irreducible chain it suffices that at least one state can return to itself in a single step. Positive recurrent means that the expected return time is finite for every state. Sometimes the terms indecomposable, acyclic, and persistent are used as synonyms for "irreducible", "aperiodic", and "recurrent", respectively.

If the Markov chain is positive recurrent, there exists a stationary distribution. If it is positive recurrent and irreducible, there exists a unique stationary distribution, and furthermore the process constructed by taking the stationary distribution as the initial distribution is ergodic. Then the average of a function f over samples of the Markov chain is equal to the average with respect to the stationary distribution,

\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{k=0}^{n-1} f(X_k) = \int f(X)\, \pi(X)\, dX

In particular, this holds for f equal to the identity function. Then the time average of the samples is equal to the expected value of the stationary distribution.

Furthermore, the equivalence of averages also holds if f is the indicator function of some subset A of the state space.

\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{k=0}^{n-1} \chi_A(X_k) = \int_A \pi(X)\, dX = \mu_\pi(A)

where μπ is the measure induced by π. This makes it possible to approximate the stationary distribution by a histogram or other density estimate of a sequence of samples.
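
The following sketch illustrates this ergodic average for the same hypothetical two-state chain: the fraction of time the simulated chain spends in each state approaches the stationary measure of that state.

import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.8, 0.4],
              [0.2, 0.6]])          # hypothetical chain; P[i, j] = P(next = i | current = j)

n_steps = 100_000
x = 0
counts = np.zeros(2)
for _ in range(n_steps):
    counts[x] += 1                  # time average of the indicator of each state
    x = rng.choice(2, p=P[:, x])    # sample X_{k+1} from column x of P

print(counts / n_steps)             # empirical histogram, close to the stationary distribution [2/3, 1/3]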

Markov chains in discrete state spaces


If the state space is finite, the transition probability distribution can be represented as a matrix, called the transition matrix, with the (i, j)'th element equal to

P(X_{n+1} = i \mid X_n = j)

(In this formulation, element (i, j) is the probability of a transition from j to i. An equivalent formulation is sometimes given with element (i, j) equal to the probability of a transition from i to j. In that case the transition matrix is just the transpose of the one given here.)

For a discrete state space, the integrations in the k-step transition probability become summations, and the k-step transition probability can be computed as the k'th power of the transition matrix. That is, if P is the one-step transition matrix, then P^k is the transition matrix for the k-step transition.
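
A short sketch of this fact, again for the hypothetical two-state chain used above: the two-step transition matrix computed as a matrix power agrees with composing the one-step matrix with itself.

import numpy as np

P = np.array([[0.8, 0.4],
              [0.2, 0.6]])          # hypothetical one-step transition matrix

P2 = np.linalg.matrix_power(P, 2)   # two-step transition matrix P^2
P2_manual = P @ P                   # the same composition written out
print(np.allclose(P2, P2_manual))   # True
print(P2)                           # element (i, j) = P(X_{n+2} = i | X_n = j)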

Writing P for the transition matrix, a stationary distribution is a vector which satisfies the equation

P \pi = \pi

In this case, the stationary distribution is an eigenvector of the transition matrix, associated with the eigenvalue 1. If the transition matrix P is positive recurrent, irreducible, and aperiodic, then P^k converges elementwise to a matrix in which each column is the unique stationary distribution.
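
The sketch below extracts the stationary distribution of the hypothetical two-state chain as the eigenvector of P for the eigenvalue 1 (rescaled to sum to 1), and shows that a high power of P has every column approximately equal to it.

import numpy as np

P = np.array([[0.8, 0.4],
              [0.2, 0.6]])                  # hypothetical transition matrix

eigvals, eigvecs = np.linalg.eig(P)
idx = np.argmin(np.abs(eigvals - 1.0))      # index of the eigenvalue 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                          # normalize the eigenvector to a probability vector
print(pi)                                   # approximately [0.6667, 0.3333]

print(np.linalg.matrix_power(P, 50))        # each column is approximately pi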

A transition matrix which is positive (that is, every element of the matrix is positive) is irreducible, aperiodic, and positive recurrent. A matrix is a stochastic matrix if and only if it is the matrix of transition probabilities of some Markov chain.
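
As a quick check in the column convention used here, a matrix is (column-)stochastic when its entries are nonnegative and every column sums to 1. The sketch below tests this for the hypothetical matrix used above.

import numpy as np

P = np.array([[0.8, 0.4],
              [0.2, 0.6]])          # hypothetical transition matrix

def is_stochastic(M):
    # Nonnegative entries and every column summing to 1 (the convention of this article).
    return bool(np.all(M >= 0) and np.allclose(M.sum(axis=0), 1.0))

print(is_stochastic(P))             # True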

Scientific applications


Markov chains are used to model various processes in queueing theory and statistics, and can also be used as a signal model in entropy coding techniques such as arithmetic coding. Markov chains also have many biological applications, particularly population processes, which are useful in modelling processes that are (at least) analogous to biological populations. Markov chains have been used in bioinformatics as well. An example is the GeneMark algorithm for coding region/gene prediction.

Markov processes can also be used to generate superficially "real-looking" text given a sample document: they are used in various pieces of recreational "parody generator" software (see Jeff Harrison).


References

  • A.A. Markov. "Rasprostranenie zakona bol'shih chisel na velichiny, zavisyaschie drug ot druga". Izvestiya Fiziko-matematicheskogo obschestva pri Kazanskom universitete, 2-ya seriya, tom 15, pp. 135–156, 1906.
  • A.A. Markov. "Extension of the limit theorems of probability theory to a sum of variables connected in a chain". Reprinted in Appendix B of: R. Howard. Dynamic Probabilistic Systems, volume 1: Markov Chains. John Wiley and Sons, 1971.
  • Leo Breiman. Probability. Original edition published by Addison-Wesley, 1968; reprinted by the Society for Industrial and Applied Mathematics, 1992. ISBN 0-89871-296-3. (See Chapter 7.)
  • J.L. Doob. Stochastic Processes. New York: John Wiley and Sons, 1953. ISBN 0-471-52369-0.
