Donsker's theorem

From Wikipedia, the free encyclopedia
Statement in probability theory
Figure: Donsker's invariance principle for simple random walk on $\mathbb{Z}$.

In probability theory, Donsker's theorem (also known as Donsker's invariance principle, or the functional central limit theorem), named after Monroe D. Donsker, is a functional extension of the central limit theorem for empirical distribution functions. Specifically, the theorem states that an appropriately centered and scaled version of the empirical distribution function converges to a Gaussian process.

Let $X_1, X_2, X_3, \ldots$ be a sequence of independent and identically distributed (i.i.d.) random variables with mean 0 and variance 1. Let $S_n := \sum_{i=1}^{n} X_i$. The stochastic process $S := (S_n)_{n \in \mathbb{N}}$ is known as a random walk. Define the diffusively rescaled random walk (partial-sum process) by

$$W^{(n)}(t) := \frac{S_{\lfloor nt \rfloor}}{\sqrt{n}}, \qquad t \in [0,1].$$

The central limit theorem asserts that $W^{(n)}(1)$ converges in distribution to a standard Gaussian random variable $W(1)$ as $n \to \infty$. Donsker's invariance principle[1][2] extends this convergence to the whole function $W^{(n)} := (W^{(n)}(t))_{t \in [0,1]}$. More precisely, in its modern form, Donsker's invariance principle states the following: as random variables taking values in the Skorokhod space $\mathcal{D}[0,1]$, the random function $W^{(n)}$ converges in distribution to a standard Brownian motion $W := (W(t))_{t \in [0,1]}$ as $n \to \infty$.
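
A minimal simulation sketch of this convergence, assuming NumPy and using ±1 steps as an illustrative mean-zero, unit-variance increment distribution: it samples paths of $W^{(n)}$ and compares their empirical second moments with those of standard Brownian motion, for which $\operatorname{cov}[W(s), W(t)] = \min(s, t)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def rescaled_walk(n, t_grid, rng):
    """Evaluate W^(n)(t) = S_floor(nt) / sqrt(n) on a grid of times in [0, 1]."""
    steps = rng.choice([-1.0, 1.0], size=n)              # mean 0, variance 1
    partial_sums = np.concatenate(([0.0], np.cumsum(steps)))
    return partial_sums[np.floor(n * t_grid).astype(int)] / np.sqrt(n)

t_grid = np.linspace(0.0, 1.0, 201)                      # t = 0.3 and t = 0.7 sit at indices 60 and 140
paths = np.array([rescaled_walk(10_000, t_grid, rng) for _ in range(2_000)])

# For Brownian motion, Var[W(1)] = 1 and cov[W(0.3), W(0.7)] = min(0.3, 0.7) = 0.3.
print("Var of W^(n)(1)          :", paths[:, -1].var())
print("Cov of W^(n)(0.3), (0.7) :", np.cov(paths[:, 60], paths[:, 140])[0, 1])
```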

Figure: Donsker–Skorokhod–Kolmogorov theorem for uniform distributions.
Figure: Donsker–Skorokhod–Kolmogorov theorem for normal distributions.

Formal statement


Let $F_n$ be the empirical distribution function of the sequence of i.i.d. random variables $X_1, X_2, X_3, \ldots$ with distribution function $F$. Define the centered and scaled version of $F_n$ by

$$G_n(x) = \sqrt{n}\,(F_n(x) - F(x))$$

indexed by $x \in \mathbb{R}$. By the classical central limit theorem, for fixed $x$, the random variable $G_n(x)$ converges in distribution to a Gaussian (normal) random variable $G(x)$ with zero mean and variance $F(x)(1 - F(x))$ as the sample size $n$ grows.

Theorem (Donsker, Skorokhod, Kolmogorov). The sequence of $G_n(x)$, as random elements of the Skorokhod space $\mathcal{D}(-\infty, \infty)$, converges in distribution to a Gaussian process $G$ with zero mean and covariance given by

$$\operatorname{cov}[G(s), G(t)] = E[G(s)\,G(t)] = \min\{F(s), F(t)\} - F(s)\,F(t).$$

The process $G(x)$ can be written as $B(F(x))$, where $B$ is a standard Brownian bridge on the unit interval.
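
A minimal numerical sketch of this statement, assuming NumPy and picking the exponential distribution $F(x) = 1 - e^{-x}$ purely as an illustrative continuous $F$ (the grid points are likewise arbitrary): it simulates replicates of $G_n$ at two points and compares their empirical covariance with $\min\{F(s), F(t)\} - F(s)F(t)$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choice of F: the Exp(1) distribution, F(x) = 1 - exp(-x).
F = lambda x: 1.0 - np.exp(-x)
x_grid = np.array([0.5, 1.0, 2.0])

def G_n(n):
    """One realisation of G_n(x) = sqrt(n) * (F_n(x) - F(x)) on the grid."""
    sample = rng.exponential(size=n)
    F_n = (sample[:, None] <= x_grid).mean(axis=0)   # empirical CDF at the grid points
    return np.sqrt(n) * (F_n - F(x_grid))

G = np.array([G_n(5_000) for _ in range(4_000)])

s, t = x_grid[0], x_grid[2]
print("empirical cov :", np.cov(G[:, 0], G[:, 2])[0, 1])
print("theoretical   :", min(F(s), F(t)) - F(s) * F(t))   # covariance of B(F(s)) and B(F(t))
```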

Proof sketch


For continuous probability distributions, the general statement reduces to the case where the distribution is uniform on $[0,1]$ by the inverse transform.

Given any finite sequence of times $0 < t_1 < t_2 < \dots < t_n < 1$, we have that $N F_N(t_1)$ is binomially distributed with mean $N t_1$ and variance $N t_1 (1 - t_1)$.

Similarly, the joint distribution of $F_N(t_1), F_N(t_2), \dots, F_N(t_n)$ is multinomial. The central limit approximation for multinomial distributions then shows that, as $N \to \infty$, $\sqrt{N}(F_N(t_i) - t_i)$ converges in distribution to a Gaussian vector with covariance matrix entries $\min(t_i, t_j) - t_i t_j$, which is precisely the covariance of the Brownian bridge at those times.
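
A short sketch of this finite-dimensional step, assuming NumPy, with Uniform[0,1] samples and the illustrative times $t_1 = 0.3$, $t_2 = 0.7$: it checks the binomial moments of $N F_N(t_1)$ and the limiting covariance $\min(t_i, t_j) - t_i t_j$.

```python
import numpy as np

rng = np.random.default_rng(2)

N, reps = 1_000, 10_000
t1, t2 = 0.3, 0.7
U = rng.uniform(size=(reps, N))          # reps independent Uniform[0,1] samples of size N
F_t1 = (U <= t1).mean(axis=1)            # F_N(t1) for each replicate
F_t2 = (U <= t2).mean(axis=1)

# N * F_N(t1) is Binomial(N, t1): mean N*t1, variance N*t1*(1 - t1).
print("mean:", (N * F_t1).mean(), "vs", N * t1)
print("var :", (N * F_t1).var(),  "vs", N * t1 * (1 - t1))

# The rescaled pair has covariance approaching min(t1, t2) - t1*t2.
Z1, Z2 = np.sqrt(N) * (F_t1 - t1), np.sqrt(N) * (F_t2 - t2)
print("cov :", np.cov(Z1, Z2)[0, 1], "vs", min(t1, t2) - t1 * t2)
```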

History and related results


Kolmogorov (1933) showed that when $F$ is continuous, the supremum $\sup_t G_n(t)$ and the supremum of absolute value $\sup_t |G_n(t)|$ converge in distribution to the laws of the same functionals of the Brownian bridge $B(t)$; see the Kolmogorov–Smirnov test. In 1949 Doob asked whether the convergence in distribution held for more general functionals, thus formulating a problem of weak convergence of random functions in a suitable function space.[3]

In 1952 Donsker stated and proved (not quite correctly)[4] a general extension of the Doob–Kolmogorov heuristic approach. In the original paper, Donsker proved that the convergence in law of $G_n$ to the Brownian bridge holds for Uniform[0,1] distributions with respect to uniform convergence in $t$ over the interval [0,1].[2]

However, Donsker's formulation was not quite correct because of the problem of measurability of the functionals of discontinuous processes. In 1956 Skorokhod and Kolmogorov defined a separable metric $d$, called the Skorokhod metric, on the space of càdlàg functions on [0,1], such that convergence for $d$ to a continuous function is equivalent to convergence for the sup norm, and showed that $G_n$ converges in law in $\mathcal{D}[0,1]$ to the Brownian bridge.

Later Dudley reformulated Donsker's result to avoid the problem of measurability and the need for the Skorokhod metric. One can prove[4] that there exist $X_i$, i.i.d. uniform in [0,1], and a sequence of sample-continuous Brownian bridges $B_n$, such that

$$\|G_n - B_n\|_{\infty}$$

is measurable and converges in probability to 0. An improved version of this result, providing more detail on the rate of convergence, is the Komlós–Major–Tusnády approximation.


References

  1. ^ Donsker, M. D. (1951). "An invariance principle for certain probability limit theorems". Memoirs of the American Mathematical Society (6). MR 0040613.
  2. ^ a b Donsker, M. D. (1952). "Justification and extension of Doob's heuristic approach to the Kolmogorov–Smirnov theorems". Annals of Mathematical Statistics. 23 (2): 277–281. doi:10.1214/aoms/1177729445. MR 0047288. Zbl 0046.35103.
  3. ^ Doob, Joseph L. (1949). "Heuristic approach to the Kolmogorov–Smirnov theorems". Annals of Mathematical Statistics. 20 (3): 393–403. doi:10.1214/aoms/1177729991. MR 0030732. Zbl 0035.08901.
  4. ^ a b Dudley, R. M. (1999). Uniform Central Limit Theorems. Cambridge University Press. ISBN 978-0-521-46102-3.