Continuous mapping theorem

From Wikipedia, the free encyclopedia
Not to be confused with the contraction mapping theorem.

In probability theory, the continuous mapping theorem states that continuous functions preserve limits even if their arguments are sequences of random variables. A continuous function, in Heine's definition, is a function that maps convergent sequences into convergent sequences: if xn → x then g(xn) → g(x). The continuous mapping theorem states that this remains true if we replace the deterministic sequence {xn} with a sequence of random variables {Xn}, and replace the standard notion of convergence of real numbers "→" with one of the types of convergence of random variables.

This theorem was first proved by Henry Mann and Abraham Wald in 1943,[1] and it is therefore sometimes called the Mann–Wald theorem.[2] Meanwhile, Denis Sargan refers to it as the general transformation theorem.[3]

Statement


Let {Xn}, X be random elements defined on a metric space S. Suppose a function g: S → S′ (where S′ is another metric space) has the set of discontinuity points Dg such that Pr[X ∈ Dg] = 0. Then[4][5]

{\displaystyle {\begin{aligned}X_{n}\ {\xrightarrow {\text{d}}}\ X\quad &\Rightarrow \quad g(X_{n})\ {\xrightarrow {\text{d}}}\ g(X);\\[6pt]X_{n}\ {\xrightarrow {\text{p}}}\ X\quad &\Rightarrow \quad g(X_{n})\ {\xrightarrow {\text{p}}}\ g(X);\\[6pt]X_{n}\ {\xrightarrow {\text{a.s.}}}\ X\quad &\Rightarrow \quad g(X_{n})\ {\xrightarrow {\text{a.s.}}}\ g(X).\end{aligned}}}

where the superscripts "d", "p", and "a.s." denote convergence in distribution, convergence in probability, and almost sure convergence respectively.
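The convergence-in-probability case of the theorem can be checked empirically. The following sketch uses an assumed illustrative setup (not from the article): Xn = X + Zn/n with Gaussian noise Zn, so that Xn → X in probability, and the everywhere-continuous map g(x) = x², for which Pr[X ∈ Dg] = 0 holds trivially.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative setup: X ~ N(0, 1) and X_n = X + Z_n / n with
# standard normal noise Z_n, so X_n -> X in probability.
# g(x) = x**2 is continuous everywhere, hence D_g is empty.
def prob_of_large_gap(n, eps=0.1, trials=100_000):
    X = rng.normal(size=trials)            # the limit variable X
    Xn = X + rng.normal(size=trials) / n   # the approximating variable X_n
    # Empirical estimate of Pr(|g(X_n) - g(X)| > eps)
    return np.mean(np.abs(Xn**2 - X**2) > eps)

p_small_n = prob_of_large_gap(10)
p_large_n = prob_of_large_gap(1000)
```

As n grows, the estimated exceedance probability shrinks toward zero, consistent with g(Xn) →p g(X).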

Proof

This proof has been adapted from (van der Vaart 1998, Theorem 2.3).

Spaces S and S′ are equipped with certain metrics. For simplicity we will denote both of these metrics using the |x − y| notation, even though the metrics may be arbitrary and not necessarily Euclidean.

Convergence in distribution


We will need a particular statement from the portmanteau theorem: convergence in distribution Xn →d X is equivalent to

E f(Xn) → E f(X) for every bounded continuous functional f.

So it suffices to prove that E f(g(Xn)) → E f(g(X)) for every bounded continuous functional f. For simplicity, assume g is continuous everywhere. Then F = f ∘ g is itself a bounded continuous functional, so the claim follows from the statement above. The general case, where g may be discontinuous on a set of probability zero, is slightly more technical.
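This reduction can be sketched numerically under an assumed example (not from the text): by the classical CLT, the standardized mean of n Uniform(0, 1) draws converges in distribution to X ~ N(0, 1). Taking g(x) = x² and the bounded continuous f(t) = arctan t, the expectation E[f(g(Xn))] should approach E[f(g(X))].

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: X_n is the standardized mean of n Uniform(0,1) draws,
# which converges in distribution to X ~ N(0,1) by the CLT.
# F = f o g with g(x) = x**2 and the bounded continuous f(t) = arctan(t).
def mean_F_of_Xn(n, trials=50_000):
    U = rng.uniform(size=(trials, n))
    Xn = np.sqrt(12 * n) * (U.mean(axis=1) - 0.5)  # Var(Uniform) = 1/12
    return np.arctan(Xn**2).mean()                 # estimate of E[f(g(X_n))]

# Monte Carlo estimate of the limiting value E[f(g(X))], X ~ N(0,1)
limit = np.arctan(rng.normal(size=50_000) ** 2).mean()
approx = mean_F_of_Xn(100)
```

For moderate n the two estimates already agree closely, as the portmanteau characterization predicts.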

Convergence in probability


Fix an arbitrary ε > 0. Then for any δ > 0 consider the set Bδ defined as

{\displaystyle B_{\delta }={\big \{}x\in S\mid x\notin D_{g}:\ \exists y\in S:\ |x-y|<\delta ,\,|g(x)-g(y)|>\varepsilon {\big \}}.}

This is the set of continuity points x of the function g(·) for which it is possible to find, within the δ-neighborhood of x, a point which maps outside the ε-neighborhood of g(x). By definition of continuity, this set shrinks as δ goes to zero, so that limδ→0 Bδ = ∅.

Now suppose that |g(X) − g(Xn)| > ε. This implies that at least one of the following is true: either |X − Xn| ≥ δ, or X ∈ Dg, or X ∈ Bδ. In terms of probabilities this can be written as

{\displaystyle \Pr {\big (}{\big |}g(X_{n})-g(X){\big |}>\varepsilon {\big )}\leq \Pr {\big (}|X_{n}-X|\geq \delta {\big )}+\Pr(X\in B_{\delta })+\Pr(X\in D_{g}).}

On the right-hand side, the first term converges to zero as n → ∞ for any fixed δ, by the definition of convergence in probability of the sequence {Xn}. The second term converges to zero as δ → 0, since the set Bδ shrinks to the empty set. And the last term is identically equal to zero by the assumption of the theorem. Therefore, the conclusion is that

{\displaystyle \lim _{n\to \infty }\Pr {\big (}{\big |}g(X_{n})-g(X){\big |}>\varepsilon {\big )}=0,}

which means that g(Xn) converges to g(X) in probability.
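The proof only uses Pr[X ∈ Dg] = 0, not continuity of g everywhere. The following sketch (an assumed example, not from the article) illustrates this with g = sign, which is discontinuous exactly at 0, and a continuous limit X ~ N(0, 1), so that Pr[X = 0] = 0.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed example: g = sign has D_g = {0}, but X ~ N(0,1) is a continuous
# random variable, so Pr[X in D_g] = Pr[X = 0] = 0 and the theorem applies.
def exceedance(n, eps=0.5, trials=100_000):
    X = rng.normal(size=trials)
    Xn = X + rng.normal(size=trials) / n   # X_n -> X in probability
    # sign(X_n) differs from sign(X) only when X lies within roughly 1/n of 0
    return np.mean(np.abs(np.sign(Xn) - np.sign(X)) > eps)

p_small_n = exceedance(10)
p_large_n = exceedance(1000)
```

The mismatch probability is driven by the mass near the single discontinuity point, which vanishes as n → ∞.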

Almost sure convergence


By definition of the continuity of the function g(·),

{\displaystyle \lim _{n\to \infty }X_{n}(\omega )=X(\omega )\quad \Rightarrow \quad \lim _{n\to \infty }g(X_{n}(\omega ))=g(X(\omega ))}

at each point X(ω) where g(·) is continuous. Therefore,

{\displaystyle {\begin{aligned}\Pr \left(\lim _{n\to \infty }g(X_{n})=g(X)\right)&\geq \Pr \left(\lim _{n\to \infty }g(X_{n})=g(X),\ X\notin D_{g}\right)\\&\geq \Pr \left(\lim _{n\to \infty }X_{n}=X,\ X\notin D_{g}\right)=1,\end{aligned}}}

because the intersection of two almost sure events is almost sure.

By definition, we conclude thatg(Xn) converges tog(X) almost surely.
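The pathwise nature of this argument can be sketched numerically (an assumed construction, not from the article): fix a realization X(ω) for many sample points ω, and let Xn(ω) = X(ω) + 1/n, which converges for every ω. Then g(Xn(ω)) → g(X(ω)) along each path whenever g (here exp) is continuous.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed construction: one realization X(omega) per sample point omega,
# perturbed deterministically by 1/n, so X_n -> X surely (hence a.s.).
X = rng.normal(size=1000)  # fixed realizations X(omega)

# Worst-case pathwise gap max_omega |g(X_n(omega)) - g(X(omega))| for g = exp
gaps = [np.max(np.abs(np.exp(X + 1.0 / n) - np.exp(X))) for n in (10, 100, 10_000)]
```

The maximal gap over all sampled paths decreases to zero, mirroring the almost-sure conclusion.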


References

  1. Mann, H. B.; Wald, A. (1943). "On Stochastic Limit and Order Relationships". Annals of Mathematical Statistics. 14 (3): 217–226. doi:10.1214/aoms/1177731415. JSTOR 2235800.
  2. Amemiya, Takeshi (1985). Advanced Econometrics. Cambridge, MA: Harvard University Press. p. 88. ISBN 0-674-00560-0.
  3. Sargan, Denis (1988). Lectures on Advanced Econometric Theory. Oxford: Basil Blackwell. pp. 4–8. ISBN 0-631-14956-2.
  4. Billingsley, Patrick (1969). Convergence of Probability Measures. John Wiley & Sons. p. 31 (Corollary 1). ISBN 0-471-07242-7.
  5. van der Vaart, A. W. (1998). Asymptotic Statistics. New York: Cambridge University Press. p. 7 (Theorem 2.3). ISBN 0-521-49603-9.
Retrieved from "https://en.wikipedia.org/w/index.php?title=Continuous_mapping_theorem&oldid=1285525708"