Cumulative distribution function

From Wikipedia, the free encyclopedia
Probability that random variable X is less than or equal to x
Cumulative distribution function for the exponential distribution
Cumulative distribution function for the normal distribution

In probability theory and statistics, the cumulative distribution function (CDF) of a real-valued random variable $X$, or just the distribution function of $X$, evaluated at $x$, is the probability that $X$ will take a value less than or equal to $x$.[1]

Every probability distribution supported on the real numbers, whether discrete, continuous, or "mixed", is uniquely identified by a right-continuous, monotone non-decreasing function (a càdlàg function) $F\colon \mathbb{R} \to [0,1]$ satisfying $\lim_{x\to-\infty} F(x) = 0$ and $\lim_{x\to\infty} F(x) = 1$.

In the case of a scalar continuous distribution, it gives the area under the probability density function from negative infinity to $x$. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.

Definition


The cumulative distribution function of a real-valued random variable $X$ is the function given by[2]: 77

$$F_X(x) = \operatorname{P}(X \leq x)$$   (Eq. 1)

where the right-hand side represents the probability that the random variable $X$ takes on a value less than or equal to $x$.

The probability that $X$ lies in the semi-closed interval $(a, b]$, where $a < b$, is therefore[2]: 84

$$\operatorname{P}(a < X \leq b) = F_X(b) - F_X(a)$$   (Eq. 2)

In the definition above, the "less than or equal to" sign "≤" is a convention, not a universal one (e.g., Hungarian literature uses "<"), but the distinction matters for discrete distributions: the proper use of tables of the binomial and Poisson distributions depends on it. Moreover, important formulas such as Paul Lévy's inversion formula for the characteristic function also rely on the "less than or equal to" formulation.
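
The effect of the convention is easy to see numerically for a discrete distribution. A minimal Python sketch (the Binomial(4, 0.5) example is an illustrative choice, not from the article) compares the two conventions:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 4, 0.5
F_le = sum(binom_pmf(i, n, p) for i in range(3))  # P(X <= 2), the "<=" convention
F_lt = sum(binom_pmf(i, n, p) for i in range(2))  # P(X < 2), the "<" convention
print(F_le, F_lt)  # 0.6875 0.3125
```

With the "≤" convention, $F(2)$ already includes the atom at 2, so the two values differ by exactly $P(X = 2) = 6/16 = 0.375$.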

When treating several random variables $X, Y, \ldots$, the corresponding letters are used as subscripts; when treating only one, the subscript is usually omitted. It is conventional to use a capital $F$ for a cumulative distribution function, in contrast to the lower-case $f$ used for probability density functions and probability mass functions. This applies when discussing general distributions: some specific distributions have their own conventional notation, for example the normal distribution uses $\Phi$ and $\phi$ instead of $F$ and $f$, respectively.

The probability density function of a continuous random variable can be determined from the cumulative distribution function by differentiating[3] using the fundamental theorem of calculus; i.e., given $F(x)$, $$f(x) = \frac{dF(x)}{dx}$$ as long as the derivative exists.

The CDF of a continuous random variable $X$ can be expressed as the integral of its probability density function $f_X$ as follows:[2]: 86 $$F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt.$$
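
Both relationships can be checked numerically. A small sketch (assuming an Exponential(2) distribution purely for illustration) verifies that the density matches a finite-difference derivative of the CDF, and that the CDF matches a Riemann sum of the density:

```python
from math import exp

lam = 2.0

def F(x):
    """CDF of Exponential(lam)."""
    return 1 - exp(-lam * x) if x >= 0 else 0.0

def f(x):
    """Density of Exponential(lam), the derivative of F."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

x, h = 0.7, 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)  # central difference for dF/dx

N = 100_000
# midpoint Riemann sum of f over [0, x] should reproduce F(x)
integral = sum(f((i + 0.5) * x / N) * (x / N) for i in range(N))

print(abs(deriv - f(x)), abs(integral - F(x)))  # both tiny
```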

In the case of a random variable $X$ whose distribution has a discrete component at a value $b$, $$\operatorname{P}(X = b) = F_X(b) - \lim_{x \to b^-} F_X(x).$$

If $F_X$ is continuous at $b$, this equals zero and there is no discrete component at $b$.

Properties

From top to bottom, the cumulative distribution function of a discrete probability distribution, continuous probability distribution, and a distribution which has both a continuous part and a discrete part.
Example of a cumulative distribution function with a countably infinite set of discontinuities.

Every cumulative distribution function $F_X$ is non-decreasing[2]: 78 and right-continuous,[2]: 79 which makes it a càdlàg function. Furthermore, $$\lim_{x\to-\infty} F_X(x) = 0, \quad \lim_{x\to+\infty} F_X(x) = 1.$$

Every function with these three properties is a CDF, i.e., for every such function, a random variable can be defined such that the function is the cumulative distribution function of that random variable.

If $X$ is a purely discrete random variable, then it attains values $x_1, x_2, \ldots$ with probability $p_i = p(x_i)$, and the CDF of $X$ will be discontinuous at the points $x_i$: $$F_X(x) = \operatorname{P}(X \leq x) = \sum_{x_i \leq x} \operatorname{P}(X = x_i) = \sum_{x_i \leq x} p(x_i).$$
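
Such a CDF is a right-continuous step function obtained by summing the pmf. A short sketch (the fair-die pmf is an illustrative assumption):

```python
def make_discrete_cdf(pmf):
    """Build the step-function CDF F(x) = sum of p(x_i) over x_i <= x."""
    def F(x):
        return sum(p for xi, p in pmf.items() if xi <= x)
    return F

F = make_discrete_cdf({k: 1 / 6 for k in range(1, 7)})  # fair six-sided die
# F is flat between atoms: F(3) and F(3.5) agree
print(F(0), round(F(3), 3), round(F(3.5), 3), round(F(6), 3))  # 0 0.5 0.5 1.0
```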

If the CDF $F_X$ of a real-valued random variable $X$ is continuous, then $X$ is a continuous random variable; if furthermore $F_X$ is absolutely continuous, then there exists a Lebesgue-integrable function $f_X(x)$ such that $$F_X(b) - F_X(a) = \operatorname{P}(a < X \leq b) = \int_a^b f_X(x)\,dx$$ for all real numbers $a$ and $b$. The function $f_X$ is equal to the derivative of $F_X$ almost everywhere, and it is called the probability density function of the distribution of $X$.

If $X$ has finite L1-norm, that is, the expectation of $|X|$ is finite, then the expectation is given by the Riemann–Stieltjes integral $$\mathbb{E}[X] = \int_{-\infty}^{\infty} t\,dF_X(t)$$

CDF plot with two red rectangles, illustrating two inequalities

and for any $x \geq 0$, $$x(1 - F_X(x)) \leq \int_x^{\infty} t\,dF_X(t)$$ as well as $$x F_X(-x) \leq \int_{-\infty}^{-x} (-t)\,dF_X(t)$$ as shown in the diagram (consider the areas of the two red rectangles and their extensions to the right or left up to the graph of $F_X$). In particular, we have $$\lim_{x\to-\infty} x F_X(x) = 0, \quad \lim_{x\to+\infty} x(1 - F_X(x)) = 0.$$ In addition, the (finite) expected value of the real-valued random variable $X$ can be defined on the graph of its cumulative distribution function, as illustrated by the drawing in the definition of expected value for arbitrary real-valued random variables.

Examples


As an example, suppose $X$ is uniformly distributed on the unit interval $[0, 1]$.

Then the CDF of $X$ is given by $$F_X(x) = \begin{cases} 0 &: x < 0 \\ x &: 0 \leq x \leq 1 \\ 1 &: x > 1 \end{cases}$$

Suppose instead that $X$ takes only the discrete values 0 and 1, with equal probability.

Then the CDF of $X$ is given by $$F_X(x) = \begin{cases} 0 &: x < 0 \\ 1/2 &: 0 \leq x < 1 \\ 1 &: x \geq 1 \end{cases}$$

Suppose $X$ is exponentially distributed. Then the CDF of $X$ is given by $$F_X(x; \lambda) = \begin{cases} 1 - e^{-\lambda x} & x \geq 0, \\ 0 & x < 0. \end{cases}$$

Here $\lambda > 0$ is the parameter of the distribution, often called the rate parameter.

Suppose $X$ is normally distributed. Then the CDF of $X$ is given by $$F(t; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{t} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)\,dx.$$

Here the parameter $\mu$ is the mean or expectation of the distribution, and $\sigma$ is its standard deviation.

A table of the CDF of the standard normal distribution is often used in statistical applications, where it is named the standard normal table, the unit normal table, or the Z table.
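
The normal CDF has no closed form, but it can be written in terms of the error function, $\Phi(z) = \tfrac{1}{2}(1 + \operatorname{erf}(z/\sqrt{2}))$, which is how Z-table entries are typically computed. A brief sketch using only Python's standard library:

```python
from math import erf, sqrt

def normal_cdf(t, mu=0.0, sigma=1.0):
    """F(t; mu, sigma) via the error function: Phi(z) = (1 + erf(z/sqrt(2)))/2."""
    z = (t - mu) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

print(round(normal_cdf(0.0), 4))   # 0.5 at the mean
print(round(normal_cdf(1.96), 4))  # 0.975, the familiar Z-table value
```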

Suppose $X$ is binomially distributed. Then the CDF of $X$ is given by $$F(k; n, p) = \Pr(X \leq k) = \sum_{i=0}^{\lfloor k \rfloor} \binom{n}{i} p^i (1-p)^{n-i}$$

Here $p$ is the probability of success, the function denotes the discrete probability distribution of the number of successes in a sequence of $n$ independent experiments, and $\lfloor k \rfloor$ is the "floor" under $k$, i.e. the greatest integer less than or equal to $k$.
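
A direct transcription of this formula (a sketch using only the standard library) shows the role of the floor: any non-integer $k$ gives the same value as $\lfloor k \rfloor$:

```python
from math import comb, floor

def binom_cdf(k, n, p):
    """F(k; n, p) = sum_{i=0}^{floor(k)} C(n, i) p^i (1 - p)^(n - i)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(floor(k) + 1))

# F is a step function: non-integer k falls back to floor(k) = 2
print(binom_cdf(2, 4, 0.5), binom_cdf(2.9, 4, 0.5))  # 0.6875 0.6875
```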

Derived functions


Complementary cumulative distribution function (tail distribution)


Sometimes, it is useful to study the opposite question and ask how often the random variable is above a particular level. This is called the complementary cumulative distribution function (ccdf) or simply the tail distribution or exceedance, and is defined as $$\bar{F}_X(x) = \operatorname{P}(X > x) = 1 - F_X(x).$$

This has applications in statistical hypothesis testing, for example, because the one-sided p-value is the probability of observing a test statistic at least as extreme as the one observed. Thus, provided that the test statistic $T$ has a continuous distribution, the one-sided p-value is simply given by the ccdf: for an observed value $t$ of the test statistic, $$p = \operatorname{P}(T \geq t) = \operatorname{P}(T > t) = 1 - F_T(t).$$
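
As a sketch, a one-sided p-value for a standard normal test statistic can be computed from the ccdf via the error function (the observed value 2.1 is illustrative):

```python
from math import erf, sqrt

def normal_sf(t):
    """Survival function (ccdf) of the standard normal: P(T > t) = 1 - F(t)."""
    return 0.5 * (1.0 - erf(t / sqrt(2.0)))

t_obs = 2.1                 # hypothetical observed z-statistic
p_value = normal_sf(t_obs)  # one-sided p-value
print(round(p_value, 4))    # about 0.018
```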

In survival analysis, $\bar{F}_X(x)$ is called the survival function and denoted $S(x)$, while the term reliability function is common in engineering.


Folded cumulative distribution

Example of the folded cumulative distribution for a normal distribution function with an expected value of 0 and a standard deviation of 1.

While the plot of a cumulative distribution $F$ often has an S-like shape, an alternative illustration is the folded cumulative distribution or mountain plot, which folds the top half of the graph over,[5][6] that is

$$F_{\text{fold}}(x) = F(x)\,1_{\{F(x) \leq 0.5\}} + (1 - F(x))\,1_{\{F(x) > 0.5\}}$$

where $1_{\{A\}}$ denotes the indicator function and the second summand is the survivor function; the plot thus uses two scales, one for the upslope and another for the downslope. This form of illustration emphasises the median, dispersion (specifically, the mean absolute deviation from the median[7]), and skewness of the distribution or of the empirical results.

Inverse distribution function (quantile function)

Main article: Quantile function

If the CDF $F$ is strictly increasing and continuous, then $F^{-1}(p)$, $p \in [0,1]$, is the unique real number $x$ such that $F(x) = p$. This defines the inverse distribution function or quantile function.

Some distributions do not have a unique inverse (for example if $f_X(x) = 0$ for all $a < x < b$, causing $F_X$ to be constant). In this case, one may use the generalized inverse distribution function, which is defined as

$$F^{-1}(p) = \inf\{x \in \mathbb{R} : F(x) \geq p\}, \quad \forall p \in [0,1].$$

Some useful properties of the inverse cdf (which are also preserved in the definition of the generalized inverse distribution function) are:

  1. $F^{-1}$ is nondecreasing[8]
  2. $F^{-1}(F(x)) \leq x$
  3. $F(F^{-1}(p)) \geq p$
  4. $F^{-1}(p) \leq x$ if and only if $p \leq F(x)$
  5. If $Y$ has a $U[0,1]$ distribution, then $F^{-1}(Y)$ is distributed as $F$. This is used in random number generation using the inverse transform sampling method.
  6. If $\{X_\alpha\}$ is a collection of independent $F$-distributed random variables defined on the same sample space, then there exist random variables $Y_\alpha$ such that $Y_\alpha$ is distributed as $U[0,1]$ and $F^{-1}(Y_\alpha) = X_\alpha$ with probability 1 for all $\alpha$.

The inverse of the cdf can be used to translate results obtained for the uniform distribution to other distributions.
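
Property 5 above is the basis of inverse transform sampling. For the exponential distribution the quantile function has the closed form $F^{-1}(p) = -\ln(1-p)/\lambda$, so uniform draws can be transformed directly (a sketch; the rate $\lambda = 2$ is an arbitrary choice):

```python
import random
from math import log

def exp_inverse_cdf(p, lam):
    """Quantile function of Exponential(lam): F^{-1}(p) = -ln(1 - p) / lam."""
    return -log(1.0 - p) / lam

random.seed(0)
lam = 2.0
# Push U[0,1] draws through F^{-1} to obtain Exponential(lam) draws
samples = [exp_inverse_cdf(random.random(), lam) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))  # close to the true mean 1 / lam = 0.5
```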

Empirical distribution function


The empirical distribution function is an estimate of the cumulative distribution function that generated the points in the sample. It converges with probability 1 to that underlying distribution. A number of results exist that quantify the rate of convergence of the empirical distribution function to the underlying cumulative distribution function.[9]
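
A minimal sketch of the empirical distribution function, $F_n(x) = \frac{1}{n}\#\{i : x_i \leq x\}$, using only the standard library (the sample is illustrative):

```python
from bisect import bisect_right

def ecdf(sample):
    """Return the empirical distribution function F_n of a sample."""
    s = sorted(sample)
    n = len(s)
    def F_n(x):
        # bisect_right counts the sample points <= x
        return bisect_right(s, x) / n
    return F_n

F_n = ecdf([2.0, 1.0, 3.0, 2.0])
print(F_n(0.5), F_n(2.0), F_n(3.0))  # 0.0 0.75 1.0
```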

Multivariate case


Definition for two random variables


When dealing simultaneously with more than one random variable, the joint cumulative distribution function can also be defined. For example, for a pair of random variables $X, Y$, the joint CDF $F_{XY}$ is given by[2]: 89

$$F_{X,Y}(x, y) = \operatorname{P}(X \leq x, Y \leq y)$$   (Eq. 3)

where the right-hand side represents the probability that the random variable $X$ takes on a value less than or equal to $x$ and that $Y$ takes on a value less than or equal to $y$.

Example of a joint cumulative distribution function:

For two continuous variables $X$ and $Y$: $$\Pr(a < X < b \text{ and } c < Y < d) = \int_a^b \int_c^d f(x, y)\,dy\,dx;$$

For two discrete random variables, it is beneficial to generate a table of probabilities and address the cumulative probability for each potential range of $X$ and $Y$. Here is an example:[10] given the joint probability mass function in tabular form, determine the joint cumulative distribution function.

         Y = 2   Y = 4   Y = 6   Y = 8
X = 1    0       0.1     0       0.1
X = 3    0       0       0.2     0
X = 5    0.3     0       0       0.15
X = 7    0       0       0.15    0

Solution: using the given table of probabilities for each potential range of $X$ and $Y$, the joint cumulative distribution function may be constructed in tabular form:

         Y < 2   Y ≤ 2   Y ≤ 4   Y ≤ 6   Y ≤ 8
X < 1    0       0       0       0       0
X ≤ 1    0       0       0.1     0.1     0.2
X ≤ 3    0       0       0.1     0.3     0.4
X ≤ 5    0       0.3     0.4     0.6     0.85
X ≤ 7    0       0.3     0.4     0.75    1
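
The tabulated values can be reproduced mechanically by summing the pmf over the appropriate quadrant. A sketch in Python using the table above:

```python
# Joint pmf taken from the table above: {(x, y): P(X = x, Y = y)}
pmf = {(1, 4): 0.1, (1, 8): 0.1, (3, 6): 0.2,
       (5, 2): 0.3, (5, 8): 0.15, (7, 6): 0.15}

def joint_cdf(x, y):
    """F_{X,Y}(x, y) = sum of the pmf over points with X <= x and Y <= y."""
    return sum(p for (xi, yi), p in pmf.items() if xi <= x and yi <= y)

print(round(joint_cdf(5, 6), 2))  # 0.6, the table entry for X <= 5, Y <= 6
print(round(joint_cdf(7, 8), 2))  # 1.0, total probability
```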

Definition for more than two random variables


For $N$ random variables $X_1, \ldots, X_N$, the joint CDF $F_{X_1,\ldots,X_N}$ is given by

$$F_{X_1,\ldots,X_N}(x_1, \ldots, x_N) = \operatorname{P}(X_1 \leq x_1, \ldots, X_N \leq x_N)$$   (Eq. 4)

Interpreting the $N$ random variables as a random vector $\mathbf{X} = (X_1, \ldots, X_N)^T$ yields a shorter notation: $$F_{\mathbf{X}}(\mathbf{x}) = \operatorname{P}(X_1 \leq x_1, \ldots, X_N \leq x_N)$$

Properties


Every multivariate CDF is:

  1. Monotonically non-decreasing in each of its variables,
  2. Right-continuous in each of its variables,
  3. $0 \leq F_{X_1 \ldots X_n}(x_1, \ldots, x_n) \leq 1$,
  4. $\lim_{x_1,\ldots,x_n \to +\infty} F_{X_1 \ldots X_n}(x_1, \ldots, x_n) = 1$ and $\lim_{x_i \to -\infty} F_{X_1 \ldots X_n}(x_1, \ldots, x_n) = 0$ for all $i$.

Unlike in the one-dimensional case, not every function satisfying the above four properties is a multivariate CDF. For example, let $F(x, y) = 0$ for $x < 0$ or $x + y < 1$ or $y < 0$, and let $F(x, y) = 1$ otherwise. It is easy to see that the above conditions are met, and yet $F$ is not a CDF: if it were, then $\operatorname{P}\left(\tfrac{1}{3} < X \leq 1, \tfrac{1}{3} < Y \leq 1\right) = -1$, as explained below.

The probability that a point belongs to a hyperrectangle is computed analogously to the one-dimensional case:[11] $$F_{X_1,X_2}(a, c) + F_{X_1,X_2}(b, d) - F_{X_1,X_2}(a, d) - F_{X_1,X_2}(b, c) = \operatorname{P}(a < X_1 \leq b, c < X_2 \leq d) \geq 0.$$
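
This inclusion-exclusion identity is easy to verify numerically. A sketch assuming two independent Exponential(1) variables (an illustrative choice: independence makes the joint CDF factor, so the rectangle probability can be cross-checked against the product of marginal probabilities):

```python
from math import exp

def F(x, y):
    """Joint CDF of two independent Exponential(1) variables (illustrative)."""
    if x <= 0 or y <= 0:
        return 0.0
    return (1 - exp(-x)) * (1 - exp(-y))

def rect_prob(a, b, c, d):
    """P(a < X1 <= b, c < X2 <= d) by inclusion-exclusion on the joint CDF."""
    return F(b, d) - F(a, d) - F(b, c) + F(a, c)

p = rect_prob(0.0, 1.0, 0.0, 2.0)
# Independence: cross-check against the product of the marginal probabilities
check = (1 - exp(-1.0)) * (1 - exp(-2.0))
print(abs(p - check))  # essentially zero
```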

Complex case


Complex random variable


The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \leq 1 + 2i)$ make no sense. However, expressions of the form $P(\Re(Z) \leq 1, \Im(Z) \leq 3)$ do make sense. Therefore, we define the cumulative distribution of a complex random variable via the joint distribution of its real and imaginary parts: $$F_Z(z) = F_{\Re(Z),\Im(Z)}(\Re(z), \Im(z)) = P(\Re(Z) \leq \Re(z), \Im(Z) \leq \Im(z)).$$

Complex random vector


Generalization of Eq. 4 yields $$\begin{aligned} F_{\mathbf{Z}}(\mathbf{z}) &= F_{\Re(Z_1),\Im(Z_1),\ldots,\Re(Z_n),\Im(Z_n)}(\Re(z_1), \Im(z_1), \ldots, \Re(z_n), \Im(z_n)) \\ &= \operatorname{P}(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n)) \end{aligned}$$ as the definition of the CDF of a complex random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$.

Use in statistical analysis


The concept of the cumulative distribution function makes an explicit appearance in statistical analysis in two (similar) ways. Cumulative frequency analysis is the analysis of the frequency of occurrence of values of a phenomenon less than a reference value. The empirical distribution function is a formal direct estimate of the cumulative distribution function, for which simple statistical properties can be derived and which can form the basis of various statistical hypothesis tests. Such tests can assess whether there is evidence against a sample of data having arisen from a given distribution, or evidence against two samples of data having arisen from the same (unknown) population distribution.

Kolmogorov–Smirnov and Kuiper's tests


The Kolmogorov–Smirnov test is based on cumulative distribution functions and can be used to test whether two empirical distributions differ, or whether an empirical distribution differs from an ideal distribution. The closely related Kuiper's test is useful when the domain of the distribution is cyclic, as in day of the week. For instance, Kuiper's test might be used to see whether the number of tornadoes varies during the year, or whether sales of a product vary by day of the week or day of the month.
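
The Kolmogorov–Smirnov statistic itself is simply the largest vertical gap between the empirical CDF and the reference CDF, checked on both sides of each jump. A minimal sketch against the $U[0, 1]$ CDF (the sample is illustrative; a real test would also convert the statistic into a p-value):

```python
def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov statistic: sup_x |F_n(x) - F(x)|."""
    s = sorted(sample)
    n = len(s)
    d = 0.0
    for i, x in enumerate(s):
        # the ECDF jumps from i/n to (i+1)/n at x; check the gap on both sides
        d = max(d, abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
    return d

# Distance between a small sample and the U[0, 1] CDF F(x) = x
sample = [0.1, 0.2, 0.5, 0.6, 0.9]
d_stat = ks_statistic(sample, lambda x: min(max(x, 0.0), 1.0))
print(round(d_stat, 3))  # 0.2
```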

References

  1. Deisenroth, Marc Peter; Faisal, A. Aldo; Ong, Cheng Soon (2020). Mathematics for Machine Learning. Cambridge University Press. p. 181. ISBN 9781108455145.
  2. Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
  3. Montgomery, Douglas C.; Runger, George C. (2003). Applied Statistics and Probability for Engineers. John Wiley & Sons, Inc. p. 104. ISBN 0-471-20454-4.
  4. Zwillinger, Daniel; Kokoska, Stephen (2010). CRC Standard Probability and Statistics Tables and Formulae. CRC Press. p. 49. ISBN 978-1-58488-059-2.
  5. Gentle, J.E. (2009). Computational Statistics. Springer. ISBN 978-0-387-98145-1.
  6. Monti, K. L. (1995). "Folded Empirical Distribution Function Curves (Mountain Plots)". The American Statistician. 49 (4): 342–345. doi:10.2307/2684570. JSTOR 2684570.
  7. Xue, J. H.; Titterington, D. M. (2011). "The p-folded cumulative distribution function and the mean absolute deviation from the p-quantile". Statistics & Probability Letters. 81 (8): 1179–1182. doi:10.1016/j.spl.2011.03.014.
  8. Chan, Stanley H. (2021). Introduction to Probability for Data Science. Michigan Publishing. p. 18. ISBN 978-1-60785-746-4.
  9. Hesse, C. (1990). "Rates of convergence for the empirical distribution function and the empirical characteristic function of a broad class of linear processes". Journal of Multivariate Analysis. 35 (2): 186–202. doi:10.1016/0047-259X(90)90024-C.
  10. "Joint Cumulative Distribution Function (CDF)". math.info. Retrieved 2019-12-11.
  11. Archived notes (PDF). www.math.wustl.edu. Archived from the original on 22 February 2016. Retrieved 13 January 2022.
