Kolmogorov–Smirnov test

From Wikipedia, the free encyclopedia
Statistical test comparing two probability distributions

Illustration of the Kolmogorov–Smirnov statistic. The red line is a model CDF, the blue line is an empirical CDF, and the length of the black arrow is the KS statistic.

In statistics, the Kolmogorov–Smirnov test (also K–S test or KS test) is a nonparametric test of the equality of continuous (or discontinuous, see Section 2.2), one-dimensional probability distributions. It can be used to test whether a sample came from a given reference probability distribution (one-sample K–S test), or to test whether two samples came from the same distribution (two-sample K–S test). It is named after Andrey Kolmogorov and Nikolai Smirnov, who developed it in the 1930s.[1]

The Kolmogorov–Smirnov statistic quantifies a distance between the empirical distribution function of the sample and the cumulative distribution function of the reference distribution, or between the empirical distribution functions of two samples. The null distribution of this statistic is calculated under the null hypothesis that the sample is drawn from the reference distribution (in the one-sample case) or that the samples are drawn from the same distribution (in the two-sample case). In the one-sample case, the distribution considered under the null hypothesis may be continuous (see Section 2), purely discrete or mixed (see Section 2.2). In the two-sample case (see Section 3), the distribution considered under the null hypothesis is a continuous distribution but is otherwise unrestricted.

The two-sample K–S test is one of the most useful and general nonparametric methods for comparing two samples, as it is sensitive to differences in both location and shape of the empirical cumulative distribution functions of the two samples.

The Kolmogorov–Smirnov test can be modified to serve as a goodness of fit test. In the special case of testing for normality of the distribution, samples are standardized and compared with a standard normal distribution. This is equivalent to setting the mean and variance of the reference distribution equal to the sample estimates, and it is known that using these to define the specific reference distribution changes the null distribution of the test statistic (see Test with estimated parameters). Various studies have found that, even in this corrected form, the test is less powerful for testing normality than the Shapiro–Wilk test or Anderson–Darling test.[2] However, these other tests have their own disadvantages. For instance, the Shapiro–Wilk test is known not to work well in samples with many identical values.

One-sample Kolmogorov–Smirnov statistic


The empirical distribution function F_n for n independent and identically distributed (i.i.d.) ordered observations X_i is defined as

$$F_n(x) = \frac{\text{number of elements in the sample} \leq x}{n} = \frac{1}{n}\sum_{i=1}^{n} 1_{(-\infty,x]}(X_i),$$

where $1_{(-\infty,x]}(X_i)$ is the indicator function, equal to 1 if $X_i \leq x$ and equal to 0 otherwise.
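As a concrete illustration, the empirical distribution function is simple to evaluate directly. Below is a minimal Python sketch (the function name ecdf and the sample data are illustrative, not from any particular library):

    import numpy as np

    def ecdf(sample, x):
        """Empirical distribution function F_n(x): the fraction of
        observations in the sample that are less than or equal to x."""
        return np.mean(np.asarray(sample) <= x)

    # F_n jumps by 1/n at each observation (here n = 5).
    data = [0.2, 0.5, 0.5, 1.3, 2.0]
    print(ecdf(data, 0.5))  # 3 of 5 observations are <= 0.5, so 0.6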

The Kolmogorov–Smirnov statistic for a given cumulative distribution function F(x) is

$$D_n = \sup_x |F_n(x) - F(x)|$$

where $\sup_x$ is the supremum of the set of distances. Intuitively, the statistic takes the largest absolute difference between the two distribution functions across all x values.
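Because F_n is a step function that jumps only at the order statistics, for a continuous F the supremum can be computed exactly by checking the discrepancy just before and at each ordered observation. A minimal sketch (the helper ks_statistic is illustrative; production code would typically call a library routine such as scipy.stats.kstest):

    import numpy as np
    from scipy.stats import norm

    def ks_statistic(sample, cdf):
        """One-sample KS statistic D_n = sup_x |F_n(x) - F(x)| for a
        continuous reference cdf. At the i-th order statistic, F_n takes
        the values i/n (at the point) and (i-1)/n (just before it)."""
        x = np.sort(np.asarray(sample))
        n = len(x)
        f = cdf(x)
        d_plus = np.max(np.arange(1, n + 1) / n - f)   # F_n above F
        d_minus = np.max(f - np.arange(0, n) / n)      # F above F_n
        return max(d_plus, d_minus)

    rng = np.random.default_rng(0)
    print(ks_statistic(rng.normal(size=200), norm.cdf))  # small for N(0,1) data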

By the Glivenko–Cantelli theorem, if the sample comes from the distribution F(x), then D_n converges to 0 almost surely in the limit as n goes to infinity. Kolmogorov strengthened this result by effectively providing the rate of this convergence (see Kolmogorov distribution). Donsker's theorem provides a yet stronger result.

In practice, the statistic requires a relatively large number of data points (in comparison to other goodness of fit criteria such as the Anderson–Darling test statistic) to properly reject the null hypothesis.

Kolmogorov distribution

Illustration of the Kolmogorov distribution's PDF

The Kolmogorov distribution is the distribution of the random variable

$$K = \sup_{t \in [0,1]} |B(t)|$$

where B(t) is the Brownian bridge. The cumulative distribution function of K is given by[3]

$$\Pr(K \leq x) = 1 - 2\sum_{k=1}^{\infty} (-1)^{k-1} e^{-2k^{2}x^{2}} = \frac{\sqrt{2\pi}}{x} \sum_{k=1}^{\infty} e^{-(2k-1)^{2}\pi^{2}/(8x^{2})},$$

which can also be expressed by the Jacobi theta function $\vartheta_{01}(z=0;\tau=2ix^{2}/\pi)$. Both the form of the Kolmogorov–Smirnov test statistic and its asymptotic distribution under the null hypothesis were published by Andrey Kolmogorov,[4] while a table of the distribution was published by Nikolai Smirnov.[5] Recurrence relations for the distribution of the test statistic in finite samples are available.[4]
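The first series is alternating and converges quickly for the values of x used in practice, so the CDF can be evaluated by simple truncation. A sketch (the truncation at 100 terms is an arbitrary illustrative choice; SciPy exposes this limiting distribution as scipy.stats.kstwobign):

    import numpy as np

    def kolmogorov_cdf(x, terms=100):
        """Pr(K <= x) via the alternating series
        1 - 2 * sum_{k>=1} (-1)^(k-1) * exp(-2 k^2 x^2)."""
        if x <= 0:
            return 0.0
        k = np.arange(1, terms + 1)
        return 1.0 - 2.0 * np.sum((-1.0) ** (k - 1) * np.exp(-2.0 * k**2 * x**2))

    print(kolmogorov_cdf(1.358))  # ~0.95; 1.358 is the familiar 5% critical value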

Under the null hypothesis that the sample comes from the hypothesized distribution F(x),

$$\sqrt{n}\,D_n \xrightarrow{\;n\to\infty\;} \sup_t |B(F(t))|$$

in distribution, where B(t) is the Brownian bridge. If F is continuous, then under the null hypothesis $\sqrt{n}\,D_n$ converges to the Kolmogorov distribution, which does not depend on F. This result may also be known as the Kolmogorov theorem.

The accuracy of this limit as an approximation to the exact CDF of K when n is finite is not very impressive: even when n = 1000, the corresponding maximum error is about 0.9%; this error increases to 2.6% when n = 100 and to a totally unacceptable 7% when n = 10. However, the very simple expedient of replacing x by

$$x + \frac{1}{6\sqrt{n}} + \frac{x-1}{4n}$$

in the argument of the Jacobi theta function reduces these errors to 0.003%, 0.027%, and 0.27% respectively; such accuracy would usually be considered more than adequate for all practical applications.[6]
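In code, this correction amounts to a one-line shift of the argument before evaluating the limiting CDF. A minimal sketch (the function name corrected_cdf is illustrative; scipy.stats.kstwobign is SciPy's implementation of the limiting Kolmogorov distribution):

    import math
    from scipy.stats import kstwobign  # limiting Kolmogorov distribution

    def corrected_cdf(x, n):
        """Approximate Pr(sqrt(n) * D_n <= x) for finite n by evaluating the
        limiting Kolmogorov CDF at the shifted argument from the text."""
        return kstwobign.cdf(x + 1.0 / (6.0 * math.sqrt(n)) + (x - 1.0) / (4.0 * n))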

The goodness-of-fit test or the Kolmogorov–Smirnov test can be constructed by using the critical values of the Kolmogorov distribution. This test is asymptotically valid when $n \to \infty$. It rejects the null hypothesis at level $\alpha$ if

$$\sqrt{n}\,D_n > K_\alpha,$$

where $K_\alpha$ is found from

$$\Pr(K \leq K_\alpha) = 1 - \alpha.$$

The asymptotic power of this test is 1.
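The critical value K_α is obtained by inverting the Kolmogorov CDF. A sketch of the resulting decision rule, under the assumption that the asymptotic distribution is adequate for the sample size at hand (kolmogorov_critical_value is an illustrative name):

    from scipy.stats import kstwobign

    def kolmogorov_critical_value(alpha):
        """K_alpha solves Pr(K <= K_alpha) = 1 - alpha; ppf inverts the CDF."""
        return kstwobign.ppf(1.0 - alpha)

    # Reject H0 at level alpha when sqrt(n) * D_n exceeds K_alpha.
    print(kolmogorov_critical_value(0.05))  # ~1.358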

Fast and accurate algorithms to compute the CDF $\Pr(D_n \leq x)$ or its complement for arbitrary n and x are available from:

  • [7] and [8] for continuous null distributions, with code in C and Java to be found in [7].
  • [9] for purely discrete, mixed or continuous null distributions, implemented in the KSgeneral package[10] of the R project for statistical computing, which for a given sample also computes the KS test statistic and its p-value. An alternative C++ implementation is available from [9].

Test with estimated parameters


If either the form or the parameters of F(x) are determined from the data X_i, the critical values determined in this way are invalid. In such cases, Monte Carlo or other methods may be required, but tables have been prepared for some cases. Details of the required modifications to the test statistic and of the critical values for the normal distribution and the exponential distribution have been published,[11] and later publications also include the Gumbel distribution.[12] The Lilliefors test represents a special case of this for the normal distribution. A logarithm transformation may help in cases where the data do not seem to fit the assumption that they came from the normal distribution.

When parameters are estimated, the question arises which estimation method should be used. Usually this would be the maximum likelihood method, but, for example, for the normal distribution MLE has a large bias error in the estimate of sigma. Using a moment fit or KS minimization instead has a large impact on the critical values, and also some impact on test power. If we need to decide via the KS test whether Student's t data with df = 2 could be normal, then an ML estimate based on H0 (the data are normal, so using the standard deviation for scale) would give a much larger KS distance than a fit with minimum KS. In this case we should reject H0, which is often the outcome with MLE, because the sample standard deviation may be very large for t(2) data; with KS minimization, however, the resulting KS value may still be too low to reject H0. In the Student's t case, a modified KS test with a KS estimate instead of MLE indeed makes the KS test slightly worse. However, in other cases such a modified KS test leads to slightly better test power.[citation needed]
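One standard way around the invalid critical values is the Monte Carlo approach mentioned above: simulate the null distribution of D_n with the parameters re-estimated on every simulated sample. A Lilliefors-style sketch for the normal case (the function name and the simulation count are illustrative choices):

    import numpy as np
    from scipy.stats import kstest, norm

    def lilliefors_pvalue(sample, n_sim=5000, seed=0):
        """Monte Carlo p-value for a KS test of normality with the mean and
        standard deviation estimated from the data. Re-estimating the
        parameters on each simulated sample is what changes the null
        distribution relative to the standard Kolmogorov case."""
        rng = np.random.default_rng(seed)
        x = np.asarray(sample)

        def stat(y):
            return kstest(y, norm(loc=y.mean(), scale=y.std(ddof=1)).cdf).statistic

        d_obs = stat(x)
        d_sim = np.array([stat(rng.normal(size=len(x))) for _ in range(n_sim)])
        return np.mean(d_sim >= d_obs)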

Discrete and mixed null distribution


Under the assumption that F is non-decreasing and right-continuous, with a countable (possibly infinite) number of jumps, the KS test statistic can be expressed as

$$D_n = \sup_x |F_n(x) - F(x)| = \sup_{0 \leq t \leq 1} |F_n(F^{-1}(t)) - F(F^{-1}(t))|.$$

From the right-continuity of F, it follows that $F(F^{-1}(t)) \geq t$ and $F^{-1}(F(x)) \leq x$, and hence the distribution of D_n depends on the null distribution F, i.e., it is no longer distribution-free as in the continuous case. Therefore, a fast and accurate method has been developed to compute the exact and asymptotic distribution of D_n when F is purely discrete or mixed,[9] implemented in C++ and in the KSgeneral package[10] of the R language. The functions disc_ks_test(), mixed_ks_test() and cont_ks_test() also compute the KS test statistic and p-values for purely discrete, mixed or continuous null distributions and arbitrary sample sizes. The KS test and its p-values for discrete null distributions and small sample sizes are also computed in[13] as part of the dgof package of the R language. Major statistical packages, among which SAS PROC NPAR1WAY[14] and Stata ksmirnov,[15] implement the KS test under the assumption that F(x) is continuous, which is more conservative if the null distribution is actually not continuous (see[16][17][18]).
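Short of the exact methods above, a simple Monte Carlo approximation illustrates the discrete case: simulate the null distribution of D_n under the hypothesized discrete distribution. A sketch (names are illustrative; it assumes the sample takes values in the hypothesized, sorted support, so both F_n and F jump only at support points):

    import numpy as np

    def ks_discrete_mc(sample, values, probs, n_sim=10000, seed=0):
        """Monte Carlo KS test against a purely discrete null with sorted
        support `values` and probabilities `probs`. The supremum is
        attained at the support points, where both step functions jump."""
        rng = np.random.default_rng(seed)
        values = np.asarray(values, dtype=float)
        cdf = np.cumsum(probs)

        def d_stat(x):
            fn = np.searchsorted(np.sort(x), values, side="right") / len(x)
            return np.max(np.abs(fn - cdf))

        d_obs = d_stat(np.asarray(sample, dtype=float))
        d_sim = np.array([d_stat(rng.choice(values, size=len(sample), p=probs))
                          for _ in range(n_sim)])
        return d_obs, np.mean(d_sim >= d_obs)  # statistic, Monte Carlo p-value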

Two-sample Kolmogorov–Smirnov test

Illustration of the two-sample Kolmogorov–Smirnov statistic. Red and blue lines each correspond to an empirical distribution function, and the black arrow is the two-sample KS statistic.

The Kolmogorov–Smirnov test may also be used to test whether two underlying one-dimensional probability distributions differ. In this case, the Kolmogorov–Smirnov statistic is

$$D_{n,m} = \sup_x |F_{1,n}(x) - F_{2,m}(x)|,$$

where $F_{1,n}$ and $F_{2,m}$ are the empirical distribution functions of the first and the second sample respectively, and $\sup$ is the supremum function.

For large samples, the null hypothesis is rejected at level $\alpha$ if

$$D_{n,m} > c(\alpha)\sqrt{\frac{n+m}{n \cdot m}},$$

where n and m are the sizes of the first and second sample respectively. The value of $c(\alpha)$ is given in the table below for the most common levels of $\alpha$:

  α      0.20   0.15   0.10   0.05   0.025  0.01   0.005  0.001
  c(α)   1.073  1.138  1.224  1.358  1.48   1.628  1.731  1.949

and in general[19] by

$$c(\alpha) = \sqrt{-\ln\left(\tfrac{\alpha}{2}\right)\cdot\tfrac{1}{2}},$$

so that the condition reads

$$D_{n,m} > \sqrt{-\ln\left(\tfrac{\alpha}{2}\right)\cdot\frac{1+\tfrac{m}{n}}{2m}}.$$

Here, again, the larger the sample sizes, the tighter the minimal bound: for a given ratio of sample sizes (e.g. m = n), the minimal bound scales in the size of either of the samples according to its inverse square root.
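A minimal sketch of the two-sample statistic and the large-sample rejection rule above (the function names are illustrative; scipy.stats.ks_2samp provides a production implementation):

    import numpy as np

    def two_sample_ks(x, y):
        """D_{n,m} = sup_x |F_{1,n}(x) - F_{2,m}(x)|, evaluated at the
        pooled sample points, where the two ECDFs jump."""
        x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
        pooled = np.concatenate([x, y])
        f1 = np.searchsorted(x, pooled, side="right") / len(x)
        f2 = np.searchsorted(y, pooled, side="right") / len(y)
        return np.max(np.abs(f1 - f2))

    def reject_h0(x, y, alpha=0.05):
        """Large-sample rule: reject when D_{n,m} > c(alpha) sqrt((n+m)/(nm)),
        with c(alpha) = sqrt(-ln(alpha/2) / 2), so c(0.05) ~ 1.358."""
        n, m = len(x), len(y)
        c = np.sqrt(-np.log(alpha / 2.0) / 2.0)
        return two_sample_ks(x, y) > c * np.sqrt((n + m) / (n * m))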

Note that the two-sample test checks whether the two data samples come from the same distribution. This does not specify what that common distribution is (e.g. whether it is normal or not). Again, tables of critical values have been published. A shortcoming of the univariate Kolmogorov–Smirnov test is that it is not very powerful because it is devised to be sensitive against all possible types of differences between two distribution functions. Some argue[20][21] that the Cucconi test, originally proposed for simultaneously comparing location and scale, can be much more powerful than the Kolmogorov–Smirnov test when comparing two distribution functions.

Two-sample KS tests have been applied in economics to detect asymmetric effects and to study natural experiments.[22]

Setting confidence limits for the shape of a distribution function

Main article: Dvoretzky–Kiefer–Wolfowitz inequality

While the Kolmogorov–Smirnov test is usually used to test whether a given F(x) is the underlying probability distribution of F_n(x), the procedure may be inverted to give confidence limits on F(x) itself. If one chooses a critical value of the test statistic D_α such that Pr(D_n > D_α) = α, then a band of width ±D_α around F_n(x) will entirely contain F(x) with probability 1 − α.
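A sketch of the inverted procedure, using the asymptotic approximation D_α ≈ K_α/√n with K_α taken from SciPy's kstwobign (the limiting Kolmogorov distribution); the function name is illustrative:

    import numpy as np
    from scipy.stats import kstwobign

    def ks_confidence_band(sample, alpha=0.05):
        """Band F_n(x) +/- D_alpha that contains the entire true CDF F
        with probability about 1 - alpha (asymptotic approximation)."""
        x = np.sort(np.asarray(sample))
        n = len(x)
        fn = np.arange(1, n + 1) / n
        d_alpha = kstwobign.ppf(1.0 - alpha) / np.sqrt(n)  # K_alpha / sqrt(n)
        return x, np.clip(fn - d_alpha, 0, 1), np.clip(fn + d_alpha, 0, 1)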

The Kolmogorov–Smirnov statistic in more than one dimension


A distribution-free multivariate Kolmogorov–Smirnov goodness of fit test has been proposed by Justel, Peña and Zamar (1997).[23] The test uses a statistic which is built using Rosenblatt's transformation, and an algorithm is developed to compute it in the bivariate case. An approximate test that can be easily computed in any dimension is also presented.

The Kolmogorov–Smirnov test statistic needs to be modified if a similar test is to be applied to multivariate data. This is not straightforward because the maximum difference between two joint cumulative distribution functions is not generally the same as the maximum difference of any of the complementary distribution functions. Thus the maximum difference will differ depending on which of $\Pr(X < x \land Y < y)$ or $\Pr(X < x \land Y > y)$ or any of the other two possible arrangements is used. One might require that the result of the test used should not depend on which choice is made.

One approach to generalizing the Kolmogorov–Smirnov statistic to higher dimensions which meets the above concern is to compare the cdfs of the two samples with all possible orderings, and take the largest of the set of resulting KS statistics. In d dimensions, there are $2^d - 1$ such orderings. One such variation is due to Peacock[24] (see also Gosset[25] for a 3D version) and another to Fasano and Franceschini[26] (see Lopes et al. for a comparison and computational details).[27] Critical values for the test statistic can be obtained by simulations, but depend on the dependence structure in the joint distribution.
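To make the ordering idea concrete in two dimensions, here is a simplified sketch in the spirit of the Fasano–Franceschini two-sample variant: at each data point, the fractions of the two samples falling in each of the four quadrants are compared, and the largest discrepancy is kept. It is illustrative only, with inputs assumed to be (n, 2) arrays; critical values would still have to come from simulation, as noted above.

    import numpy as np

    def ks2d_statistic(a, b):
        """Simplified 2-D two-sample KS statistic: maximum over all data
        points and all four quadrants of the difference between the two
        samples' empirical quadrant probabilities."""
        def quadrant_fractions(pts, x0, y0):
            x, y = pts[:, 0], pts[:, 1]
            return np.array([np.mean((x <= x0) & (y <= y0)),
                             np.mean((x <= x0) & (y >  y0)),
                             np.mean((x >  x0) & (y <= y0)),
                             np.mean((x >  x0) & (y >  y0))])

        a, b = np.asarray(a), np.asarray(b)
        d = 0.0
        for x0, y0 in np.vstack([a, b]):
            d = max(d, np.max(np.abs(quadrant_fractions(a, x0, y0)
                                     - quadrant_fractions(b, x0, y0))))
        return d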

Implementations


The Kolmogorov–Smirnov test is implemented in many software programs. Most of these implement both the one-sample and the two-sample test.

  • Mathematica has KolmogorovSmirnovTest.
  • MATLAB's Statistics Toolbox has kstest and kstest2 for one-sample and two-sample Kolmogorov–Smirnov tests, respectively.
  • The R package "KSgeneral"[10] computes the KS test statistic and its p-value under an arbitrary, possibly discrete, mixed or continuous null distribution.
  • R's base "stats" package implements the test as ks.test {stats}.
  • SAS implements the test in its PROC NPAR1WAY procedure.
  • In Python, the SciPy package implements the test in the scipy.stats.kstest function[28] (see the usage sketch after this list).
  • SYSTAT (SPSS Inc., Chicago, IL)
  • Java has an implementation of this test provided by Apache Commons.[29]
  • KNIME has a node implementing this test based on the above Java implementation.[30]
  • Julia has the package HypothesisTests.jl with the function ExactOneSampleKSTest(x::AbstractVector{<:Real}, d::UnivariateDistribution).[31]
  • StatsDirect (StatsDirect Ltd, Manchester, UK) implements all common variants.
  • Stata (Stata Corporation, College Station, TX) implements the test in the ksmirnov (Kolmogorov–Smirnov equality-of-distributions test) command.[32]
  • PSPP implements the test in its KOLMOGOROV-SMIRNOV command (or using the KS shortcut function).
  • The Real Statistics Resource Pack for Excel runs the test as KSCRIT and KSPROB.[33]
  • ClickHouse implements the test in its kolmogorovSmirnovTest function.
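For example, the SciPy functions mentioned above can be used as follows (the data are illustrative; kstest tests against a fully specified reference distribution, ks_2samp compares two samples):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.normal(size=300)            # sample from N(0, 1)
    y = rng.standard_t(df=2, size=300)  # heavier-tailed sample

    # One-sample test against the standard normal reference distribution
    print(stats.kstest(x, "norm"))

    # Two-sample test: are x and y drawn from the same distribution?
    print(stats.ks_2samp(x, y))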

References

  1. ^"7.2.1.2. Kolmogorov- Smirnov test". Retrieved8 October 2025.
  2. ^Stephens, M. A. (1974). "EDF Statistics for Goodness of Fit and Some Comparisons".Journal of the American Statistical Association.69 (347):730–737.doi:10.2307/2286009.JSTOR 2286009.
  3. ^Marsaglia G, Tsang WW, Wang J (2003)."Evaluating Kolmogorov's Distribution".Journal of Statistical Software.8 (18):1–4.doi:10.18637/jss.v008.i18.
  4. ^abKolmogorov A (1933). "Sulla determinazione empirica di una legge di distribuzione".G. Ist. Ital. Attuari.4:83–91.
  5. ^Smirnov N (1948)."Table for estimating the goodness of fit of empirical distributions".Annals of Mathematical Statistics.19 (2):279–281.doi:10.1214/aoms/1177730256.
  6. ^Vrbik, Jan (2018). "Small-Sample Corrections to Kolmogorov–Smirnov Test Statistic".Pioneer Journal of Theoretical and Applied Statistics.15 (1–2):15–23.
  7. ^abSimard R, L'Ecuyer P (2011)."Computing the Two-Sided Kolmogorov–Smirnov Distribution".Journal of Statistical Software.39 (11):1–18.doi:10.18637/jss.v039.i11.
  8. ^Moscovich A, Nadler B (2017). "Fast calculation of boundary crossing probabilities for Poisson processes".Statistics and Probability Letters.123:177–182.arXiv:1503.04363.doi:10.1016/j.spl.2016.11.027.S2CID 12868694.
  9. ^abcDimitrova DS, Kaishev VK, Tan S (2020)."Computing the Kolmogorov–Smirnov Distribution when the Underlying cdf is Purely Discrete, Mixed or Continuous".Journal of Statistical Software.95 (10):1–42.doi:10.18637/jss.v095.i10.
  10. ^abcDimitrova, Dimitrina; Yun, Jia; Kaishev, Vladimir; Tan, Senren (21 May 2024)."KSgeneral: KSgeneral: Computing P-Values of the One-Sample K-S Test and the Two-Sample K-S and Kuiper Tests for (Dis)Continuous Null Distribution".CRAN.R-project.org/package=KSgeneral.
  11. ^Pearson, E. S.; Hartley, H. O., eds. (1972).Biometrika Tables for Statisticians. Vol. 2. Cambridge University Press. pp. 117–123, Tables 54, 55.ISBN 978-0-521-06937-3.
  12. ^Shorack, Galen R.; Wellner, Jon A. (1986).Empirical Processes with Applications to Statistics. Wiley. p. 239.ISBN 978-0-471-86725-8.
  13. ^Arnold, Taylor B.; Emerson, John W. (2011)."Nonparametric Goodness-of-Fit Tests for Discrete Null Distributions"(PDF).The R Journal.3 (2): 34\[Dash]39.doi:10.32614/rj-2011-016.
  14. ^"SAS/STAT(R) 14.1 User's Guide".support.sas.com. Retrieved14 April 2018.
  15. ^"ksmirnov — Kolmogorov–Smirnov equality-of-distributions test"(PDF).stata.com. Retrieved14 April 2018.
  16. ^Noether GE (1963). "Note on the Kolmogorov Statistic in the Discrete Case".Metrika.7 (1):115–116.doi:10.1007/bf02613966.S2CID 120687545.
  17. ^Slakter MJ (1965). "A Comparison of the Pearson Chi-Square and Kolmogorov Goodness-of-Fit Tests with Respect to Validity".Journal of the American Statistical Association.60 (311):854–858.doi:10.2307/2283251.JSTOR 2283251.
  18. ^Walsh JE (1963). "Bounded Probability Properties of Kolmogorov–Smirnov and Similar Statistics for Discrete Data".Annals of the Institute of Statistical Mathematics.15 (1):153–158.doi:10.1007/bf02865912.S2CID 122547015.
  19. ^Eq. (15) in Section 3.3.1 of Knuth, D.E., The Art of Computer Programming, Volume 2 (Seminumerical Algorithms), 3rd Edition, Addison Wesley, Reading Mass, 1998.
  20. ^Marozzi, Marco (2009). "Some Notes on the Location-Scale Cucconi Test".Journal of Nonparametric Statistics.21 (5):629–647.doi:10.1080/10485250902952435.S2CID 120038970.
  21. ^Marozzi, Marco (2013). "Nonparametric Simultaneous Tests for Location and Scale Testing: a Comparison of Several Methods".Communications in Statistics – Simulation and Computation.42 (6):1298–1317.doi:10.1080/03610918.2012.665546.S2CID 28146102.
  22. ^Monge, Marco (2023)."Two-Sample Kolmogorov-Smirnov Tests as Causality Tests. A narrative of Latin American inflation from 2020 to 2022".Revista Chilena de Economía y Sociedad.17 (1):68–78.
  23. ^Justel, A.; Peña, D.; Zamar, R. (1997). "A multivariate Kolmogorov–Smirnov test of goodness of fit".Statistics & Probability Letters.35 (3):251–259.CiteSeerX 10.1.1.498.7631.doi:10.1016/S0167-7152(97)00020-5.
  24. ^Peacock J.A. (1983)."Two-dimensional goodness-of-fit testing in astronomy".Monthly Notices of the Royal Astronomical Society.202 (3):615–627.Bibcode:1983MNRAS.202..615P.doi:10.1093/mnras/202.3.615.
  25. ^Gosset E. (1987). "A three-dimensional extended Kolmogorov–Smirnov test as a useful tool in astronomy}".Astronomy and Astrophysics.188 (1):258–264.Bibcode:1987A&A...188..258G.
  26. ^Fasano, G.; Franceschini, A. (1987)."A multidimensional version of the Kolmogorov–Smirnov test".Monthly Notices of the Royal Astronomical Society.225:155–170.Bibcode:1987MNRAS.225..155F.doi:10.1093/mnras/225.1.155.ISSN 0035-8711.
  27. ^Lopes, R.H.C.; Reid, I.; Hobson, P.R. (23–27 April 2007).The two-dimensional Kolmogorov–Smirnov test(PDF). XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research. Amsterdam, the Netherlands.
  28. ^"scipy.stats.kstest".SciPy v1.7.1 Manual. The Scipy community. Retrieved26 October 2021.
  29. ^"KolmogorovSmirnovTest". Retrieved18 June 2019.
  30. ^"New statistics nodes". Retrieved25 June 2020.
  31. ^"Nonparametric tests · HypothesisTests.jl".
  32. ^"ksmirnov — Kolmogorov –Smirnov equality-of-distributions test"(PDF). Retrieved18 June 2019.
  33. ^"Kolmogorov–Smirnov Test for Normality Hypothesis Testing". Retrieved18 June 2019.
