The green curve, which asymptotically approaches heights of 0 and 1 without reaching them, is the true cumulative distribution function of the standard normal distribution. The grey hash marks represent the observations in a particular sample drawn from that distribution, and the horizontal steps of the blue step function (including the leftmost point in each step but not including the rightmost point) form the empirical distribution function of that sample.
The empirical distribution function is an estimate of the cumulative distribution function that generated the points in the sample. It converges with probability 1 to that underlying distribution, according to the Glivenko–Cantelli theorem. A number of results exist to quantify the rate of convergence of the empirical distribution function to the underlying cumulative distribution function.
By the strong law of large numbers, the estimate $\hat F_n(t)$ converges to the true $F(t)$ as $n \to \infty$ almost surely, for every value of $t$:

$$\hat F_n(t)\ \xrightarrow{\text{a.s.}}\ F(t);$$

thus the estimator is consistent. This expression asserts the pointwise convergence of the empirical distribution function to the true cumulative distribution function. There is a stronger result, called the Glivenko–Cantelli theorem, which states that the convergence in fact happens uniformly over $t$:[5]

$$\|\hat F_n - F\|_\infty \equiv \sup_{t \in \mathbb{R}} \bigl|\hat F_n(t) - F(t)\bigr|\ \xrightarrow{\text{a.s.}}\ 0.$$
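The following is a minimal Monte Carlo sketch (not from the article; the sample sizes and seed are arbitrary choices of mine) illustrating the Glivenko–Cantelli theorem: for standard normal samples of growing size, the sup-distance between the empirical CDF and the true CDF shrinks toward zero.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

for n in (10, 100, 1000, 10000):
    x = np.sort(rng.standard_normal(n))
    # The empirical CDF jumps by 1/n at each observation, so the supremum of
    # |F_n(t) - F(t)| is attained at the jump points: compare F with the ECDF
    # value at each observation and with its value just before the jump.
    f = norm.cdf(x)
    upper = np.arange(1, n + 1) / n      # ECDF value at each observation
    lower = np.arange(0, n) / n          # ECDF value just before it
    sup_dist = max(np.max(np.abs(upper - f)), np.max(np.abs(lower - f)))
    print(f"n = {n:6d}   sup_t |F_n(t) - F(t)| = {sup_dist:.4f}")
```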
The sup-norm in this expression is called the Kolmogorov–Smirnov statistic for testing the goodness-of-fit between the empirical distribution and the assumed true cumulative distribution function F. Other norm functions may be reasonably used here instead of the sup-norm. For example, the L2-norm gives rise to the Cramér–von Mises statistic.
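As a hedged illustration (my own, not part of the article), both statistics are available in SciPy; the snippet below computes the Kolmogorov–Smirnov statistic (sup-norm) and the Cramér–von Mises statistic (an L2-type distance) for a sample tested against the standard normal CDF. The Cramér–von Mises routine requires a reasonably recent SciPy release.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.standard_normal(500)            # sample from the hypothesized F

ks = stats.kstest(x, "norm")            # sup_t |F_n(t) - F(t)| and its p-value
cvm = stats.cramervonmises(x, "norm")   # Cramer-von Mises criterion

print("Kolmogorov-Smirnov:", ks.statistic, ks.pvalue)
print("Cramer-von Mises:  ", cvm.statistic, cvm.pvalue)
```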
The asymptotic distribution can be further characterized in several different ways. First, the central limit theorem states that pointwise, $\hat F_n(t)$ has an asymptotically normal distribution with the standard $\sqrt{n}$ rate of convergence:[2]

$$\sqrt{n}\,\bigl(\hat F_n(t) - F(t)\bigr)\ \xrightarrow{d}\ \mathcal{N}\bigl(0,\ F(t)\,(1 - F(t))\bigr).$$
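A simulation sketch of this pointwise normality (the fixed point t, sample size and replication count are assumptions of mine): at a fixed t, the scaled deviation $\sqrt{n}(\hat F_n(t) - F(t))$ should have variance close to $F(t)(1 - F(t))$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
t, n, reps = 0.5, 2000, 5000
Ft = norm.cdf(t)

# For each replication, F_n(t) is just the fraction of observations <= t.
Fn_t = (rng.standard_normal((reps, n)) <= t).mean(axis=1)
z = np.sqrt(n) * (Fn_t - Ft)

print("empirical variance of sqrt(n)*(F_n(t) - F(t)):", z.var())
print("theoretical variance F(t)*(1 - F(t))         :", Ft * (1 - Ft))
```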
The uniform rate of convergence in Donsker's theorem can be quantified by the result known as the Hungarian embedding:[6]
Alternatively, the rate of convergence of $\sqrt{n}\,(\hat F_n - F)$ can also be quantified in terms of the asymptotic behavior of the sup-norm of this expression. A number of results exist in this vein; for example, the Dvoretzky–Kiefer–Wolfowitz inequality provides a bound on the tail probabilities of $\sqrt{n}\,\|\hat F_n - F\|_\infty$:[6]

$$\Pr\Bigl(\sqrt{n}\,\|\hat F_n - F\|_\infty > z\Bigr)\ \le\ 2 e^{-2 z^2}.$$
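A small numerical check of this bound (a sketch of mine, using uniform data so that F(t) = t; the sample size, threshold and replication count are arbitrary): the simulated tail probability should sit below the DKW bound 2·exp(−2z²).

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps, z = 200, 20000, 1.0

exceed = 0
for _ in range(reps):
    u = np.sort(rng.uniform(size=n))
    # Supremum of |F_n(t) - t| is attained at the jump points of the ECDF.
    upper = np.arange(1, n + 1) / n
    lower = np.arange(0, n) / n
    sup = max(np.max(np.abs(upper - u)), np.max(np.abs(lower - u)))
    if np.sqrt(n) * sup > z:
        exceed += 1

print("simulated tail probability:", exceed / reps)
print("DKW bound 2*exp(-2*z^2)   :", 2 * np.exp(-2 * z**2))
```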
In fact, Kolmogorov has shown that if the cumulative distribution function F is continuous, then the expression $\sqrt{n}\,\|\hat F_n - F\|_\infty$ converges in distribution to $\|B\|_\infty$, where $B$ is the standard Brownian bridge; this limit has the Kolmogorov distribution, which does not depend on the form of F.
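For reference, SciPy exposes this limiting Kolmogorov distribution as `kstwobign`; the short sketch below (mine, with an arbitrary significance level) reads off a critical value and a tail probability from it.

```python
from scipy.stats import kstwobign

alpha = 0.05
critical = kstwobign.ppf(1 - alpha)      # value exceeded with probability alpha
print("95% critical value of sqrt(n)*sup|F_n - F|:", critical)
print("P(K > 1.0)                                :", kstwobign.sf(1.0))
```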
Figures: empirical CDF, true CDF and confidence interval plots for various sample sizes of the normal, Cauchy and triangle distributions.
Using the above bounds, we can plot the empirical CDF, the true CDF and the corresponding confidence intervals for different distributions with any of the statistical implementations listed below (a sketch follows the list).
A non-exhaustive list of software implementations of the empirical distribution function includes:
In R, the ecdf function computes an empirical cumulative distribution function, with several methods for plotting, printing and computing with such an "ecdf" object.
In MATLAB, we can use the empirical cumulative distribution function (cdf) plot.
In JMP from SAS, the CDF plot creates a plot of the empirical cumulative distribution function.
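As a further illustration in Python (an addition of mine; the article itself only lists R, MATLAB and JMP), the following sketch plots the empirical CDF of a normal sample together with the true CDF and a (1 − α) confidence band derived from the DKW inequality, with half-width ε = sqrt(ln(2/α) / (2n)).

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

rng = np.random.default_rng(4)
n, alpha = 100, 0.05
x = np.sort(rng.standard_normal(n))
ecdf = np.arange(1, n + 1) / n
eps = np.sqrt(np.log(2 / alpha) / (2 * n))   # DKW band half-width

grid = np.linspace(-4, 4, 400)
plt.step(x, ecdf, where="post", label="empirical CDF")
plt.plot(grid, norm.cdf(grid), label="true CDF")
plt.step(x, np.clip(ecdf + eps, 0, 1), where="post", ls="--", label="DKW band")
plt.step(x, np.clip(ecdf - eps, 0, 1), where="post", ls="--")
plt.legend()
plt.show()
```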