In probability theory, Cantelli's inequality (also called the Chebyshev–Cantelli inequality and the one-sided Chebyshev inequality) is an improved version of Chebyshev's inequality for one-sided tail bounds.[1][2][3] The inequality states that, for $\lambda > 0$,

$$\Pr(X - \mu \geq \lambda) \leq \frac{\sigma^2}{\sigma^2 + \lambda^2},$$

where $X$ is a real-valued random variable, $\Pr$ is the probability measure, $\mu$ is the expected value of $X$, and $\sigma^2$ is the variance of $X$.
Applying the Cantelli inequality to $-X$ gives a bound on the lower tail,

$$\Pr(X - \mu \leq -\lambda) \leq \frac{\sigma^2}{\sigma^2 + \lambda^2}.$$
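A quick numerical check can make the bound concrete. The following Python sketch (not from the cited sources; the exponential distribution, sample size, and thresholds are arbitrary illustrative choices) estimates the upper-tail probability by Monte Carlo and compares it with Cantelli's bound:

```python
import numpy as np

# Monte Carlo check of Cantelli's inequality
#   Pr(X - mu >= lambda) <= sigma^2 / (sigma^2 + lambda^2)
# for an exponential(1) variable, which has mu = 1 and sigma^2 = 1.
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=1_000_000)

mu, sigma2 = 1.0, 1.0
for lam in (0.5, 1.0, 2.0, 4.0):
    empirical = np.mean(samples - mu >= lam)    # estimated tail probability
    bound = sigma2 / (sigma2 + lam**2)          # Cantelli's one-sided bound
    print(f"lambda={lam}: empirical={empirical:.4f}, Cantelli bound={bound:.4f}")
```

In every case the empirical tail probability stays below the bound, as the inequality guarantees.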
While the inequality is often attributed to Francesco Paolo Cantelli, who published it in 1928,[4] it originates in Chebyshev's work of 1874.[5] When bounding the event that a random variable deviates from its mean in only one direction (positive or negative), Cantelli's inequality gives an improvement over Chebyshev's inequality. The Chebyshev inequality has "higher moments versions" and "vector versions", and so does the Cantelli inequality.
For one-sided tail bounds, Cantelli's inequality is better, since Chebyshev's inequality can only get

$$\Pr(X - \mu \geq \lambda) \leq \Pr(|X - \mu| \geq \lambda) \leq \frac{\sigma^2}{\lambda^2}.$$
On the other hand, for two-sided tail bounds, Cantelli's inequality gives

$$\Pr(|X - \mu| \geq \lambda) = \Pr(X - \mu \geq \lambda) + \Pr(X - \mu \leq -\lambda) \leq \frac{2\sigma^2}{\sigma^2 + \lambda^2},$$

which is always worse than Chebyshev's inequality (when $\lambda \geq \sigma$, since $\frac{2\sigma^2}{\sigma^2 + \lambda^2} \geq \frac{\sigma^2}{\lambda^2}$ exactly when $\lambda \geq \sigma$; otherwise, both inequalities bound a probability by a value greater than one, and so are trivial).
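For a concrete comparison (an illustrative sketch, assuming a unit-variance variable; the thresholds are arbitrary choices), the following Python snippet tabulates both two-sided bounds:

```python
# Compare the doubled Cantelli bound 2*sigma^2/(sigma^2 + lambda^2) with
# Chebyshev's two-sided bound sigma^2/lambda^2, here with sigma^2 = 1.
sigma2 = 1.0
for lam in (1.0, 1.5, 2.0, 3.0):
    cantelli2 = 2 * sigma2 / (sigma2 + lam**2)
    chebyshev = sigma2 / lam**2
    print(f"lambda={lam}: doubled Cantelli={cantelli2:.4f}, Chebyshev={chebyshev:.4f}")
# At lambda = sigma the two bounds coincide (both equal 1); for every
# lambda > sigma, Chebyshev's bound is the smaller (tighter) of the two.
```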
^ Boucheron, Stéphane (2013). Concentration Inequalities: A Nonasymptotic Theory of Independence. Gábor Lugosi, Pascal Massart. Oxford: Oxford University Press. ISBN 978-0-19-953525-5. OCLC 829910957.
^ He, S.; Zhang, J.; Zhang, S. (2010). "Bounding probability of small deviation: A fourth moment approach". Mathematics of Operations Research. 35 (1): 208–232. doi:10.1287/moor.1090.0438. S2CID 11298475.