Law of total probability

From Wikipedia, the free encyclopedia
Concept in probability theory

In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities. It expresses the total probability of an outcome which can be realized via several distinct events, hence the name.

Statement


The law of total probability is a theorem[1] that states, in its discrete case: if $\{B_n : n = 1, 2, 3, \ldots\}$ is a finite or countably infinite set of mutually exclusive and collectively exhaustive events, then for any event $A$

$$P(A) = \sum_n P(A \cap B_n)$$

or, alternatively,[1]

$$P(A) = \sum_n P(A \mid B_n)\, P(B_n),$$

where, for any $n$, if $P(B_n) = 0$, then these terms are simply omitted from the summation, since $P(A \mid B_n)$ is finite.

The summation can be interpreted as a weighted average, and consequently the marginal probability, $P(A)$, is sometimes called "average probability";[2] "overall probability" is sometimes used in less formal writings.[3]
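As a quick numerical illustration of the discrete law, the following Python sketch uses a hypothetical three-event partition (the probabilities are arbitrary, chosen only for the example) and computes the marginal probability as the weighted average of the conditionals:

```python
# Hypothetical partition B1, B2, B3: mutually exclusive and
# collectively exhaustive, so the P(B_n) sum to 1.
p_B = [0.5, 0.3, 0.2]          # P(B_n)
p_A_given_B = [0.9, 0.4, 0.1]  # P(A | B_n)

# Law of total probability: P(A) = sum_n P(A | B_n) * P(B_n)
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # 0.5*0.9 + 0.3*0.4 + 0.2*0.1 = 0.59
```

Note that $p_A$ necessarily lies between the smallest and largest conditional probability, as any weighted average must.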

The law of total probability can also be stated for conditional probabilities:

$$
\begin{aligned}
P(A \mid C) &= \frac{P(A, C)}{P(C)} = \frac{\sum_n P(A, B_n, C)}{P(C)} \\
&= \frac{\sum_n P(A \mid B_n, C)\, P(B_n \mid C)\, P(C)}{P(C)} \\
&= \sum_n P(A \mid B_n, C)\, P(B_n \mid C)
\end{aligned}
$$

Taking the $B_n$ as above, and assuming $C$ is an event independent of any of the $B_n$:

$$P(A \mid C) = \sum_n P(A \mid C, B_n)\, P(B_n)$$
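The conditional form $P(A \mid C) = \sum_n P(A \mid B_n, C)\, P(B_n \mid C)$ can be verified by direct enumeration on a small finite sample space. The sketch below uses a fair die with arbitrarily chosen events (all names and sets here are illustrative assumptions, not from the source):

```python
# Toy sample space: a fair six-sided die with the uniform measure.
omega = {1, 2, 3, 4, 5, 6}
P = lambda E: len(E & omega) / 6  # probability of an event (a subset)

A = {2, 3, 5}                 # arbitrary event of interest
C = {1, 2, 3, 4}              # conditioning event
B = [{1, 2}, {3, 4}, {5, 6}]  # partition of omega

# Left side: P(A | C) computed directly.
lhs = P(A & C) / P(C)

# Right side: sum over the partition, skipping terms with
# P(B_n, C) = 0 as in the discrete statement above.
rhs = sum((P(A & Bn & C) / P(Bn & C)) * (P(Bn & C) / P(C))
          for Bn in B if P(Bn & C) > 0)
print(lhs, rhs)  # the two values agree
```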

Continuous case


The law of total probability extends to the case of conditioning on events generated by continuous random variables. Let $(\Omega, \mathcal{F}, P)$ be a probability space. Suppose $X$ is a random variable with distribution function $F_X$, and $A$ an event on $(\Omega, \mathcal{F}, P)$. Then the law of total probability states

$$P(A) = \int_{-\infty}^{\infty} P(A \mid X = x) \, dF_X(x).$$

If $X$ admits a density function $f_X$, then the result is

$$P(A) = \int_{-\infty}^{\infty} P(A \mid X = x)\, f_X(x)\, dx.$$

Moreover, for the specific case where $A = \{Y \in B\}$ for a Borel set $B$, this yields

$$P(Y \in B) = \int_{-\infty}^{\infty} P(Y \in B \mid X = x)\, f_X(x)\, dx.$$
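A minimal numerical sketch of this integral, under the illustrative assumption that $X \sim N(0,1)$ and $Y \mid X = x \sim N(x, 1)$: then $Y \sim N(0, 2)$, so $P(Y \le 0) = 1/2$ exactly, and a Riemann sum over $P(Y \le 0 \mid X = x)\, f_X(x)$ should recover that value:

```python
from math import erf, exp, pi, sqrt

def phi(x):
    """Standard normal density f_X(x)."""
    return exp(-x * x / 2) / sqrt(2 * pi)

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Since Y | X = x ~ N(x, 1), we have P(Y <= 0 | X = x) = Phi(-x).
# Law of total probability: P(Y <= 0) = integral of Phi(-x) * phi(x) dx,
# approximated by a left Riemann sum over [-8, 8].
dx = 0.001
xs = [-8 + i * dx for i in range(16000)]
total = sum(Phi(-x) * phi(x) * dx for x in xs)
print(round(total, 4))  # approximately 0.5
```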

Example


Suppose that two factories supply light bulbs to the market. Factory $X$'s bulbs work for over 5000 hours in 99% of cases, whereas factory $Y$'s bulbs work for over 5000 hours in 95% of cases. It is known that factory $X$ supplies 60% of the total bulbs available and $Y$ supplies 40% of the total bulbs available. What is the chance that a purchased bulb will work for longer than 5000 hours?

Applying the law of total probability, we have:

$$
\begin{aligned}
P(A) &= P(A \mid B_X) \cdot P(B_X) + P(A \mid B_Y) \cdot P(B_Y) \\
&= \frac{99}{100} \cdot \frac{6}{10} + \frac{95}{100} \cdot \frac{4}{10} = \frac{594 + 380}{1000} = \frac{974}{1000}
\end{aligned}
$$

where $B_X$ is the event that a bulb was made by factory $X$, $B_Y$ is the event that a bulb was made by factory $Y$, and $A$ is the event that a purchased bulb works for over 5000 hours.

Thus each purchased light bulb has a 97.4% chance of working for more than 5000 hours.
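The same computation as a short Python sketch, with the two factories' figures taken directly from the example:

```python
# Market shares: P(B_X) and P(B_Y), the probability a bulb
# came from factory X or factory Y.
p_BX, p_BY = 0.6, 0.4

# Reliability: P(A | B_X) and P(A | B_Y), the probability a bulb
# lasts over 5000 hours given its factory of origin.
p_A_given_BX, p_A_given_BY = 0.99, 0.95

# Law of total probability over the two-event partition {B_X, B_Y}.
p_A = p_A_given_BX * p_BX + p_A_given_BY * p_BY
print(round(p_A, 3))  # 0.974
```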

Other names


The term law of total probability is sometimes taken to mean the law of alternatives, which is a special case of the law of total probability applying to discrete random variables.[citation needed] One author uses the terminology of the "Rule of Average Conditional Probabilities",[4] while another refers to it as the "continuous law of alternatives" in the continuous case.[5] This result is given by Grimmett and Welsh[6] as the partition theorem, a name that they also give to the related law of total expectation.


Notes

  1. ^a b Zwillinger, D.; Kokoska, S. (2000). CRC Standard Probability and Statistics Tables and Formulae. CRC Press. p. 31. ISBN 1-58488-059-7.
  2. ^ Pfeiffer, Paul E. (1978). Concepts of Probability Theory. Courier Dover Publications. pp. 47–48. ISBN 978-0-486-63677-1.
  3. ^ Rumsey, Deborah (2006). Probability for Dummies. For Dummies. p. 58. ISBN 978-0-471-75141-0.
  4. ^ Pitman, Jim (1993). Probability. Springer. p. 41. ISBN 0-387-97974-3.
  5. ^ Baclawski, Kenneth (2008). Introduction to Probability with R. CRC Press. p. 179. ISBN 978-1-4200-6521-3.
  6. ^ Grimmett, Geoffrey; Welsh, Dominic (1986). Probability: An Introduction. Oxford Science Publications. Theorem 1B.
