Probability mass function

From Wikipedia, the free encyclopedia
Discrete-variable probability distribution
Figure: The graph of a probability mass function. All the values of this function must be non-negative and sum up to 1.

In probability and statistics, a probability mass function (sometimes called probability function or frequency function[1]) is a function that gives the probability that a discrete random variable is exactly equal to some value.[2] Sometimes it is also known as the discrete probability density function. The probability mass function is often the primary means of defining a discrete probability distribution, and such functions exist for either scalar or multivariate random variables whose domain is discrete.

A probability mass function differs from a continuous probability density function (PDF) in that the latter is associated with continuous rather than discrete random variables. A continuous PDF must be integrated over an interval to yield a probability.[3]

The value of the random variable having the largest probability mass is called the mode.

Formal definition


A probability mass function is the probability distribution of a discrete random variable, and provides the possible values and their associated probabilities. It is the function $p : \mathbb{R} \to [0,1]$ defined by

$p_X(x) = P(X = x)$

for $-\infty < x < \infty$,[3] where $P$ is a probability measure. $p_X(x)$ can also be simplified as $p(x)$.[4]

The probabilities associated with all (hypothetical) values must be non-negative and sum up to 1,

$\sum_{x} p_X(x) = 1$ and $p_X(x) \geq 0.$

Thinking of probability as mass helps to avoid mistakes since the physical mass is conserved, as is the total probability for all hypothetical outcomes $x$.
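As a concrete illustration of these two properties, here is a minimal Python sketch (not drawn from the cited sources) that defines the probability mass function of a fair six-sided die and checks non-negativity and summation to 1:

```python
from fractions import Fraction

# Probability mass function of a fair six-sided die:
# p_X(x) = 1/6 for x in {1, ..., 6}, and 0 otherwise.
def pmf_fair_die(x):
    return Fraction(1, 6) if x in range(1, 7) else Fraction(0)

support = range(1, 7)
assert all(pmf_fair_die(x) >= 0 for x in support)   # non-negativity
assert sum(pmf_fair_die(x) for x in support) == 1   # total mass is 1

print(pmf_fair_die(3))   # 1/6
print(pmf_fair_die(7))   # 0, a value outside the image of X
```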

Measure theoretic formulation


A probability mass function of a discrete random variable $X$ can be seen as a special case of two more general measure-theoretic constructions: the distribution of $X$ and the probability density function of $X$ with respect to the counting measure. We make this more precise below.

Suppose that $(A, \mathcal{A}, P)$ is a probability space and that $(B, \mathcal{B})$ is a measurable space whose underlying σ-algebra is discrete, so in particular it contains the singleton sets of $B$. In this setting, a random variable $X \colon A \to B$ is discrete provided its image is countable. The pushforward measure $X_{*}(P)$, called the distribution of $X$ in this context, is a probability measure on $B$ whose restriction to singleton sets induces the probability mass function (as mentioned in the previous section) $f_X \colon B \to \mathbb{R}$, since $f_X(b) = P(X^{-1}(b)) = P(X = b)$ for each $b \in B$.

Now suppose that $(B, \mathcal{B}, \mu)$ is a measure space equipped with the counting measure $\mu$. The probability density function $f$ of $X$ with respect to the counting measure, if it exists, is the Radon–Nikodym derivative of the pushforward measure of $X$ (with respect to the counting measure), so $f = dX_{*}P/d\mu$, and $f$ is a function from $B$ to the non-negative reals. As a consequence, for any $b \in B$ we have

$P(X = b) = P(X^{-1}(b)) = X_{*}(P)(b) = \int_{b} f \, d\mu = f(b),$

demonstrating that $f$ is in fact a probability mass function.
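As a minimal worked instance of this construction (a fair coin flip, used here purely for illustration), take $B = \{0, 1\}$ with the counting measure $\mu$ and let $X$ be a fair coin flip, so $X_{*}(P)(\{b\}) = \tfrac{1}{2}$ for each $b \in B$. The Radon–Nikodym derivative is then the constant function $f \equiv \tfrac{1}{2}$, and indeed

$X_{*}(P)(E) = \int_{E} f \, d\mu = \sum_{b \in E} f(b) = \frac{|E|}{2} \qquad \text{for every } E \subseteq B,$

which recovers the familiar probability mass function of a fair coin.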

When there is a natural order among the potential outcomes $x$, it may be convenient to assign numerical values to them (or n-tuples in the case of a discrete multivariate random variable) and also to consider values not in the image of $X$. That is, $f_X$ may be defined for all real numbers and $f_X(x) = 0$ for all $x \notin X(S)$, as shown in the figure.

The image of $X$ has a countable subset on which the probability mass function $f_X(x)$ sums to one. Consequently, the probability mass function is zero for all but a countable number of values of $x$.

The discontinuity of probability mass functions is related to the fact that the cumulative distribution function of a discrete random variable is also discontinuous. If $X$ is a discrete random variable, then $P(X = x) = 1$ means that the event $(X = x)$ is certain (it occurs in 100% of cases); conversely, $P(X = x) = 0$ means that the event $(X = x)$ is impossible. This statement does not hold for a continuous random variable $X$, for which $P(X = x) = 0$ for every possible $x$. Discretization is the process of converting a continuous random variable into a discrete one.
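To make the connection with the cumulative distribution function concrete, the sketch below (an illustrative Python snippet, reusing the fair-die mass function as an assumed example) accumulates the probability mass into a CDF; for a discrete random variable the CDF is a step function that jumps by $p_X(x)$ at every point of positive mass:

```python
# PMF of a fair six-sided die (illustrative; any discrete PMF would do).
pmf = {x: 1 / 6 for x in range(1, 7)}

# The CDF F(x) = P(X <= x) is a step function: it jumps by pmf[k]
# at each point k of the support and is flat in between.
def cdf(x):
    return sum(p for value, p in pmf.items() if value <= x)

for x in [0.5, 1, 1.5, 2, 6]:
    print(f"F({x}) = {cdf(x):.4f}")
# The jump at each integer k in {1, ..., 6} equals the mass pmf[k] = 1/6.
```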

Examples

Main articles: Bernoulli distribution, Binomial distribution, and Geometric distribution

Finite


Three major distributions are associated with discrete random variables: the Bernoulli distribution, the binomial distribution, and the geometric distribution.
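As an illustration (a sketch assuming the SciPy library is available; it is not part of the sources cited in this article), the probability mass functions of these three distributions can be evaluated directly:

```python
from scipy.stats import bernoulli, binom, geom

p = 0.3  # success probability (illustrative value)

# Bernoulli: P(X = 1) = p, P(X = 0) = 1 - p
print(bernoulli.pmf([0, 1], p))   # [0.7  0.3]

# Binomial with n = 10 trials: P(X = k) = C(n, k) p^k (1 - p)^(n - k)
print(binom.pmf(3, n=10, p=p))    # probability of exactly 3 successes

# Geometric: P(X = k) = (1 - p)^(k - 1) p, the trial number of the first success
print(geom.pmf(2, p))             # probability that the first success is on trial 2
```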

Infinite


The following exponentially declining distribution is an example of a distribution with an infinite number of possible outcomes, all the positive integers:

$\Pr(X = i) = \frac{1}{2^{i}} \qquad \text{for } i = 1, 2, 3, \dots$

Despite the infinite number of possible outcomes, the total probability mass is 1/2 + 1/4 + 1/8 + ⋯ = 1, satisfying the unit total probability requirement for a probability distribution.
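A quick numerical check (an illustrative Python snippet, not part of the original text) shows the partial sums of this mass function approaching 1:

```python
# Partial sums of Pr(X = i) = 1 / 2**i for i = 1, 2, 3, ...
total = 0.0
for i in range(1, 21):
    total += 1 / 2**i
print(total)            # 0.9999990463256836, approaching the required total of 1

# The exact partial sum after n terms is 1 - 1/2**n, which tends to 1.
print(1 - 1 / 2**20)    # same value, computed in closed form
```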

Multivariate case

Main article: Joint probability distribution

Two or more discrete random variables have a joint probability mass function, which gives the probability of each possible combination of realizations for the random variables.
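For instance, the sketch below (a minimal Python illustration using a hypothetical pair of binary random variables) stores a joint probability mass function as a table over all combinations of realizations and recovers a marginal mass function by summing over the other variable:

```python
# Hypothetical joint PMF of two binary random variables (X, Y).
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.40, (1, 1): 0.20,
}

# The joint probabilities are non-negative and sum to 1.
assert all(p >= 0 for p in joint_pmf.values())
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

# Marginal PMF of X: sum the joint mass over all values of Y.
marginal_x = {}
for (x, y), p in joint_pmf.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print(marginal_x)   # {0: 0.4, 1: 0.6} up to floating-point rounding
```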

References

  1. ^ "7.2 - Probability Mass Functions | STAT 414". PennState, Eberly College of Science.
  2. ^ Stewart, William J. (2011). Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling. Princeton University Press. p. 105. ISBN 978-1-4008-3281-1.
  3. ^ a b Dekking, Michel (2005). A Modern Introduction to Probability and Statistics: Understanding Why and How. London: Springer. ISBN 978-1-85233-896-1. OCLC 262680588.
  4. ^ Rao, Singiresu S. (1996). Engineering Optimization: Theory and Practice (3rd ed.). New York: Wiley. ISBN 0-471-55034-5. OCLC 62080932.
