Probability-generating function

From Wikipedia, the free encyclopedia
Power series derived from a discrete probability distribution

In probability theory, the probability generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability generating functions are often employed for their succinct description of the sequence of probabilities Pr(X = i) in the probability mass function for a random variable X, and to make available the well-developed theory of power series with non-negative coefficients.

Definition


Univariate case


If X is a discrete random variable taking values x in the non-negative integers {0, 1, ...}, then the probability generating function of X is defined as[1]

G(z) = \operatorname{E}\left(z^{X}\right) = \sum_{x=0}^{\infty} p(x) z^{x},

where p is the probability mass function of X. Note that the subscripted notations G_X and p_X are often used to emphasize that these pertain to a particular random variable X and to its distribution. The power series converges absolutely at least for all complex numbers z with |z| \leq 1, and the radius of convergence is often larger.
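The defining series can be checked numerically by truncating it. Below is a minimal sketch in Python, assuming (purely for illustration) that X follows a Poisson distribution with rate lam; the closed form exp(lam(z - 1)) for the Poisson probability generating function is standard, while the function names are ad hoc.

    import math

    # Truncated-series evaluation of G(z) = E(z^X) = sum_x p(x) z^x
    # for X ~ Poisson(lam), an illustrative choice of distribution.
    lam = 2.5

    def pgf_series(z, terms=60):
        total, term = 0.0, math.exp(-lam)   # term = p(0) = e^{-lam}
        for x in range(terms):
            total += term * z**x
            term *= lam / (x + 1)           # p(x+1) = p(x) * lam / (x+1)
        return total

    z = 0.7
    print(pgf_series(z))            # ~0.4724
    print(math.exp(lam * (z - 1)))  # closed-form Poisson pgf; should agree

The truncation at 60 terms is harmless here because the Poisson tail probabilities decay faster than geometrically.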

Multivariate case


If X = (X_1, \ldots, X_d) is a discrete random variable taking values (x_1, \ldots, x_d) in the d-dimensional non-negative integer lattice {0, 1, ...}^d, then the probability generating function of X is defined as

G(z) = G(z_1, \ldots, z_d) = \operatorname{E}\bigl(z_1^{X_1} \cdots z_d^{X_d}\bigr) = \sum_{x_1,\ldots,x_d=0}^{\infty} p(x_1,\ldots,x_d)\, z_1^{x_1} \cdots z_d^{x_d},

where p is the probability mass function of X. The power series converges absolutely at least for all complex vectors z = (z_1, \ldots, z_d) \in \mathbb{C}^{d} with \max\{|z_1|, \ldots, |z_d|\} \leq 1.
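As a small illustration of the multivariate definition, the following sketch (an illustrative assumption, not taken from the article's sources) uses d = 2 with independent Bernoulli components, in which case the joint probability generating function factorizes into the two marginal ones.

    # Bivariate pgf for X = (X1, X2) with independent X1 ~ Bernoulli(p),
    # X2 ~ Bernoulli(q); the support {0,1}^2 is finite, so the sum is exact.
    p, q = 0.3, 0.6

    def joint_pmf(x1, x2):
        return (p if x1 else 1 - p) * (q if x2 else 1 - q)

    def pgf(z1, z2):
        # G(z1, z2) = sum over the support of p(x1, x2) * z1^x1 * z2^x2
        return sum(joint_pmf(x1, x2) * z1**x1 * z2**x2
                   for x1 in (0, 1) for x2 in (0, 1))

    print(pgf(0.5, -0.2))                          # 0.238
    print((1 - p + p * 0.5) * (1 - q + q * -0.2))  # factorized form, same value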

Properties


Power series


Probability generating functions obey all the rules of power series with non-negative coefficients. In particular, G(1^-) = 1, where

G(1^-) = \lim_{x \to 1,\, x < 1} G(x),

that is, x approaching 1 from below, since the probabilities must sum to one. So the radius of convergence of any probability generating function must be at least 1, by Abel's theorem for power series with non-negative coefficients.
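For example, a geometric distribution with probability mass function p(k) = (1 - p) p^{k} for k = 0, 1, 2, \ldots and 0 < p < 1 has

G(z) = \sum_{k=0}^{\infty} (1-p)\, p^{k} z^{k} = \frac{1-p}{1-pz},

whose radius of convergence is 1/p > 1, and indeed G(1^-) = 1.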

Probabilities and expectations


The following properties allow the derivation of various basic quantities related to X:

  1. The probability mass function of X is recovered by taking derivatives of G:

     p(k) = \Pr(X = k) = \frac{G^{(k)}(0)}{k!}.
  2. It follows from Property 1 that if random variables X and Y have probability generating functions that are equal, G_X = G_Y, then p_X = p_Y. That is, if X and Y have identical probability generating functions, then they have identical distributions.
  3. The normalization of the probability mass function can be expressed in terms of the generating function by

     \operatorname{E}[1] = G(1^-) = \sum_{i=0}^{\infty} p(i) = 1.

     The expectation of X is given by \operatorname{E}[X] = G'(1^-). More generally, the k-th factorial moment \operatorname{E}[X(X-1)\cdots(X-k+1)] of X is given by

     \operatorname{E}\left[\frac{X!}{(X-k)!}\right] = G^{(k)}(1^-), \quad k \geq 0.

     So the variance of X is given by

     \operatorname{Var}(X) = G''(1^-) + G'(1^-) - \left[G'(1^-)\right]^{2}.

     Finally, the k-th raw moment of X is given by

     \operatorname{E}[X^{k}] = \left(z \frac{\partial}{\partial z}\right)^{k} G(z)\, \Big|_{z=1^-}.

     (A numerical check of Properties 1 and 3 appears after this list.)
  4. G_X(e^{t}) = M_X(t), where X is a random variable, G_X(t) is the probability generating function of X, and M_X(t) is the moment-generating function of X.
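Properties 1 and 3 lend themselves to a quick numerical check. The sketch below is illustrative only: it assumes X ~ Binomial(n, p), whose probability generating function is the polynomial (1 - p + pz)^n (so derivatives at z = 1 can be evaluated directly rather than as one-sided limits), and it approximates derivatives by central finite differences.

    import math

    n, p = 10, 0.3

    def G(z):
        # pgf of Binomial(n, p)
        return (1 - p + p * z) ** n

    def derivative(f, z, k, h=1e-3):
        # k-th derivative via central finite differences (crude but adequate here)
        if k == 0:
            return f(z)
        return (derivative(f, z + h, k - 1, h) - derivative(f, z - h, k - 1, h)) / (2 * h)

    # Property 3: E[X] = G'(1) and Var(X) = G''(1) + G'(1) - G'(1)^2
    mean = derivative(G, 1.0, 1)
    var = derivative(G, 1.0, 2) + mean - mean**2
    print(mean, n * p)            # both ~3.0
    print(var, n * p * (1 - p))   # both ~2.1

    # Property 1: p(k) = G^{(k)}(0) / k!, checked at k = 2
    print(derivative(G, 0.0, 2) / math.factorial(2),
          math.comb(n, 2) * p**2 * (1 - p) ** (n - 2))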

Functions of independent random variables


Probability generating functions are particularly useful for dealing with functions of independent random variables. For example:

  • If X_i, i = 1, 2, \ldots, N is a sequence of independent (and not necessarily identically distributed) random variables that take on natural-number values, and

    S_N = \sum_{i=1}^{N} a_i X_i,

    where the a_i are constant natural numbers, then the probability generating function is given by

    G_{S_N}(z) = \operatorname{E}\left(z^{S_N}\right) = \operatorname{E}\left(z^{\sum_{i=1}^{N} a_i X_i}\right) = G_{X_1}(z^{a_1})\, G_{X_2}(z^{a_2}) \cdots G_{X_N}(z^{a_N}).

  • In particular, if X and Y are independent random variables, then

    G_{X+Y}(z) = G_X(z) \cdot G_Y(z) \quad \text{and} \quad G_{X-Y}(z) = G_X(z) \cdot G_Y(1/z).

  • If, moreover, N is itself a discrete random variable taking values in the non-negative integers, independent of the X_i, and the X_i are identically distributed with common probability generating function G_X, then for S_N = \sum_{i=1}^{N} X_i,

    G_{S_N}(z) = G_N(G_X(z)).

    This can be seen, using the law of total expectation, as follows:

    G_{S_N}(z) = \operatorname{E}\left(z^{S_N}\right) = \operatorname{E}\left(z^{\sum_{i=1}^{N} X_i}\right) = \operatorname{E}\Bigl(\operatorname{E}\bigl(z^{\sum_{i=1}^{N} X_i} \mid N\bigr)\Bigr) = \operatorname{E}\bigl((G_X(z))^{N}\bigr) = G_N(G_X(z)).

    This last fact is useful in the study of Galton–Watson processes and compound Poisson processes. (A simulation sketch of this identity follows this list.)

  • If N is again random but the X_i are not necessarily identically distributed, then

    G_{S_N}(z) = \sum_{n \geq 1} f_n \prod_{i=1}^{n} G_{X_i}(z),

    where f_n = \Pr(N = n). For identically distributed X_i this simplifies to the identity stated before, but the general case is sometimes useful to obtain a decomposition of S_N by means of generating functions.
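The compound-sum identity G_{S_N}(z) = G_N(G_X(z)) can also be illustrated by simulation. The sketch below assumes, purely for illustration, N ~ Poisson(mu) and i.i.d. X_i ~ Bernoulli(p), for which G_N(G_X(z)) = exp(mu p (z - 1)); this is the pgf of a Poisson(mu p) variable, reflecting the thinning property of the Poisson distribution.

    import math
    import random

    random.seed(0)
    mu, p, z = 3.0, 0.4, 0.8

    def poisson_sample(lam):
        # Knuth's multiplicative method for Poisson sampling
        L, k, u = math.exp(-lam), 0, 1.0
        while True:
            u *= random.random()
            if u <= L:
                return k
            k += 1

    def sample_S():
        # S_N = X_1 + ... + X_N with N ~ Poisson(mu), X_i ~ Bernoulli(p)
        N = poisson_sample(mu)
        return sum(1 for _ in range(N) if random.random() < p)

    samples = [sample_S() for _ in range(200_000)]
    estimate = sum(z**s for s in samples) / len(samples)  # Monte Carlo E[z^{S_N}]

    G_X = 1 - p + p * z                      # Bernoulli pgf evaluated at z
    G_N = lambda w: math.exp(mu * (w - 1))   # Poisson pgf
    print(estimate, G_N(G_X))                # both ~ exp(-0.24) ~ 0.787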


Related concepts


The probability generating function is an example of a generating function of a sequence: see also formal power series. It is equivalent to, and sometimes called, the z-transform of the probability mass function.

Other generating functions of random variables include the moment-generating function, the characteristic function and the cumulant generating function. The probability generating function is also equivalent to the factorial moment generating function, which as \operatorname{E}\left[z^{X}\right] can also be considered for continuous and other random variables.


Notes

  1. Gleb Gribakin. Probability and Distribution Theory (PDF).
