Poisson distribution

From Wikipedia, the free encyclopedia
Discrete probability distribution

Probability mass function
[Figure: plot of the PMF.] The horizontal axis is the index k, the number of occurrences. λ is the expected rate of occurrences. The vertical axis is the probability of k occurrences given λ. The function is defined only at integer values of k; the connecting lines are only guides for the eye.

Cumulative distribution function
[Figure: plot of the CDF.] The horizontal axis is the index k, the number of occurrences. The CDF is discontinuous at the integers of k and flat everywhere else because a variable that is Poisson distributed takes on only integer values.

Notation: $\operatorname{Pois}(\lambda)$
Parameters: $\lambda \in (0,\infty)$ (rate)
Support: $k \in \mathbb{N}_0$ (natural numbers starting from 0)
PMF: $\dfrac{\lambda^{k}e^{-\lambda}}{k!}$
CDF: $\dfrac{\Gamma(\lfloor k+1\rfloor,\lambda)}{\lfloor k\rfloor!}$, or $e^{-\lambda}\sum_{j=0}^{\lfloor k\rfloor}\dfrac{\lambda^{j}}{j!}$, or $Q(\lfloor k+1\rfloor,\lambda)$ (for $k\geq 0$, where $\Gamma(x,y)$ is the upper incomplete gamma function, $\lfloor k\rfloor$ is the floor function, and $Q$ is the regularized gamma function)
Mean: $\lambda$
Median: $\approx\left\lfloor\lambda+\tfrac{1}{3}-\tfrac{1}{50\lambda}\right\rfloor$
Mode: $\lceil\lambda\rceil-1,\ \lfloor\lambda\rfloor$
Variance: $\lambda$
Skewness: $\dfrac{1}{\sqrt{\lambda}}$
Excess kurtosis: $\dfrac{1}{\lambda}$
Entropy: $\lambda\bigl[1-\log(\lambda)\bigr]+e^{-\lambda}\sum_{k=0}^{\infty}\dfrac{\lambda^{k}\log(k!)}{k!}$, or for large $\lambda$: $\approx\tfrac{1}{2}\log(2\pi e\lambda)-\tfrac{1}{12\lambda}-\tfrac{1}{24\lambda^{2}}-\tfrac{19}{360\lambda^{3}}+\mathcal{O}\!\left(\tfrac{1}{\lambda^{4}}\right)$
MGF: $\exp\left[\lambda\left(e^{t}-1\right)\right]$
CF: $\exp\left[\lambda\left(e^{it}-1\right)\right]$
PGF: $\exp\left[\lambda\left(z-1\right)\right]$
Fisher information: $\dfrac{1}{\lambda}$

In probability theory and statistics, the Poisson distribution (/ˈpwɑːsɒn/) is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event.[1] It can also be used for the number of events in other types of intervals than time, and in dimensions greater than 1 (e.g., the number of events in a given area or volume). The Poisson distribution is named after the French mathematician Siméon Denis Poisson. It plays an important role for discrete-stable distributions.

Under a Poisson distribution with the expectation of λ events in a given interval, the probability of k events in the same interval is:[2]: 60
$$\frac{\lambda^{k}e^{-\lambda}}{k!}.$$
For instance, consider a call center which receives an average of λ = 3 calls per minute at all times of day. If the number of calls received in any two given disjoint time intervals is independent, then the number k of calls received during any minute has a Poisson probability distribution. Receiving k = 1 to 4 calls then has a probability of about 0.77, while receiving 0 or at least 5 calls has a probability of about 0.23.
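As a numerical check of this example, the stated probabilities can be reproduced with SciPy's Poisson distribution (a minimal sketch; scipy is an assumed dependency, not part of the article):

from scipy.stats import poisson

lam = 3.0  # average calls per minute

# P(1 <= k <= 4) = CDF(4) - CDF(0)
p_1_to_4 = poisson.cdf(4, lam) - poisson.cdf(0, lam)
p_0_or_5plus = 1.0 - p_1_to_4

print(f"P(1 <= k <= 4)     = {p_1_to_4:.2f}")      # ~0.77
print(f"P(k = 0 or k >= 5) = {p_0_or_5plus:.2f}")  # ~0.23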

A classic example used to motivate the Poisson distribution is the number of radioactive decay events during a fixed observation period.[3]

History

The distribution was first introduced by Siméon Denis Poisson (1781–1840) and published together with his probability theory in his work Recherches sur la probabilité des jugements en matière criminelle et en matière civile (1837).[4]: 205-207  The work theorized about the number of wrongful convictions in a given country by focusing on certain random variables N that count, among other things, the number of discrete occurrences (sometimes called "events" or "arrivals") that take place during a time interval of given length. The result had already been given in 1711 by Abraham de Moivre in De Mensura Sortis, seu, de Probabilitate Eventuum in Ludis a Casu Fortuito Pendentibus.[5]: 219 [6]: 14-15 [7]: 193 [8]: 157  This makes the Poisson distribution an example of Stigler's law, and it has prompted some authors to argue that the distribution should bear the name of de Moivre.[9][10]

In 1860, Simon Newcomb fitted the Poisson distribution to the number of stars found in a unit of space.[11] A further practical application was made by Ladislaus Bortkiewicz in 1898: he showed that the frequency with which soldiers in the Prussian army were accidentally killed by horse kicks could be well modeled by a Poisson distribution.[12]: 23-25

Definitions

Probability mass function

A discrete random variable X is said to have a Poisson distribution with parameter $\lambda>0$ if it has a probability mass function given by:[2]: 60
$$f(k;\lambda)=\Pr(X{=}k)={\frac{\lambda^{k}e^{-\lambda}}{k!}},$$
where

  • k is the number of occurrences (k = 0, 1, 2, ...),
  • e is Euler's number (e = 2.71828...), and
  • k! = k(k − 1)···2·1 is the factorial.

The positive real number λ is equal to the expected value of X and also to its variance:[13]
$$\lambda=\operatorname{E}(X)=\operatorname{Var}(X).$$

The Poisson distribution can be applied to systems with a large number of possible events, each of which is rare. The number of such events that occur during a fixed time interval is, under the right circumstances, a random number with a Poisson distribution.

The equation can be adapted if, instead of the average number of events $\lambda$, we are given the average rate $r$ at which events occur. Then $\lambda=rt$, and:[14]
$$P(k{\text{ events in interval }}t)={\frac{(rt)^{k}e^{-rt}}{k!}}.$$

Examples

[Figure: chewing gum on a sidewalk in Reykjavík.] The number of pieces on a single tile is approximately Poisson distributed.

The Poisson distribution may be useful to model events such as:

  • the number of meteorites more than one meter in diameter that strike Earth in a year;
  • the number of laser photons hitting a detector in a particular time interval;
  • the number of students achieving a low and high mark in an exam; and
  • locations of defects and dislocations in materials.

Examples of the occurrence of random points in space are: the locations of asteroid impacts with Earth (2-dimensional), the locations of imperfections in a material (3-dimensional), and the locations of trees in a forest (2-dimensional).[15]

Assumptions and validity

The Poisson distribution is an appropriate model if the following assumptions are true:

  • k, a nonnegative integer, is the number of times an event occurs in an interval.
  • The occurrence of one event does not affect the probability of a second event.
  • The average rate at which events occur is independent of any occurrences.
  • Two events cannot occur at exactly the same instant.

If these conditions are true, thenk is a Poisson random variable; the distribution ofk is a Poisson distribution.

The Poisson distribution is also the limit of a binomial distribution, for which the probability of success for each trial is $p={\frac{\lambda}{n}}$, where $\lambda$ is the expectation and $n$ is the number of trials, in the limit that $n\to\infty$ with $\lambda$ kept constant[16][17] (see Related distributions):
$$\lim_{n\to\infty}{\binom{n}{k}}\left({\frac{\lambda}{n}}\right)^{k}\left(1-{\frac{\lambda}{n}}\right)^{n-k}={\frac{\lambda^{k}}{k!}}\,e^{-\lambda}.$$
The Poisson distribution may also be derived from the differential equations[18][19][20]
$$\frac{d\,P_{k}(t)}{dt}=\lambda\,{\Big(}P_{k-1}(t)-P_{k}(t){\Big)}$$
with initial conditions $P_{k}(0)=\delta_{k0}$, evaluated at $t=1$.
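A minimal sketch illustrating this limit numerically, using only the Python standard library: with p = λ/n, the binomial PMF at a fixed k approaches the Poisson PMF as n grows.

from math import comb, exp, factorial

lam, k = 5.0, 3

def binom_pmf(n: int) -> float:
    # Binomial PMF at k with success probability p = lam/n.
    p = lam / n
    return comb(n, k) * p**k * (1 - p)**(n - k)

poisson_pmf = lam**k * exp(-lam) / factorial(k)

for n in (10, 100, 1000, 100000):
    print(f"n={n:>6}: binomial={binom_pmf(n):.6f}  poisson={poisson_pmf:.6f}")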

Examples of probability for Poisson distributions

On a particular river, overflow floods occur once every 100 years on average. Calculate the probability of k = 0, 1, 2, 3, 4, 5, or 6 overflow floods in a 100-year interval, assuming the Poisson model is appropriate.

Because the average event rate is one overflow flood per 100 years, λ = 1.

$$\begin{aligned}P(k{\text{ overflow floods in 100 years}})&={\frac{\lambda^{k}e^{-\lambda}}{k!}}={\frac{1^{k}e^{-1}}{k!}}\\P(k=0{\text{ overflow floods in 100 years}})&={\frac{1^{0}e^{-1}}{0!}}={\frac{e^{-1}}{1}}\approx 0.368\\P(k=1{\text{ overflow flood in 100 years}})&={\frac{1^{1}e^{-1}}{1!}}={\frac{e^{-1}}{1}}\approx 0.368\\P(k=2{\text{ overflow floods in 100 years}})&={\frac{1^{2}e^{-1}}{2!}}={\frac{e^{-1}}{2}}\approx 0.184\end{aligned}$$


k	P(k overflow floods in 100 years)
0	0.368
1	0.368
2	0.184
3	0.061
4	0.015
5	0.003
6	0.0005

The probability for 0 to 6 overflow floods in a 100-year period.
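The table above can be reproduced with a few lines of Python (a minimal sketch using only the standard library):

from math import exp, factorial

lam = 1.0  # one flood per 100-year interval
for k in range(7):
    p = lam**k * exp(-lam) / factorial(k)
    print(f"{k}: {p:.4f}")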

In this example, it is reported that the average number of goals in a World Cup soccer match is approximately 2.5 and the Poisson model is appropriate.[21] Because the average event rate is 2.5 goals per match, λ = 2.5.

$$\begin{aligned}P(k{\text{ goals in a match}})&={\frac{2.5^{k}e^{-2.5}}{k!}}\\P(k=0{\text{ goals in a match}})&={\frac{2.5^{0}e^{-2.5}}{0!}}={\frac{e^{-2.5}}{1}}\approx 0.082\\P(k=1{\text{ goal in a match}})&={\frac{2.5^{1}e^{-2.5}}{1!}}={\frac{2.5e^{-2.5}}{1}}\approx 0.205\\P(k=2{\text{ goals in a match}})&={\frac{2.5^{2}e^{-2.5}}{2!}}={\frac{6.25e^{-2.5}}{2}}\approx 0.257\end{aligned}$$


k	P(k goals in a World Cup soccer match)
0	0.082
1	0.205
2	0.257
3	0.213
4	0.133
5	0.067
6	0.028
7	0.010

The probability for 0 to 7 goals in a match.

Examples that violate the Poisson assumptions

The number of students who arrive at the student union per minute will likely not follow a Poisson distribution, because the rate is not constant (low rate during class time, high rate between class times) and the arrivals of individual students are not independent (students tend to come in groups). The non-constant arrival rate may be modeled as a mixed Poisson distribution, and the arrival of groups rather than individual students as a compound Poisson process.

The number of magnitude 5 earthquakes per year in a country may not follow a Poisson distribution if one large earthquake increases the probability of aftershocks of similar magnitude.

Examples in which at least one event is guaranteed are not Poisson distributed, but they may be modeled using a zero-truncated Poisson distribution.

Count distributions in which the number of intervals with zero events is higher than predicted by a Poisson model may be modeled using a zero-inflated model.

Properties

Descriptive statistics

Median

Bounds for the median ($\nu$) of the distribution are known and are sharp:[23]
$$\lambda-\ln 2\leq\nu<\lambda+{\frac{1}{3}}.$$

Higher moments

The higher non-centered moments $m_k$ of the Poisson distribution are Touchard polynomials in $\lambda$:
$$m_{k}=\sum_{i=0}^{k}\lambda^{i}{\begin{Bmatrix}k\\i\end{Bmatrix}},$$
where the braces { } denote Stirling numbers of the second kind.[24][1]: 6  In other words,
$$E[X]=\lambda,\quad E[X(X-1)]=\lambda^{2},\quad E[X(X-1)(X-2)]=\lambda^{3},\dotsc$$
When the expected value is set to λ = 1, Dobinski's formula implies that the n‑th moment is equal to the number of partitions of a set of size n.

A simple upper bound is:[25]
$$m_{k}=E[X^{k}]\leq\left({\frac{k}{\log(k/\lambda+1)}}\right)^{k}\leq\lambda^{k}\exp\left({\frac{k^{2}}{2\lambda}}\right).$$

Sums of Poisson-distributed random variables

If $X_{i}\sim\operatorname{Pois}(\lambda_{i})$ for $i=1,\dotsc,n$ are independent, then $\sum_{i=1}^{n}X_{i}\sim\operatorname{Pois}\left(\sum_{i=1}^{n}\lambda_{i}\right)$.[26]: 65  A converse is Raikov's theorem, which says that if the sum of two independent random variables is Poisson-distributed, then so is each of the two summands.[27][28]

Maximum entropy

It is a maximum-entropy distribution among the set of generalized binomial distributions $B_{n}(\lambda)$ with mean $\lambda$ and $n\to\infty$,[29] where a generalized binomial distribution is defined as the distribution of the sum of N independent but not identically distributed Bernoulli variables.

Other properties

Bounds for the cumulative distribution function in terms of the Kullback–Leibler divergence are:
$$\Phi\left(\operatorname{sign}(k-\lambda){\sqrt{2\operatorname{D}_{\text{KL}}(Q_{-}\parallel P)}}\right)<P(X\leq k)<\Phi\left(\operatorname{sign}(k+1-\lambda){\sqrt{2\operatorname{D}_{\text{KL}}(Q_{+}\parallel P)}}\right),\quad\text{for }k>0,$$
where $\operatorname{D}_{\text{KL}}(Q_{-}\parallel P)$ is the Kullback–Leibler divergence of $Q_{-}=\operatorname{Pois}(k)$ from $P=\operatorname{Pois}(\lambda)$ and $\operatorname{D}_{\text{KL}}(Q_{+}\parallel P)$ is the Kullback–Leibler divergence of $Q_{+}=\operatorname{Pois}(k+1)$ from $P$.

Poisson races

Let $X\sim\operatorname{Pois}(\lambda)$ and $Y\sim\operatorname{Pois}(\mu)$ be independent random variables, with $\lambda<\mu$. Then we have that
$${\frac{e^{-({\sqrt{\mu}}-{\sqrt{\lambda}})^{2}}}{(\lambda+\mu)^{2}}}-{\frac{e^{-(\lambda+\mu)}}{2{\sqrt{\lambda\mu}}}}-{\frac{e^{-(\lambda+\mu)}}{4\lambda\mu}}\leq P(X-Y\geq 0)\leq e^{-({\sqrt{\mu}}-{\sqrt{\lambda}})^{2}}.$$

The upper bound is proved using a standard Chernoff bound.

The lower bound can be proved by noting that $P(X-Y\geq 0\mid X+Y=i)$ is the probability that $Z\geq{\frac{i}{2}}$, where $Z\sim\operatorname{Bin}\left(i,{\frac{\lambda}{\lambda+\mu}}\right)$, which is bounded below by ${\frac{1}{(i+1)^{2}}}e^{-iD\left(0.5\|{\frac{\lambda}{\lambda+\mu}}\right)}$, where $D$ is relative entropy (see the entry on bounds on tails of binomial distributions for details). Further noting that $X+Y\sim\operatorname{Pois}(\lambda+\mu)$, and computing a lower bound on the unconditional probability, gives the result. More details can be found in the appendix of Kamath et al.[35]

Related distributions

As a Binomial distribution with infinitesimal time-steps

The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity and the expected number of successes remains fixed (see law of rare events below). Therefore, it can be used as an approximation of the binomial distribution if n is sufficiently large and p is sufficiently small. The Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and p is smaller than or equal to 0.05, and an excellent approximation if n ≥ 100 and np ≤ 10.[36] Letting $F_{\mathrm{B}}$ and $F_{\mathrm{P}}$ be the respective cumulative distribution functions of the binomial and Poisson distributions, one has:
$$F_{\mathrm{B}}(k;n,p)\ \approx\ F_{\mathrm{P}}(k;\lambda=np).$$
One derivation of this uses probability-generating functions.[37] Consider a Bernoulli trial (coin flip) whose probability of one success (or expected number of successes) is $\lambda\leq 1$ within a given interval. Split the interval into n parts, and perform a trial in each subinterval with probability ${\tfrac{\lambda}{n}}$. The probability of k successes out of n trials over the entire interval is then given by the binomial distribution
$$p_{k}^{(n)}={\binom{n}{k}}\left({\frac{\lambda}{n}}\right)^{k}\left(1-{\frac{\lambda}{n}}\right)^{n-k},$$
whose generating function is:
$$P^{(n)}(x)=\sum_{k=0}^{n}p_{k}^{(n)}x^{k}=\left(1-{\frac{\lambda}{n}}+{\frac{\lambda}{n}}x\right)^{n}.$$
Taking the limit as n increases to infinity (with x fixed) and applying the product limit definition of the exponential function, this reduces to the generating function of the Poisson distribution:
$$\lim_{n\to\infty}P^{(n)}(x)=\lim_{n\to\infty}\left(1+{\tfrac{\lambda(x-1)}{n}}\right)^{n}=e^{\lambda(x-1)}=\sum_{k=0}^{\infty}e^{-\lambda}{\frac{\lambda^{k}}{k!}}x^{k}.$$
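A minimal numerical sketch of the CDF approximation $F_{\mathrm{B}}(k;n,p)\approx F_{\mathrm{P}}(k;np)$ described above, using SciPy (an assumed dependency, not part of the article):

from scipy.stats import binom, poisson

n, p = 100, 0.05   # n >= 100 and np = 5 <= 10: the "excellent" regime
lam = n * p

for k in (0, 2, 5, 10):
    fb = binom.cdf(k, n, p)    # exact binomial CDF
    fp = poisson.cdf(k, lam)   # Poisson approximation
    print(f"k={k:>2}: binomial CDF={fb:.5f}  Poisson CDF={fp:.5f}")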

General

Poisson approximation

Assume $X_{1}\sim\operatorname{Pois}(\lambda_{1}),X_{2}\sim\operatorname{Pois}(\lambda_{2}),\dots,X_{n}\sim\operatorname{Pois}(\lambda_{n})$ where $\lambda_{1}+\lambda_{2}+\dots+\lambda_{n}=1$; then[43] $(X_{1},X_{2},\dots,X_{n})$ is multinomially distributed, $(X_{1},X_{2},\dots,X_{n})\sim\operatorname{Mult}(N,\lambda_{1},\lambda_{2},\dots,\lambda_{n})$, conditioned on $N=X_{1}+X_{2}+\dots+X_{n}$.

This means,[32]: 101-102  among other things, that for any nonnegative function $f(x_{1},x_{2},\dots,x_{n})$, if $(Y_{1},Y_{2},\dots,Y_{n})\sim\operatorname{Mult}(m,\mathbf{p})$ is multinomially distributed, then
$$\operatorname{E}[f(Y_{1},Y_{2},\dots,Y_{n})]\leq e{\sqrt{m}}\,\operatorname{E}[f(X_{1},X_{2},\dots,X_{n})],$$
where $(X_{1},X_{2},\dots,X_{n})\sim\operatorname{Pois}(\mathbf{p})$.

The factor of $e{\sqrt{m}}$ can be replaced by 2 if $f$ is further assumed to be monotonically increasing or decreasing.

Bivariate Poisson distribution

This distribution has been extended to the bivariate case.[44] The generating function for this distribution is
$$g(u,v)=\exp[(\theta_{1}-\theta_{12})(u-1)+(\theta_{2}-\theta_{12})(v-1)+\theta_{12}(uv-1)]$$
with $\theta_{1},\theta_{2}>\theta_{12}>0$.

The marginal distributions are Poisson(θ₁) and Poisson(θ₂) and the correlation coefficient is limited to the range
$$0\leq\rho\leq\min\left\{{\sqrt{\frac{\theta_{1}}{\theta_{2}}}},{\sqrt{\frac{\theta_{2}}{\theta_{1}}}}\right\}.$$

A simple way to generate a bivariate Poisson distribution $X_{1},X_{2}$ is to take three independent Poisson distributions $Y_{1},Y_{2},Y_{3}$ with means $\lambda_{1},\lambda_{2},\lambda_{3}$ and then set $X_{1}=Y_{1}+Y_{3}$, $X_{2}=Y_{2}+Y_{3}$. The probability function of the bivariate Poisson distribution is
$$\Pr(X_{1}=k_{1},X_{2}=k_{2})=\exp\left(-\lambda_{1}-\lambda_{2}-\lambda_{3}\right){\frac{\lambda_{1}^{k_{1}}}{k_{1}!}}{\frac{\lambda_{2}^{k_{2}}}{k_{2}!}}\sum_{k=0}^{\min(k_{1},k_{2})}{\binom{k_{1}}{k}}{\binom{k_{2}}{k}}k!\left({\frac{\lambda_{3}}{\lambda_{1}\lambda_{2}}}\right)^{k}.$$

Free Poisson distribution

The free Poisson distribution[45] with jump size $\alpha$ and rate $\lambda$ arises in free probability theory as the limit of repeated free convolution
$$\left(\left(1-{\frac{\lambda}{N}}\right)\delta_{0}+{\frac{\lambda}{N}}\delta_{\alpha}\right)^{\boxplus N}$$
as N → ∞.

In other words, let $X_{N}$ be random variables so that $X_{N}$ has value $\alpha$ with probability ${\frac{\lambda}{N}}$ and value 0 with the remaining probability. Assume also that the family $X_{1},X_{2},\ldots$ are freely independent. Then the limit as $N\to\infty$ of the law of $X_{1}+\cdots+X_{N}$ is given by the free Poisson law with parameters $\lambda,\alpha$.

This definition is analogous to one of the ways in which the classical Poisson distribution is obtained from a (classical) Poisson process.

The measure associated to the free Poisson law is given by[46]
$$\mu={\begin{cases}(1-\lambda)\delta_{0}+\nu,&{\text{if }}0\leq\lambda\leq 1\\\nu,&{\text{if }}\lambda>1,\end{cases}}$$
where
$$\nu={\frac{1}{2\pi\alpha t}}{\sqrt{4\lambda\alpha^{2}-(t-\alpha(1+\lambda))^{2}}}\,dt$$
and has support $[\alpha(1-{\sqrt{\lambda}})^{2},\alpha(1+{\sqrt{\lambda}})^{2}]$.

This law also arises in random matrix theory as the Marchenko–Pastur law. Its free cumulants are equal to $\kappa_{n}=\lambda\alpha^{n}$.

Some transforms of this law

We give values of some important transforms of the free Poisson law; the computation can be found, e.g., in the book Lectures on the Combinatorics of Free Probability by A. Nica and R. Speicher.[47]

The R-transform of the free Poisson law is given by
$$R(z)={\frac{\lambda\alpha}{1-\alpha z}}.$$

The Cauchy transform (which is the negative of the Stieltjes transformation) is given by
$$G(z)={\frac{z+\alpha-\lambda\alpha-{\sqrt{(z-\alpha(1+\lambda))^{2}-4\lambda\alpha^{2}}}}{2\alpha z}}.$$

The S-transform is given by
$$S(z)={\frac{1}{z+\lambda}}$$
in the case that $\alpha=1$.

Statistical inference

See also: Poisson regression

Parameter estimation

Given a sample of n measured values $k_{i}\in\{0,1,\dots\}$, for i = 1, ..., n, we wish to estimate the value of the parameter λ of the Poisson population from which the sample was drawn. The maximum likelihood estimate is[48]

$${\widehat{\lambda}}_{\mathrm{MLE}}={\frac{1}{n}}\sum_{i=1}^{n}k_{i}.$$

Since each observation has expectation λ, so does the sample mean. Therefore, the maximum likelihood estimate is an unbiased estimator of λ. It is also an efficient estimator since its variance achieves the Cramér–Rao lower bound (CRLB).[49] Hence it is minimum-variance unbiased. It can also be proven that the sum (and hence the sample mean, as it is a one-to-one function of the sum) is a complete and sufficient statistic for λ.
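A minimal sketch in Python with NumPy (an assumed dependency), using simulated data to illustrate that the MLE is simply the sample mean:

import numpy as np

rng = np.random.default_rng(1)
true_lambda = 4.2
k = rng.poisson(true_lambda, size=10_000)  # simulated sample

lambda_mle = k.mean()  # (1/n) * sum(k_i)
print(f"true lambda = {true_lambda}, MLE = {lambda_mle:.3f}")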

To prove sufficiency we may use the factorization theorem. Consider partitioning the probability mass function of the joint Poisson distribution for the sample into two parts: one that depends solely on the sample $\mathbf{x}$, called $h(\mathbf{x})$, and one that depends on the parameter $\lambda$ and the sample $\mathbf{x}$ only through the function $T(\mathbf{x})$. Then $T(\mathbf{x})$ is a sufficient statistic for $\lambda$.

$$P(\mathbf{x})=\prod_{i=1}^{n}{\frac{\lambda^{x_{i}}e^{-\lambda}}{x_{i}!}}={\frac{1}{\prod_{i=1}^{n}x_{i}!}}\times\lambda^{\sum_{i=1}^{n}x_{i}}e^{-n\lambda}$$

The first term $h(\mathbf{x})$ depends only on $\mathbf{x}$. The second term $g(T(\mathbf{x})\mid\lambda)$ depends on the sample only through $T(\mathbf{x})=\sum_{i=1}^{n}x_{i}$. Thus, $T(\mathbf{x})$ is sufficient.

To find the parameter λ that maximizes the probability function for the Poisson population, we can use the logarithm of the likelihood function:

$$\begin{aligned}\ell(\lambda)&=\ln\prod_{i=1}^{n}f(k_{i}\mid\lambda)\\&=\sum_{i=1}^{n}\ln\!\left({\frac{e^{-\lambda}\lambda^{k_{i}}}{k_{i}!}}\right)\\&=-n\lambda+\left(\sum_{i=1}^{n}k_{i}\right)\ln(\lambda)-\sum_{i=1}^{n}\ln(k_{i}!).\end{aligned}$$

We take the derivative of $\ell$ with respect to λ and compare it to zero:

$${\frac{\mathrm{d}}{\mathrm{d}\lambda}}\ell(\lambda)=0\iff-n+\left(\sum_{i=1}^{n}k_{i}\right){\frac{1}{\lambda}}=0.$$

Solving for λ gives a stationary point.

$$\lambda={\frac{\sum_{i=1}^{n}k_{i}}{n}}$$

So λ is the average of the $k_i$ values. Obtaining the sign of the second derivative of $\ell$ at the stationary point will determine what kind of extreme value λ is.

$${\frac{\partial^{2}\ell}{\partial\lambda^{2}}}=-\lambda^{-2}\sum_{i=1}^{n}k_{i}$$

Evaluating the second derivative at the stationary point gives:

$${\frac{\partial^{2}\ell}{\partial\lambda^{2}}}=-{\frac{n^{2}}{\sum_{i=1}^{n}k_{i}}}$$

which is the negative of n times the reciprocal of the average of the $k_i$. This expression is negative when the average is positive. If this is satisfied, then the stationary point maximizes the probability function.

For completeness, a family of distributions is said to be complete if and only if $E(g(T))=0$ implies that $P_{\lambda}(g(T)=0)=1$ for all $\lambda$. If the individual $X_{i}$ are iid $\mathrm{Po}(\lambda)$, then $T(\mathbf{x})=\sum_{i=1}^{n}X_{i}\sim\mathrm{Po}(n\lambda)$. Knowing the distribution we want to investigate, it is easy to see that the statistic is complete.

$$E(g(T))=\sum_{t=0}^{\infty}g(t){\frac{(n\lambda)^{t}e^{-n\lambda}}{t!}}=0$$

For this equality to hold, $g(t)$ must be 0. This follows from the fact that none of the other terms will be 0 for all $t$ in the sum and for all possible values of $\lambda$. Hence, $E(g(T))=0$ for all $\lambda$ implies that $P_{\lambda}(g(T)=0)=1$, and the statistic has been shown to be complete.

Confidence interval

The confidence interval for the mean of a Poisson distribution can be expressed using the relationship between the cumulative distribution functions of the Poisson and chi-squared distributions. The chi-squared distribution is itself closely related to the gamma distribution, and this leads to an alternative expression. Given an observation k from a Poisson distribution with mean μ, a confidence interval for μ with confidence level 1 − α is

$${\tfrac{1}{2}}\chi^{2}(\alpha/2;2k)\leq\mu\leq{\tfrac{1}{2}}\chi^{2}(1-\alpha/2;2k+2),$$

or equivalently,

$$F^{-1}(\alpha/2;k,1)\leq\mu\leq F^{-1}(1-\alpha/2;k+1,1),$$

where $\chi^{2}(p;n)$ is the quantile function (corresponding to a lower tail area p) of the chi-squared distribution with n degrees of freedom and $F^{-1}(p;n,1)$ is the quantile function of a gamma distribution with shape parameter n and scale parameter 1.[8]: 176-178 [50] This interval is 'exact' in the sense that its coverage probability is never less than the nominal 1 − α.

When quantiles of the gamma distribution are not available, an accurate approximation to this exact interval has been proposed (based on the Wilson–Hilferty transformation):[51]
$$k\left(1-{\frac{1}{9k}}-{\frac{z_{\alpha/2}}{3{\sqrt{k}}}}\right)^{3}\leq\mu\leq(k+1)\left(1-{\frac{1}{9(k+1)}}+{\frac{z_{\alpha/2}}{3{\sqrt{k+1}}}}\right)^{3},$$
where $z_{\alpha/2}$ denotes the standard normal deviate with upper tail area α/2.

For application of these formulae in the same context as above (given a sample of n measured values $k_i$, each drawn from a Poisson distribution with mean λ), one would set

$$k=\sum_{i=1}^{n}k_{i},$$
calculate an interval for μ = nλ, and then derive the interval for λ.
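A minimal sketch of the exact interval in Python, using SciPy's chi-squared quantile function (scipy is an assumed dependency; the lower endpoint for k = 0 is taken to be 0):

from scipy.stats import chi2

def poisson_ci(k: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact 1-alpha confidence interval for the Poisson mean, given count k."""
    lower = 0.0 if k == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * k)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * k + 2)
    return lower, upper

# Example: n = 10 observations summing to k = 42 give an interval for
# mu = n * lambda; divide by n to get the interval for lambda.
n, k = 10, 42
lo, hi = poisson_ci(k)
print(f"95% CI for lambda: [{lo / n:.3f}, {hi / n:.3f}]")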

Bayesian inference

In Bayesian inference, the conjugate prior for the rate parameter λ of the Poisson distribution is the gamma distribution.[52] Let

$$\lambda\sim\mathrm{Gamma}(\alpha,\beta)$$

denote that λ is distributed according to the gamma density g parameterized in terms of a shape parameter α and an inverse scale parameter β:

$$g(\lambda\mid\alpha,\beta)={\frac{\beta^{\alpha}}{\Gamma(\alpha)}}\;\lambda^{\alpha-1}\;e^{-\beta\lambda}\qquad{\text{ for }}\lambda>0.$$

Then, given the same sample of n measured values $k_i$ as before, and a prior of Gamma(α, β), the posterior distribution is

$$\lambda\sim\mathrm{Gamma}{\left(\alpha+\sum_{i=1}^{n}k_{i},\beta+n\right)}.$$

Note that the posterior mean is linear and is given by
$$E[\lambda\mid k_{1},\ldots,k_{n}]={\frac{\alpha+\sum_{i=1}^{n}k_{i}}{\beta+n}}.$$
It can be shown that the gamma distribution is the only prior that induces linearity of the conditional mean. Moreover, a converse result exists which states that if the conditional mean is close to a linear function in the $L_{2}$ distance, then the prior distribution of λ must be close to a gamma distribution in Lévy distance.[53]
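A minimal sketch of this conjugate update in Python with simulated data (NumPy is an assumed dependency; the prior hyperparameters are illustrative):

import numpy as np

rng = np.random.default_rng(2)
alpha, beta = 2.0, 1.0         # prior shape and rate (illustrative values)
k = rng.poisson(3.0, size=50)  # simulated observations

# Conjugate update: Gamma(alpha, beta) -> Gamma(alpha + sum(k), beta + n).
alpha_post = alpha + k.sum()
beta_post = beta + len(k)
posterior_mean = alpha_post / beta_post

print(f"posterior: Gamma({alpha_post:.0f}, {beta_post:.0f}), mean = {posterior_mean:.3f}")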

The posterior mean E[λ] approaches the maximum likelihood estimate ${\widehat{\lambda}}_{\mathrm{MLE}}$ in the limit as $\alpha\to 0,\beta\to 0$, which follows immediately from the general expression of the mean of the gamma distribution.

The posterior predictive distribution for a single additional observation is a negative binomial distribution,[54]: 53  sometimes called a gamma–Poisson distribution.

Simultaneous estimation of multiple Poisson means

Suppose $X_{1},X_{2},\dots,X_{p}$ is a set of independent random variables from a set of $p$ Poisson distributions, each with a parameter $\lambda_{i}$, $i=1,\dots,p$, and we would like to estimate these parameters. Then, Clevenson and Zidek show that under the normalized squared error loss $L(\lambda,{\hat{\lambda}})=\sum_{i=1}^{p}\lambda_{i}^{-1}({\hat{\lambda}}_{i}-\lambda_{i})^{2}$, when $p>1$, then, similarly to Stein's example for the normal means, the MLE estimator ${\hat{\lambda}}_{i}=X_{i}$ is inadmissible.[55]

In this case, a family of minimax estimators is given for any $0<c\leq 2(p-1)$ and $b\geq(p-2+p^{-1})$ as[56]
$${\hat{\lambda}}_{i}=\left(1-{\frac{c}{b+\sum_{i=1}^{p}X_{i}}}\right)X_{i},\qquad i=1,\dots,p.$$
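A minimal sketch of this shrinkage estimator in Python; the particular choices of c and b below are illustrative values inside the stated ranges, not prescribed by the source. NumPy is an assumed dependency.

import numpy as np

def shrunken_estimate(x: np.ndarray) -> np.ndarray:
    p = len(x)
    c = p - 1.0            # any 0 < c <= 2(p - 1) is allowed
    b = p - 2.0 + 1.0 / p  # any b >= p - 2 + 1/p is allowed
    # Shrink all raw counts toward 0 by a common data-dependent factor.
    return (1.0 - c / (b + x.sum())) * x

x = np.array([3, 0, 5, 2, 4], dtype=float)  # observed counts, one per mean
print(shrunken_estimate(x))                 # shrinks the raw MLEs toward 0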

Occurrence and applications


Some applications of the Poisson distribution to count data (number of events) include:[57]

  • the number of telephone calls arriving at a switchboard per unit time;[64]
  • the number of radioactive decays in a given time interval;[61]
  • the number of goals in sports involving two competing teams;[65]
  • the number of yeast cells counted with a haemacytometer;[62][63]
  • the number of bacteria in a given amount of liquid;[67]
  • the number of photons arriving at a photoreceptor in a given interval;[68]
  • the molecular size distribution of a polymer;[58]
  • the number of large earthquakes per year in a region;[60]
  • the number of V-1 flying bombs hitting each district of London during World War II;[69] and
  • the number of candidates arriving at an enrolment centre per hour.[59]

In probabilistic number theory, Gallagher showed in 1976 that, if a certain version of the unproved prime r-tuple conjecture holds,[70] then the counts of prime numbers in short intervals would obey a Poisson distribution.[71]

Law of rare events

Main article: Poisson limit theorem
[Figure:] Comparison of the Poisson distribution (black lines) and the binomial distribution with n = 10 (red circles), n = 20 (blue circles), and n = 1000 (green circles). All distributions have a mean of 5. The horizontal axis shows the number of events k. As n gets larger, the Poisson distribution becomes an increasingly better approximation for the binomial distribution with the same mean.

The rate of an event is related to the probability of an event occurring in some small subinterval (of time, space or otherwise). In the case of the Poisson distribution, one assumes that there exists a small enough subinterval for which the probability of an event occurring twice is "negligible". With this assumption one can derive the Poisson distribution from the binomial one, given only the information of expected number of total events in the whole interval.

Let the total number of events in the whole interval be denoted by $\lambda$. Divide the whole interval into $n$ subintervals $I_{1},\dots,I_{n}$ of equal size, such that $n>\lambda$ (since we are interested in only very small portions of the interval, this assumption is meaningful). This means that the expected number of events in each of the n subintervals is equal to $\lambda/n$.

Now we assume that the occurrence of an event in the whole interval can be seen as a sequence of n Bernoulli trials, where the $i$-th Bernoulli trial corresponds to looking whether an event happens in the subinterval $I_{i}$ with probability $\lambda/n$. The expected number of total events in $n$ such trials would be $\lambda$, the expected number of total events in the whole interval. Hence for each subdivision of the interval we have approximated the occurrence of the event as a Bernoulli process of the form ${\textrm{B}}(n,\lambda/n)$. As we have noted before, we want to consider only very small subintervals. Therefore, we take the limit as $n$ goes to infinity.

In this case the binomial distribution converges to what is known as the Poisson distribution by the Poisson limit theorem.

In several of the above examples (such as the number of mutations in a given sequence of DNA) the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the binomial distribution, that is $X\sim{\textrm{B}}(n,p)$.

In such cases n is very large and p is very small (and so the expectation np is of intermediate magnitude). The distribution may then be approximated by the less cumbersome Poisson distribution $X\sim{\textrm{Pois}}(np)$.

This approximation is sometimes known as the law of rare events,[72]: 5  since each of the n individual Bernoulli events rarely occurs.

The name "law of rare events" may be misleading because the total count of success events in a Poisson process need not be rare if the parameternp is not small. For example, the number of telephone calls to a busy switchboard in one hour follows a Poisson distribution with the events appearing frequent to the operator, but they are rare from the point of view of the average member of the population who is very unlikely to make a call to that switchboard in that hour.

The variance of the binomial distribution is 1 − p times that of the Poisson distribution, so the two are almost equal when p is very small.

The word law is sometimes used as a synonym of probability distribution, and convergence in law means convergence in distribution. Accordingly, the Poisson distribution is sometimes called the "law of small numbers" because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. The Law of Small Numbers is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898.[12][73]

Poisson point process

Main article: Poisson point process

The Poisson distribution arises as the number of points of a Poisson point process located in some finite region. More specifically, if D is some region of space, for example Euclidean space $\mathbb{R}^{d}$, for which |D|, the area, volume or, more generally, the Lebesgue measure of the region, is finite, and if N(D) denotes the number of points in D, then

$$P(N(D)=k)={\frac{(\lambda|D|)^{k}e^{-\lambda|D|}}{k!}}.$$

Poisson regression and negative binomial regression

Poisson regression and negative binomial regression are useful for analyses where the dependent (response) variable is the count (0, 1, 2, ...) of the number of events or occurrences in an interval.

Biology

The Luria–Delbrück experiment tested against the hypothesis of Lamarckian evolution, which should result in a Poisson distribution.

Katz and Miledi measured the membrane potential with and without the presence of acetylcholine (ACh).[74] When ACh is present, ion channels on the membrane open randomly at a small fraction of the time. As there are a large number of ion channels, each open for a small fraction of the time, the total number of ion channels open at any moment is Poisson distributed. When ACh is not present, effectively no ion channels are open. The membrane potential is $V=N_{\text{open}}V_{\text{ion}}+V_{0}+V_{\text{noise}}$. Subtracting the effect of noise, Katz and Miledi found the mean and variance of the membrane potential to be $8.5\times 10^{-3}\,\mathrm{V}$ and $(29.2\times 10^{-6}\,\mathrm{V})^{2}$ respectively, giving $V_{\text{ion}}=10^{-7}\,\mathrm{V}$. (pp. 94-95[75])
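Because a Poisson count has variance equal to its mean, the single-channel contribution follows from the ratio of the variance to the mean of the noise-corrected potential. A minimal sketch of that arithmetic in Python (the variable names are illustrative, not from the paper):

# Var(V) = V_ion^2 * Var(N_open) and E(V) ~ V_ion * E(N_open),
# and Var(N_open) = E(N_open) for a Poisson count, so V_ion = Var(V) / E(V).
mean_V = 8.5e-3          # volts
var_V = (29.2e-6) ** 2   # volts squared

V_ion = var_V / mean_V
print(f"V_ion ~ {V_ion:.1e} V")  # ~1e-07 V, matching the quoted value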

During each cellular replication event, the number of mutations is roughly Poisson distributed.[76] For example, the HIV virus has 10,000 base pairs and a mutation rate of about 1 per 30,000 base pairs, meaning the number of mutations per replication event is distributed as $\mathrm{Pois}(1/3)$. (p. 64[75])

Other applications in science

In a Poisson process, the number of observed occurrences fluctuates about its mean λ with a standard deviation $\sigma_{k}={\sqrt{\lambda}}$. These fluctuations are denoted as Poisson noise or (particularly in electronics) as shot noise.

The correlation of the mean and standard deviation in counting independent discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, even if that contribution is too small to be detected directly. For example, the charge e on an electron can be estimated by correlating the magnitude of an electric current with its shot noise. If N electrons pass a point in a given time t on average, the mean current is $I=eN/t$; since the current fluctuations should be of the order $\sigma_{I}=e{\sqrt{N}}/t$ (i.e., the standard deviation of the Poisson process), the charge $e$ can be estimated from the ratio $t\sigma_{I}^{2}/I$.[citation needed]

An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reduced silver grains, not to the individual grains themselves. By correlating the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided).[citation needed]

In causal set theory the discrete elements of spacetime follow a Poisson distribution in the volume.

The Poisson distribution also appears in quantum mechanics, especially quantum optics. Namely, for a quantum harmonic oscillator system in a coherent state, the probability of measuring a particular energy level has a Poisson distribution.

Computational methods

The Poisson distribution poses two different tasks for dedicated software libraries: evaluating the distribution $P(k;\lambda)$, and drawing random numbers according to that distribution.

Evaluating the Poisson distribution

Computing $P(k;\lambda)$ for given $k$ and $\lambda$ is a trivial task that can be accomplished by using the standard definition of $P(k;\lambda)$ in terms of exponential, power, and factorial functions. However, the conventional definition of the Poisson distribution contains two terms that can easily overflow on computers: $\lambda^{k}$ and $k!$. The fraction of $\lambda^{k}$ to $k!$ can also produce a rounding error that is very large compared to $e^{-\lambda}$, and therefore give an erroneous result. For numerical stability the Poisson probability mass function should therefore be evaluated as
$$f(k;\lambda)=\exp\left[k\ln\lambda-\lambda-\ln\Gamma(k+1)\right],$$
which is mathematically equivalent but numerically stable. The natural logarithm of the gamma function can be obtained using the lgamma function in the C standard library (C99 version) or R, the gammaln function in MATLAB or SciPy, or the log_gamma function in Fortran 2008 and later.
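A minimal sketch of this numerically stable evaluation in Python, using math.lgamma from the standard library for $\ln\Gamma(k+1)$:

from math import exp, lgamma, log

def poisson_pmf(k: int, lam: float) -> float:
    """Evaluate f(k; lam) = exp(k ln lam - lam - ln Gamma(k+1))."""
    return exp(k * log(lam) - lam - lgamma(k + 1))

# Works where the naive lam**k / k! * exp(-lam) would overflow:
print(poisson_pmf(1000, 1000.0))  # ~0.0126, near 1/sqrt(2*pi*1000)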

Some computing languages provide built-in functions to evaluate the Poisson distribution, for example the dpois function in R and PoissonDistribution in the Wolfram Language.[77][78]

Random variate generation

Further information: Non-uniform random variate generation

The less trivial task is to draw integer random variates from the Poisson distribution with given $\lambda$.

Solutions are provided by standard numerical libraries, and by simple algorithms such as the following.

A simple algorithm to generate random Poisson-distributed numbers (pseudo-random number sampling) has been given by Knuth:[79]: 137-138

algorithm poisson random number (Knuth):
    init:
        Let L ← e^−λ, k ← 0 and p ← 1.
    do:
        k ← k + 1.
        Generate uniform random number u in [0,1] and let p ← p × u.
    while p > L.
    return k − 1.

The complexity is linear in the returned value k, which is λ on average. There are many other algorithms to improve this. Some are given in Ahrens & Dieter; see § References below.
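A direct Python transcription of Knuth's algorithm above (a minimal sketch using the standard library; suitable only for moderate λ, since the loop multiplies down to e^−λ):

import random
from math import exp

def knuth_poisson(lam: float) -> int:
    L = exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()  # p ← p × u
        if p <= L:
            return k - 1

print([knuth_poisson(3.0) for _ in range(10)])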

For large values of λ, the value of L = e^−λ may be so small that it is hard to represent. This can be solved by a change to the algorithm which uses an additional parameter STEP such that e^−STEP does not underflow:[citation needed]

algorithm poisson random number (Junhao, based on Knuth):
    init:
        Let λLeft ← λ, k ← 0 and p ← 1.
    do:
        k ← k + 1.
        Generate uniform random number u in (0,1) and let p ← p × u.
        while p < 1 and λLeft > 0:
            if λLeft > STEP:
                p ← p × e^STEP
                λLeft ← λLeft − STEP
            else:
                p ← p × e^λLeft
                λLeft ← 0
    while p > 1.
    return k − 1.

The choice of STEP depends on the threshold of overflow. For the double precision floating point format the threshold is near e^700, so 500 should be a safe STEP.

Other solutions for large values of λ include rejection sampling and using Gaussian approximation.

Inverse transform sampling is simple and efficient for small values of λ, and requires only one uniform random number u per sample. Cumulative probabilities are examined in turn until one exceeds u.

algorithm Poisson generator based upon the inversion by sequential search:[80]: 505
    init:
        Let x ← 0, p ← e^−λ, s ← p.
        Generate uniform random number u in [0,1].
    while u > s do:
        x ← x + 1.
        p ← p × λ / x.
        s ← s + p.
    return x.
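A minimal Python transcription of the sequential-search generator above (standard library only; as noted, best suited to small λ):

import random
from math import exp

def poisson_by_inversion(lam: float) -> int:
    x, p = 0, exp(-lam)
    s = p                  # running CDF value
    u = random.random()
    while u > s:
        x += 1
        p *= lam / x       # next PMF term from the previous one
        s += p
    return x

print([poisson_by_inversion(2.5) for _ in range(10)])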


References

Citations

  1. ^abHaight, Frank A. (1967).Handbook of the Poisson Distribution. New York, NY, US: John Wiley & Sons.ISBN 978-0-471-33932-8.
  2. ^abYates, Roy D.; Goodman, David J. (2014).Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers (2nd ed.). Hoboken, NJ: Wiley.ISBN 978-0-471-45259-1.
  3. ^Ross, Sheldon M. (2014).Introduction to Probability Models (11th ed.). Academic Press.
  4. ^Poisson, Siméon D. (1837).Probabilité des jugements en matière criminelle et en matière civile, précédées des règles générales du calcul des probabilités [Research on the Probability of Judgments in Criminal and Civil Matters] (in French). Paris, France: Bachelier.
  5. ^de Moivre, Abraham (1711)."De mensura sortis, seu, de probabilitate eventuum in ludis a casu fortuito pendentibus" [On the Measurement of Chance, or, on the Probability of Events in Games Depending Upon Fortuitous Chance].Philosophical Transactions of the Royal Society (in Latin).27 (329):213–264.doi:10.1098/rstl.1710.0018.
  6. ^de Moivre, Abraham.The Doctrine of Chances: Or, A Method of Calculating the Probability of Events in Play. London, Great Britain: W. Pearson.ISBN 9780598843753.
  7. ^de Moivre, Abraham. "Of the Laws of Chance". In Motte, Benjamin (ed.).The Philosophical Transactions from the Year MDCC (where Mr. Lowthorp Ends) to the Year MDCCXX. Abridg'd, and Dispos'd Under General Heads (in Latin). Vol. I. London, Great Britain: R. Wilkin, R. Robinson, S. Ballard, W. and J. Innys, and J. Osborn. pp. 190–219.
  8. ^abcdefghiJohnson, Norman L.;Kemp, Adrienne W.; Kotz, Samuel (2005). "Poisson Distribution".Univariate Discrete Distributions (3rd ed.). New York, NY, US: John Wiley & Sons, Inc. pp. 156–207.doi:10.1002/0471715816.ISBN 978-0-471-27246-5.
  9. ^Stigler, Stephen M. (1982). "Poisson on the Poisson Distribution".Statistics & Probability Letters.1 (1):33–35.doi:10.1016/0167-7152(82)90010-4.
  10. ^Hald, Anders; de Moivre, Abraham; McClintock, Bruce (1984). "A. de Moivre: 'De Mensura Sortis' or 'On the Measurement of Chance'".International Statistical Review / Revue Internationale de Statistique.52 (3):229–262.doi:10.2307/1403045.JSTOR 1403045.
  11. ^Newcomb, Simon (1860)."Notes on the theory of probabilities".The Mathematical Monthly.2 (4):134–140.
  12. ^abcvon Bortkiewitsch, Ladislaus (1898).Das Gesetz der kleinen Zahlen [The law of small numbers] (in German). Leipzig, Germany: B.G. Teubner. pp. 1,23–25.
    Onpage 1, Bortkiewicz presents the Poisson distribution.
    Onpages 23–25, Bortkiewitsch presents his analysis of "4. Beispiel: Die durch Schlag eines Pferdes im preußischen Heere Getöteten." [4. Example: Those killed in the Prussian army by a horse's kick.]
  13. ^For the proof, see:Proof wiki: expectation andProof wiki: variance
  14. ^Kardar, Mehran (2007).Statistical Physics of Particles.Cambridge University Press. p. 42.ISBN 978-0-521-87342-0.OCLC 860391091.
  15. ^Dekking, Frederik Michel; Kraaikamp, Cornelis; Lopuhaä, Hendrik Paul; Meester, Ludolf Erwin (2005).A Modern Introduction to Probability and Statistics. Springer Texts in Statistics. p. 167.doi:10.1007/1-84628-168-7.ISBN 978-1-85233-896-1.
  16. ^Pitman, Jim (1993).Probability. Springer Texts in Statistics. New York Dordrecht Heidelberg London: Springer. p. 118.ISBN 978-0-387-94594-1.
  17. ^Hsu, Hwei P. (1996).Theory and Problems of Probability, Random Variables, and Random Processes. Schaum's Outline Series. New York: McGraw Hill. p. 68.ISBN 0-07-030644-3.
  18. ^Arfken, George B.; Weber, Hans J. (2005).Mathematical Methods for Physicists Sixth Edition. Elsevier Academic Press. p. 1131.ISBN 0-12-059876-0.
  19. ^Cowan, Glen (2009)."Derivation of the Poisson distribution"(PDF).
  20. ^Joyce, D. (2014)."The Poisson process"(PDF).
  21. ^Ugarte, M.D.;Militino, A.F.; Arnholt, A.T. (2016).Probability and Statistics with R (2nd ed.). Boca Raton, FL, US: CRC Press.ISBN 978-1-4665-0439-4.
  22. ^Helske, Jouni (2017)."KFAS: Exponential Family State Space Models in R".Journal of Statistical Software.78 (10).arXiv:1612.01907.doi:10.18637/jss.v078.i10.S2CID 14379617.
  23. ^Choi, Kwok P. (1994)."On the medians of gamma distributions and an equation of Ramanujan".Proceedings of the American Mathematical Society.121 (1):245–251.doi:10.2307/2160389.JSTOR 2160389.
  24. ^Riordan, John (1937)."Moment Recurrence Relations for Binomial, Poisson and Hypergeometric Frequency Distributions"(PDF).Annals of Mathematical Statistics.8 (2):103–111.doi:10.1214/aoms/1177732430.JSTOR 2957598.
  25. ^D. Ahle, Thomas (2022). "Sharp and simple bounds for the raw moments of the Binomial and Poisson distributions".Statistics & Probability Letters.182 109306.arXiv:2103.17027.doi:10.1016/j.spl.2021.109306.
  26. ^Lehmann, Erich Leo (1986).Testing Statistical Hypotheses (2nd ed.). New York, NJ, US: Springer Verlag.ISBN 978-0-387-94919-2.
  27. ^Raikov, Dmitry (1937). "On the decomposition of Poisson laws".Comptes Rendus de l'Académie des Sciences de l'URSS.14:9–11.
  28. ^von Mises, Richard.Mathematical Theory of Probability and Statistics. New York: Academic Press.doi:10.1016/C2013-0-12460-9.ISBN 978-1-4832-3213-3.
  29. ^Harremoes, P. (July 2001). "Binomial and Poisson distributions as maximum entropy distributions".IEEE Transactions on Information Theory.47 (5):2039–2041.doi:10.1109/18.930936.S2CID 16171405.
  30. ^Laha, Radha G.; Rohatgi, Vijay K. (1979).Probability Theory. New York, NJ, US: John Wiley & Sons.ISBN 978-0-471-03262-5.
  31. ^Mitzenmacher, Michael (2017).Probability and computing: Randomization and probabilistic techniques in algorithms and data analysis. Eli Upfal (2nd ed.). Exercise 5.14.ISBN 978-1-107-15488-9.OCLC 960841613.
  32. ^abMitzenmacher, Michael;Upfal, Eli (2005).Probability and Computing: Randomized Algorithms and Probabilistic Analysis. Cambridge, UK: Cambridge University Press.ISBN 978-0-521-83540-4.
  33. ^Short, Michael (2013)."Improved Inequalities for the Poisson and Binomial Distribution and Upper Tail Quantile Functions".ISRN Probability and Statistics.2013. Corollary 6.doi:10.1155/2013/412958.
  34. ^Short, Michael (2013)."Improved Inequalities for the Poisson and Binomial Distribution and Upper Tail Quantile Functions".ISRN Probability and Statistics.2013. Theorem 2.doi:10.1155/2013/412958.
  35. ^Kamath, Govinda M.; Şaşoğlu, Eren; Tse, David (14–19 June 2015).Optimal haplotype assembly from high-throughput mate-pair reads. 2015 IEEE International Symposium on Information Theory (ISIT). Hong Kong, China. pp. 914–918.arXiv:1502.01975.doi:10.1109/ISIT.2015.7282588.S2CID 128634.
  36. ^Prins, Jack (2012)."6.3.3.1. Counts Control Charts".e-Handbook of Statistical Methods. NIST/SEMATECH. Retrieved20 September 2019.
  37. ^Feller, William.An Introduction to Probability Theory and its Applications.
  38. ^Zhang, Huiming; Liu, Yunxiao; Li, Bo (2014). "Notes on discrete compound Poisson model with applications to risk theory".Insurance: Mathematics and Economics.59:325–336.doi:10.1016/j.insmatheco.2014.09.012.
  39. ^Zhang, Huiming; Li, Bo (2016). "Characterizations of discrete compound Poisson distributions".Communications in Statistics - Theory and Methods.45 (22):6789–6802.doi:10.1080/03610926.2014.901375.S2CID 125475756.
  40. ^McCullagh, Peter;Nelder, John (1989).Generalized Linear Models. Monographs on Statistics and Applied Probability. Vol. 37. London, UK: Chapman and Hall.ISBN 978-0-412-31760-6.
  41. ^Anscombe, Francis J. (1948). "The transformation of Poisson, binomial and negative binomial data".Biometrika.35 (3–4):246–254.doi:10.1093/biomet/35.3-4.246.JSTOR 2332343.
  42. ^Ross, Sheldon M. (2010).Introduction to Probability Models (10th ed.). Boston, MA: Academic Press.ISBN 978-0-12-375686-2.
  43. ^"1.7.7 – Relationship between the Multinomial and Poisson | STAT 504". Archived fromthe original on 6 August 2019. Retrieved6 August 2019.
  44. ^Loukas, Sotirios; Kemp, C. David (1986). "The Index of Dispersion Test for the Bivariate Poisson Distribution".Biometrics.42 (4):941–948.doi:10.2307/2530708.JSTOR 2530708.
  45. ^Free Random Variables by D. Voiculescu, K. Dykema, A. Nica, CRM Monograph Series, American Mathematical Society, Providence RI, 1992
  46. ^Alexandru Nica, Roland Speicher:Lectures on the Combinatorics of Free Probability. London Mathematical Society Lecture Note Series, Vol. 335, Cambridge University Press, 2006.
  47. ^Lectures on the Combinatorics of Free Probability by A. Nica and R. Speicher, pp. 203–204, Cambridge Univ. Press 2006
  48. ^Paszek, Ewa."Maximum likelihood estimation – examples".cnx.org.
  49. ^Van Trees, Harry L. (2013).Detection estimation and modulation theory. Kristine L. Bell, Zhi Tian (Second ed.).ISBN 978-1-299-66515-6.OCLC 851161356.
  50. ^Garwood, Frank (1936). "Fiducial Limits for the Poisson Distribution".Biometrika.28 (3/4):437–442.doi:10.1093/biomet/28.3-4.437.JSTOR 2333958.
  51. ^Breslow, Norman E.;Day, Nick E. (1987).Statistical Methods in Cancer Research. Vol. 2 — The Design and Analysis of Cohort Studies. Lyon, France:International Agency for Research on Cancer.ISBN 978-92-832-0182-3. Archived fromthe original on 8 August 2018. Retrieved11 March 2012.
  52. ^Fink, Daniel (1997).A Compendium of Conjugate Priors.
  53. ^Dytso, Alex; Poor, H. Vincent (2020)."Estimation in Poisson noise: Properties of the conditional mean estimator".IEEE Transactions on Information Theory.66 (7):4304–4323.arXiv:1911.03744.doi:10.1109/TIT.2020.2979978.S2CID 207853178.
  54. ^Gelman; Carlin, John B.; Stern, Hal S.; Rubin, Donald B. (2003).Bayesian Data Analysis (2nd ed.). Boca Raton, FL, US: Chapman & Hall/CRC.ISBN 1-58488-388-X.
  55. ^Clevenson, M. Lawrence; Zidek, James V. (1975). "Simultaneous estimation of the means of independent Poisson laws".Journal of the American Statistical Association.70 (351):698–705.doi:10.1080/01621459.1975.10482497.JSTOR 2285958.
  56. ^Berger, James O. (1985).Statistical Decision Theory and Bayesian Analysis. Springer Series in Statistics (2nd ed.). New York, NY: Springer-Verlag.Bibcode:1985sdtb.book.....B.doi:10.1007/978-1-4757-4286-2.ISBN 978-0-387-96098-2.
  57. ^Rasch, Georg (1963).The Poisson Process as a Model for a Diversity of Behavioural Phenomena(PDF). 17th International Congress of Psychology. Vol. 2. Washington, DC: American Psychological Association.doi:10.1037/e685262012-108.
  58. ^Flory, Paul J. (1940). "Molecular Size Distribution in Ethylene Oxide Polymers".Journal of the American Chemical Society.62 (6):1561–1565.Bibcode:1940JAChS..62.1561F.doi:10.1021/ja01863a066.
  59. ^Dwyer, Barry (23 March 2016).Systems Analysis and Synthesis: Bridging Computer Science and Information Technology. Morgan Kaufmann.ISBN 978-0-12-805449-9.Similarly, if candidates arrive at an enrolment centeruniformly, the times between their arrivals will be distributedexponentially, and the number of candidates arriving each hour will follow aPoisson distribution.
  60. ^Lomnitz, Cinna (1994).Fundamentals of Earthquake Prediction. New York, NY: John Wiley & Sons.ISBN 0-471-57419-8.OCLC 647404423.
  61. ^"Poisson Distribution and Radiological Measurement".www.hko.gov.hk. Retrieved30 September 2025.The actual number of decays over a period of time is generally described by the Poisson distribution.
  62. ^a student (1907)."On the error of counting with a haemacytometer".Biometrika.5 (3):351–360.doi:10.2307/2331633.JSTOR 2331633.
  63. ^Boland, Philip J. (1984). "A biographical glimpse of William Sealy Gosset".The American Statistician.38 (3):179–183.doi:10.1080/00031305.1984.10483195.JSTOR 2683648.
  64. ^Erlang, Agner K. (1909). "Sandsynlighedsregning og Telefonsamtaler" [Probability Calculation and Telephone Conversations].Nyt Tidsskrift for Matematik (in Danish).20 (B):33–39.JSTOR 24528622.
  65. ^Hornby, Dave (2014)."Football Prediction Model: Poisson Distribution". Sports Betting Online. Retrieved19 September 2014.
  66. ^Campbell, Michael J.; Jacques, Richard M. (13 February 2023).Statistics at Square Two. John Wiley & Sons.ISBN 978-1-119-40136-0.The expected values are given by the Poisson model.
  67. ^Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu (2016). "Do bacterial cell numbers follow a theoretical Poisson distribution? Comparison of experimentally obtained numbers of single cells with random number generation via computer simulation".Food Microbiology.60:49–53.doi:10.1016/j.fm.2016.05.019.PMID 27554145.
  68. ^The Senses: A Comprehensive Reference. Academic Press. 30 September 2020.ISBN 978-0-12-805409-3.The division of light into discrete photons means that nominally constant light sources – e.g. a light bulb or a reflecting object in a scene – will produce visual inputs that vary randomly over time. These variations are described by Poisson statistics.
  69. ^Clarke, R. D. (1946)."An application of the Poisson distribution"(PDF).Journal of the Institute of Actuaries.72 (3): 481.doi:10.1017/S0020268100035435.
  70. ^Hardy, Godfrey H.;Littlewood, John E. (1923)."On some problems of "partitio numerorum" III: On the expression of a number as a sum of primes".Acta Mathematica.44:1–70.doi:10.1007/BF02403921.
  71. ^Gallagher, Patrick X. (1976). "On the distribution of primes in short intervals".Mathematika.23 (1):4–9.doi:10.1112/s0025579300016442.
  72. ^Cameron, A. Colin; Trivedi, Pravin K. (1998).Regression Analysis of Count Data. Cambridge, UK: Cambridge University Press.ISBN 978-0-521-63567-7.
  73. ^Edgeworth, F.Y. (1913)."On the use of the theory of probabilities in statistics relating to society".Journal of the Royal Statistical Society.76 (2):165–193.doi:10.2307/2340091.JSTOR 2340091.
  74. ^Katz, B.; Miledi, R. (August 1972)."The statistical nature of the acetylcholine potential and its molecular components".The Journal of Physiology.224 (3):665–699.doi:10.1113/jphysiol.1972.sp009918.ISSN 0022-3751.PMC 1331515.PMID 5071933.
  75. ^abNelson, Philip Charles; Bromberg, Sarina; Hermundstad, Ann; Prentice, Jason (2015).Physical models of living systems. New York, NY: W.H. Freeman & Company, a Macmillan Education Imprint.ISBN 978-1-4641-4029-7.OCLC 891121698.
  76. ^Foster, Patricia L. (1 January 2006), "Methods for Determining Spontaneous Mutation Rates",DNA Repair, Part B, Methods in Enzymology, vol. 409, Academic Press, pp. 195–213,doi:10.1016/S0076-6879(05)09012-9,ISBN 978-0-12-182814-1,PMC 2041832,PMID 16793403
  77. ^"Wolfram Language: PoissonDistribution reference page".wolfram.com. Retrieved8 April 2016.
  78. ^"Wolfram Language: MultivariatePoissonDistribution reference page".wolfram.com. Retrieved8 April 2016.
  79. ^Knuth, Donald Ervin (1997).Seminumerical Algorithms.The Art of Computer Programming. Vol. 2 (3rd ed.).Addison Wesley.ISBN 978-0-201-89684-8.
  80. ^Devroye, Luc (1986)."Discrete Univariate Distributions"(PDF).Non-Uniform Random Variate Generation. New York, NY: Springer-Verlag. pp. 485–553.doi:10.1007/978-1-4613-8643-8_10.ISBN 978-1-4613-8645-2.
