Probability mass function
The horizontal axis is the index k, the number of occurrences. λ is the expected rate of occurrences. The vertical axis is the probability of k occurrences given λ. The function is defined only at integer values of k; the connecting lines are only guides for the eye.
Cumulative distribution function
The horizontal axis is the index k, the number of occurrences. The CDF is discontinuous at the integers of k and flat everywhere else because a variable that is Poisson distributed takes on only integer values.
In probability theory and statistics, the Poisson distribution (/ˈpwɑːsɒn/) is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event.[1] It can also be used for the number of events in other types of intervals than time, and in dimensions greater than one (e.g., the number of events in a given area or volume). The Poisson distribution is named after the French mathematician Siméon Denis Poisson. It plays an important role for discrete-stable distributions.
Under a Poisson distribution with the expectation of λ events in a given interval, the probability of k events in the same interval is:[2]: 60

$P(k) = \frac{\lambda^k e^{-\lambda}}{k!}.$

For instance, consider a call center which receives an average of λ = 3 calls per minute at all times of day. If the number of calls received in any two given disjoint time intervals is independent, then the number k of calls received during any minute has a Poisson probability distribution. Receiving k = 1 to 4 calls then has a probability of about 0.77, while receiving 0 or at least 5 calls has a probability of about 0.23.
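As a quick check of these numbers, the probabilities can be computed directly from the probability mass function. A minimal sketch using scipy.stats (one possible tool, not prescribed by the text above):

```python
from scipy.stats import poisson

lam = 3  # average rate: 3 calls per minute

# P(1 <= k <= 4): sum of the PMF over k = 1..4
p_1_to_4 = sum(poisson.pmf(k, lam) for k in range(1, 5))

# P(k = 0 or k >= 5): the complement
p_rest = 1 - p_1_to_4

print(f"P(1 <= k <= 4) = {p_1_to_4:.3f}")  # about 0.77
print(f"P(k=0 or k>=5) = {p_rest:.3f}")    # about 0.23
```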
A classic example used to motivate the Poisson distribution is the number of radioactive decay events during a fixed observation period.[3]
History
The distribution was first introduced by Siméon Denis Poisson (1781–1840) and published together with his probability theory in his work Recherches sur la probabilité des jugements en matière criminelle et en matière civile (1837).[4]: 205–207 The work theorized about the number of wrongful convictions in a given country by focusing on certain random variables N that count, among other things, the number of discrete occurrences (sometimes called "events" or "arrivals") that take place during a time interval of given length. The result had already been given in 1711 by Abraham de Moivre in De Mensura Sortis seu; de Probabilitate Eventuum in Ludis a Casu Fortuito Pendentibus.[5]: 219 [6]: 14–15 [7]: 193 [8]: 157 This makes it an example of Stigler's law and it has prompted some authors to argue that the Poisson distribution should bear the name of de Moivre.[9][10]
In 1860, Simon Newcomb fitted the Poisson distribution to the number of stars found in a unit of space.[11] A further practical application was made by Ladislaus Bortkiewicz in 1898. Bortkiewicz showed that the frequency with which soldiers in the Prussian army were accidentally killed by horse kicks could be well modeled by a Poisson distribution.[12]: 23–25
The Poisson distribution can be applied to systems with a large number of possible events, each of which is rare. The number of such events that occur during a fixed time interval is, under the right circumstances, a random number with a Poisson distribution.
The equation can be adapted if, instead of the average number of events $\lambda$, we are given the average rate $r$ at which events occur. Then $\lambda = rt$, and:[14]

$P(k \text{ events in interval } t) = \frac{(rt)^k e^{-rt}}{k!}.$
Examples
Chewing gum on a sidewalk. The number of pieces on a single tile is approximately Poisson distributed.
The Poisson distribution may be useful to model events such as:
the number of meteorites greater than one meter in diameter that strike Earth in a year;
the number of laser photons hitting a detector in a particular time interval;
the number of students achieving a low and high mark in an exam; and
locations of defects and dislocations in materials.
Examples of the occurrence of random points in space are: the locations of asteroid impacts with Earth (2-dimensional), the locations of imperfections in a material (3-dimensional), and the locations of trees in a forest (2-dimensional).[15]
Assumptions and validity
The Poisson distribution is an appropriate model if the following assumptions are true:
k, a nonnegative integer, is the number of times an event occurs in an interval.
The occurrence of one event does not affect the probability of a second event; that is, events occur independently.
The average rate at which events occur is independent of any occurrences.
Two events cannot occur at exactly the same instant.
If these conditions are true, then k is a Poisson random variable; the distribution of k is a Poisson distribution.
The Poisson distribution is also the limit of a binomial distribution, for which the probability of success for each trial is $\lambda/n$, where $\lambda$ is the expectation and $n$ is the number of trials, in the limit that $n \to \infty$ with $\lambda$ kept constant[16][17] (see Related distributions):

$\lim_{n\to\infty} \binom{n}{k}\left(\frac{\lambda}{n}\right)^{k}\left(1-\frac{\lambda}{n}\right)^{n-k} = \frac{\lambda^{k} e^{-\lambda}}{k!}.$

The Poisson distribution may also be derived from the differential equations[18][19][20]

$\frac{dP_k(t)}{dt} = \lambda\bigl(P_{k-1}(t) - P_k(t)\bigr), \qquad \frac{dP_0(t)}{dt} = -\lambda P_0(t),$

with initial conditions $P_0(0) = 1$, $P_k(0) = 0$ for $k \ge 1$, and evaluated at $t = 1$.
Examples of probability for Poisson distributions
On a particular river, overflow floods occur once every 100 years on average. Calculate the probability of k = 0, 1, 2, 3, 4, 5, or 6 overflow floods in a 100-year interval, assuming the Poisson model is appropriate.
Because the average event rate is one overflow flood per 100 years, λ = 1, and

$P(k \text{ overflow floods in 100 years}) = \frac{\lambda^k e^{-\lambda}}{k!} = \frac{e^{-1}}{k!}.$
k | P(k overflow floods in 100 years)
0 | 0.368
1 | 0.368
2 | 0.184
3 | 0.061
4 | 0.015
5 | 0.003
6 | 0.0005

The probability for 0 to 6 overflow floods in a 100-year period.
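These table values follow directly from the probability mass function. A short sketch using only Python's standard library reproduces them; substituting lam = 2.5 reproduces the World Cup table below:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Poisson probability mass function P(k; lambda)."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam = 1.0  # one overflow flood per 100 years on average
for k in range(7):
    print(f"P({k} floods in 100 years) = {poisson_pmf(k, lam):.4f}")
```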
In this example, it is reported that the average number of goals in a World Cup soccer match is approximately 2.5 and the Poisson model is appropriate.[21] Because the average event rate is 2.5 goals per match, λ = 2.5.
k | P(k goals in a World Cup soccer match)
0 | 0.082
1 | 0.205
2 | 0.257
3 | 0.213
4 | 0.133
5 | 0.067
6 | 0.028
7 | 0.010

The probability for 0 to 7 goals in a match.
Examples that violate the Poisson assumptions
The number of students who arrive at the student union per minute will likely not follow a Poisson distribution, because the rate is not constant (low rate during class time, high rate between class times) and the arrivals of individual students are not independent (students tend to come in groups). The non-constant arrival rate may be modeled as a mixed Poisson distribution, and the arrival of groups rather than individual students as a compound Poisson process.
The number of magnitude 5 earthquakes per year in a country may not follow a Poisson distribution if one large earthquake increases the probability of aftershocks of similar magnitude.
Examples in which at least one event is guaranteed are not Poisson distributed, but may be modeled using a zero-truncated Poisson distribution.
Count distributions in which the number of intervals with zero events is higher than predicted by a Poisson model may be modeled using a zero-inflated model.
The mode of a Poisson-distributed random variable with non-integer λ is equal to $\lfloor\lambda\rfloor$, which is the largest integer less than or equal to λ. This is also written as floor(λ). When λ is a positive integer, the modes are λ and λ − 1.
All of the cumulants of the Poisson distribution are equal to the expected value λ. The n-th factorial moment of the Poisson distribution is $\lambda^n$.
The expected value of a Poisson process is sometimes decomposed into the product of intensity and exposure (or more generally expressed as the integral of an "intensity function" over time or space, sometimes described as "exposure").[22]
Median
Bounds for the median ($\nu$) of the distribution are known and are sharp:[23]

$\lambda - \ln 2 \le \nu < \lambda + \frac{1}{3}.$
If $X_i \sim \operatorname{Pois}(\lambda_i)$ for $i = 1, \dots, n$ are independent, then $\sum_{i=1}^n X_i \sim \operatorname{Pois}\left(\sum_{i=1}^n \lambda_i\right)$.[26]: 65 A converse is Raikov's theorem, which says that if the sum of two independent random variables is Poisson-distributed, then so is each of those two independent random variables.[27][28]
Maximum entropy
It is a maximum-entropy distribution among the set of generalized binomial distributions $B_n(\lambda)$ with mean $\lambda$ and $n \to \infty$,[29] where a generalized binomial distribution is defined as a distribution of the sum of N independent but not identically distributed Bernoulli variables.
Bounds for the tail probabilities of a Poisson random variable $X \sim \operatorname{Pois}(\lambda)$ can be derived using a Chernoff bound argument:[32]: 97–98

$P(X \ge x) \le \frac{e^{-\lambda}(e\lambda)^x}{x^x}, \quad \text{for } x > \lambda,$

$P(X \le x) \le \frac{e^{-\lambda}(e\lambda)^x}{x^x}, \quad \text{for } x < \lambda.$
The upper tail probability can be tightened (by a factor of at least two) as follows:[33]

$P(X \ge x) \le \frac{e^{-\operatorname{D_{KL}}(x \parallel \lambda)}}{\max\left(2, \sqrt{4\pi \operatorname{D_{KL}}(x \parallel \lambda)}\right)}, \quad \text{for } x > \lambda,$

where $\operatorname{D_{KL}}(x \parallel \lambda)$ is the Kullback–Leibler divergence of $\operatorname{Pois}(x)$ from $\operatorname{Pois}(\lambda)$.
Inequalities that relate the distribution function of a Poisson random variable $X \sim \operatorname{Pois}(\lambda)$ to the standard normal distribution function $\Phi(x)$ are as follows:[34]

$\Phi\left(\operatorname{sign}(k - \lambda)\sqrt{2\operatorname{D_{KL}}(k \parallel \lambda)}\right) < P(X \le k) < \Phi\left(\operatorname{sign}(k + 1 - \lambda)\sqrt{2\operatorname{D_{KL}}(k+1 \parallel \lambda)}\right), \quad \text{for } k > 0,$

where $\operatorname{D_{KL}}(k \parallel \lambda)$ is the Kullback–Leibler divergence of $\operatorname{Pois}(k)$ from $\operatorname{Pois}(\lambda)$ and $\operatorname{D_{KL}}(k+1 \parallel \lambda)$ is the Kullback–Leibler divergence of $\operatorname{Pois}(k+1)$ from $\operatorname{Pois}(\lambda)$.
Poisson races
Let $X \sim \operatorname{Pois}(\lambda)$ and $Y \sim \operatorname{Pois}(\mu)$ be independent random variables, with $\lambda < \mu$; then we have that

$\frac{e^{-(\sqrt{\mu} - \sqrt{\lambda})^2}}{(\lambda + \mu)^2} - \frac{e^{-(\lambda + \mu)}}{2\sqrt{\lambda\mu}} - \frac{e^{-(\lambda + \mu)}}{4\lambda\mu} \le P(X - Y \ge 0) \le e^{-(\sqrt{\mu} - \sqrt{\lambda})^2}.$
The upper bound is proved using a standard Chernoff bound.
The lower bound can be proved by noting that $P(X - Y \ge 0 \mid X + Y = i)$ is the probability that $Z \ge \frac{i}{2}$, where $Z \sim \operatorname{Bin}\left(i, \frac{\lambda}{\lambda + \mu}\right)$, which is bounded below by $\frac{1}{(i+1)^2} e^{-i D\left(0.5 \,\|\, \frac{\lambda}{\lambda + \mu}\right)}$, where $D$ is relative entropy (see the entry on bounds on tails of binomial distributions for details). Further noting that $X + Y \sim \operatorname{Pois}(\lambda + \mu)$, and computing a lower bound on the unconditional probability gives the result. More details can be found in the appendix of Kamath et al.[35]
Related distributions
As a Binomial distribution with infinitesimal time-steps
The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity and the expected number of successes remains fixed — see law of rare events below. Therefore, it can be used as an approximation of the binomial distribution if n is sufficiently large and p is sufficiently small. The Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and p is smaller than or equal to 0.05, and an excellent approximation if n ≥ 100 and np ≤ 10.[36] Letting $F_\mathrm{B}$ and $F_\mathrm{P}$ be the respective cumulative distribution functions of the binomial and Poisson distributions, one has:

$F_\mathrm{B}(k;\, n,\, p) \approx F_\mathrm{P}(k;\, \lambda = np).$

One derivation of this uses probability-generating functions.[37] Consider a Bernoulli trial (coin-flip) whose probability of one success (or expected number of successes) is $\lambda$ within a given interval. Split the interval into n parts, and perform a trial in each subinterval with probability $\frac{\lambda}{n}$. The probability of k successes out of n trials over the entire interval is then given by the binomial distribution

$p_k = \binom{n}{k}\left(\frac{\lambda}{n}\right)^k\left(1 - \frac{\lambda}{n}\right)^{n-k},$

whose generating function is:

$P(x) = \sum_{k=0}^{n} p_k x^k = \left(1 - \frac{\lambda}{n} + \frac{\lambda}{n}x\right)^n.$

Taking the limit as n increases to infinity (with x fixed) and applying the product limit definition of the exponential function, this reduces to the generating function of the Poisson distribution:

$\lim_{n\to\infty} \left(1 + \frac{\lambda(x - 1)}{n}\right)^n = e^{\lambda(x-1)}.$
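The convergence can be checked numerically. A sketch using scipy.stats, with λ = 5 chosen arbitrarily for illustration:

```python
from scipy.stats import binom, poisson

lam = 5.0
for n in (10, 20, 100, 1000):
    p = lam / n
    # Total variation distance between Binomial(n, p) and Poisson(lam);
    # the binomial has no mass above n, so add the Poisson tail there.
    tail = poisson.sf(n, lam)
    tv = 0.5 * (sum(abs(binom.pmf(k, n, p) - poisson.pmf(k, lam))
                    for k in range(n + 1)) + tail)
    print(f"n = {n:5d}: total variation distance = {tv:.5f}")
```

The distance shrinks as n grows with np fixed, matching the rule of thumb quoted above.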
If $X \sim \operatorname{Pois}(\lambda)$ and $Y \sim \operatorname{Pois}(\mu)$ are independent, then the distribution of $X$ conditional on $X + Y$ is a binomial distribution. Specifically, if $X + Y = k$, then $X \mid X + Y = k \sim \operatorname{Bin}\left(k, \frac{\lambda}{\lambda + \mu}\right)$. More generally, if X1, X2, ..., Xn are independent Poisson random variables with parameters λ1, λ2, ..., λn then given $\sum_{j=1}^n X_j = k$ it follows that $X_i \mid \sum_{j=1}^n X_j = k \sim \operatorname{Bin}\left(k, \frac{\lambda_i}{\sum_{j=1}^n \lambda_j}\right)$. In fact, $(X_1, \dots, X_n) \mid \sum_{j=1}^n X_j = k$ follows a multinomial distribution.
If $X \sim \operatorname{Pois}(\lambda)$ and the distribution of $Y$ conditional on X = k is a binomial distribution, $Y \mid X = k \sim \operatorname{Bin}(k, p)$, then the distribution of Y follows a Poisson distribution $Y \sim \operatorname{Pois}(\lambda p)$. In fact, if, conditional on $X = k$, $(Y_1, \dots, Y_m)$ follows a multinomial distribution $\operatorname{Mult}(k, p_1, \dots, p_m)$, then each $Y_i$ follows an independent Poisson distribution $Y_i \sim \operatorname{Pois}(\lambda p_i)$.
The Poisson distribution is a special case of the discrete compound Poisson distribution (or stuttering Poisson distribution) with only a single parameter.[38][39] The discrete compound Poisson distribution can be deduced from the limiting distribution of the univariate multinomial distribution. It is also a special case of a compound Poisson distribution.
For sufficiently large values of λ (say λ > 1000), the normal distribution with mean λ and variance λ (standard deviation $\sqrt{\lambda}$) is an excellent approximation to the Poisson distribution. If λ is greater than about 10, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., if P(X ≤ x), where x is a non-negative integer, is replaced by P(X ≤ x + 0.5).
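The effect of the continuity correction is easy to see numerically. A sketch with scipy.stats, where λ = 15 and x = 18 are arbitrary illustrative choices:

```python
import math
from scipy.stats import norm, poisson

lam = 15.0
x = 18  # evaluate P(X <= 18)

exact = poisson.cdf(x, lam)
plain = norm.cdf((x - lam) / math.sqrt(lam))            # no correction
corrected = norm.cdf((x + 0.5 - lam) / math.sqrt(lam))  # continuity correction

print(f"exact     = {exact:.4f}")
print(f"plain     = {plain:.4f}")
print(f"corrected = {corrected:.4f}")  # noticeably closer to exact
```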
If for every t > 0 the number of arrivals in the time interval [0, t] follows the Poisson distribution with mean λt, then the sequence of inter-arrival times are independent and identically distributed exponential random variables having mean 1/λ.[42]: 317–319
The marginal distributions are Poisson(θ1) and Poisson(θ2) and the correlation coefficient is limited to the range

$0 \le \rho \le \min\left(\sqrt{\frac{\theta_1}{\theta_2}}, \sqrt{\frac{\theta_2}{\theta_1}}\right).$
A simple way to generate a bivariate Poisson distribution $(X_1, X_2)$ is to take three independent Poisson distributions $Y_1, Y_2, Y_3$ with means $\lambda_1, \lambda_2, \lambda_3$ and then set $X_1 = Y_1 + Y_3$, $X_2 = Y_2 + Y_3$. The probability function of the bivariate Poisson distribution is

$P(X_1 = k_1, X_2 = k_2) = e^{-(\lambda_1 + \lambda_2 + \lambda_3)} \frac{\lambda_1^{k_1}}{k_1!} \frac{\lambda_2^{k_2}}{k_2!} \sum_{k=0}^{\min(k_1, k_2)} \binom{k_1}{k} \binom{k_2}{k} k! \left(\frac{\lambda_3}{\lambda_1 \lambda_2}\right)^k.$
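This trivariate-reduction construction translates directly into a sampler. A sketch using numpy, with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

def bivariate_poisson(lam1, lam2, lam3, size):
    """Sample (X1, X2) with X1 = Y1 + Y3, X2 = Y2 + Y3."""
    y1 = rng.poisson(lam1, size)
    y2 = rng.poisson(lam2, size)
    y3 = rng.poisson(lam3, size)  # shared component induces the correlation
    return y1 + y3, y2 + y3

x1, x2 = bivariate_poisson(2.0, 3.0, 1.0, 100_000)
# Empirical correlation, about lam3 / sqrt((lam1+lam3)(lam2+lam3)) ~ 0.29 here
print(np.corrcoef(x1, x2)[0, 1])
```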
In other words, let $X_N$ be random variables so that $X_N$ has value $\alpha$ with probability $\frac{\lambda}{N}$ and value 0 with the remaining probability. Assume also that the family $X_1, X_2, \dots$ are freely independent. Then the limit as $N \to \infty$ of the law of $X_1 + \cdots + X_N$ is given by the Free Poisson law with parameters $\lambda, \alpha$.
This definition is analogous to one of the ways in which the classical Poisson distribution is obtained from a (classical) Poisson process.
The measure associated to the free Poisson law is given by[46]

$\mu = \begin{cases} (1 - \lambda)\,\delta_0 + \nu, & \text{if } 0 \le \lambda \le 1, \\ \nu, & \text{if } \lambda > 1, \end{cases}$

where

$\nu = \frac{1}{2\pi\alpha t}\sqrt{4\lambda\alpha^2 - (t - \alpha(1 + \lambda))^2}\, dt$

and has support $\left[\alpha(1 - \sqrt{\lambda})^2, \alpha(1 + \sqrt{\lambda})^2\right]$.
We give values of some important transforms of the free Poisson law; the computation can be found, e.g., in the book Lectures on the Combinatorics of Free Probability by A. Nica and R. Speicher.[47]
The R-transform of the free Poisson law is given by

$R(z) = \frac{\lambda\alpha}{1 - \alpha z}.$
Given a sample of n measured values $k_i \in \{0, 1, 2, \dots\}$, for i = 1, ..., n, we wish to estimate the value of the parameter λ of the Poisson population from which the sample was drawn. The maximum likelihood estimate is[48]

$\hat{\lambda}_\mathrm{MLE} = \frac{1}{n}\sum_{i=1}^{n} k_i.$
Since each observation has expectation λ, so does the sample mean. Therefore, the maximum likelihood estimate is an unbiased estimator of λ. It is also an efficient estimator since its variance achieves the Cramér–Rao lower bound (CRLB).[49] Hence it is minimum-variance unbiased. Also it can be proven that the sum (and hence the sample mean, as it is a one-to-one function of the sum) is a complete and sufficient statistic for λ.
To prove sufficiency we may use the factorization theorem. Consider partitioning the probability mass function of the joint Poisson distribution for the sample into two parts: one that depends solely on the sample $\mathbf{x}$, called $h(\mathbf{x})$, and one that depends on the parameter $\lambda$ and the sample $\mathbf{x}$ only through the function $T(\mathbf{x})$. Then $T(\mathbf{x})$ is a sufficient statistic for $\lambda$.

$P(\mathbf{x}) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} = \frac{1}{\prod_{i=1}^{n} x_i!} \times \lambda^{\sum_{i=1}^{n} x_i} e^{-n\lambda}.$

The first term depends only on $\mathbf{x}$. The second term depends on the sample only through $T(\mathbf{x}) = \sum_{i=1}^{n} x_i$. Thus, $T(\mathbf{x})$ is sufficient.
To find the parameter λ that maximizes the probability function for the Poisson population, we can use the logarithm of the likelihood function:

$L(\lambda) = \ln \prod_{i=1}^{n} f(k_i \mid \lambda) = \sum_{i=1}^{n} \ln\left(\frac{e^{-\lambda}\lambda^{k_i}}{k_i!}\right) = -n\lambda + \left(\sum_{i=1}^{n} k_i\right)\ln\lambda - \sum_{i=1}^{n} \ln(k_i!).$

We take the derivative of $L$ with respect to λ and compare it to zero:

$\frac{dL}{d\lambda} = -n + \left(\sum_{i=1}^{n} k_i\right)\frac{1}{\lambda} = 0.$

Solving for λ gives a stationary point:

$\lambda = \frac{\sum_{i=1}^{n} k_i}{n}.$

So λ is the average of the ki values. Obtaining the sign of the second derivative of L at the stationary point will determine what kind of extreme value λ is.

Evaluating the second derivative at the stationary point gives:

$\frac{d^2 L}{d\lambda^2} = -\lambda^{-2}\sum_{i=1}^{n} k_i = -\frac{n}{\bar{k}},$

which is the negative of n times the reciprocal of the average of the ki. This expression is negative when the average is positive. If this is satisfied, then the stationary point maximizes the probability function.
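The whole derivation collapses to taking a sample mean in code. A minimal sketch with hypothetical counts, standard library only, that also confirms the maximum by a coarse grid search:

```python
import math
import statistics

sample = [2, 1, 3, 0, 2, 4, 1, 2]  # hypothetical Poisson counts

def log_likelihood(lam: float) -> float:
    """L(lam) = -n*lam + (sum k_i) ln(lam) - sum ln(k_i!)."""
    n, s = len(sample), sum(sample)
    return -n * lam + s * math.log(lam) - sum(math.lgamma(k + 1) for k in sample)

lambda_mle = statistics.mean(sample)  # closed-form MLE: the sample mean
grid_best = max((0.01 * i for i in range(1, 1000)), key=log_likelihood)
print(lambda_mle, grid_best)  # both close to 1.875
```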
For completeness, a family of distributions is said to be complete if and only if $E(g(T)) = 0$ implies that $P_\lambda(g(T) = 0) = 1$ for all $\lambda$. If the individual $X_i$ are iid $\operatorname{Pois}(\lambda)$, then $T(\mathbf{x}) = \sum_{i=1}^{n} X_i \sim \operatorname{Pois}(n\lambda)$. Knowing the distribution we want to investigate, it is easy to see that the statistic is complete:

$E(g(T)) = \sum_{t=0}^{\infty} g(t) \frac{(n\lambda)^t e^{-n\lambda}}{t!} = 0.$

For this equality to hold, $g(t)$ must be 0. This follows from the fact that none of the other terms will be 0 for all $t$ in the sum and for all possible values of $\lambda$. Hence, $E(g(T)) = 0$ for all $\lambda$ implies that $P_\lambda(g(T) = 0) = 1$, and the statistic has been shown to be complete.
Confidence interval
The confidence interval for the mean of a Poisson distribution can be expressed using the relationship between the cumulative distribution functions of the Poisson and chi-squared distributions. The chi-squared distribution is itself closely related to the gamma distribution, and this leads to an alternative expression. Given an observation k from a Poisson distribution with mean μ, a confidence interval for μ with confidence level 1 – α is

$\tfrac{1}{2}\chi^2(\alpha/2;\, 2k) \le \mu \le \tfrac{1}{2}\chi^2(1 - \alpha/2;\, 2k + 2),$

or equivalently,

$F^{-1}(\alpha/2;\, k,\, 1) \le \mu \le F^{-1}(1 - \alpha/2;\, k + 1,\, 1),$

where $\chi^2(p;\, n)$ is the quantile function (corresponding to a lower tail area p) of the chi-squared distribution with n degrees of freedom and $F^{-1}(p;\, n,\, 1)$ is the quantile function of a gamma distribution with shape parameter n and scale parameter 1.[8]: 176–178 [50] This interval is 'exact' in the sense that its coverage probability is never less than the nominal 1 – α.

When quantiles of the gamma distribution are not available, an accurate approximation to this exact interval has been proposed (based on the Wilson–Hilferty transformation):[51]

$k\left(1 - \frac{1}{9k} - \frac{z_{\alpha/2}}{3\sqrt{k}}\right)^3 \le \mu \le (k + 1)\left(1 - \frac{1}{9(k+1)} + \frac{z_{\alpha/2}}{3\sqrt{k+1}}\right)^3,$

where $z_{\alpha/2}$ denotes the standard normal deviate with upper tail area α / 2.

For application of these formulae in the same context as above (given a sample of n measured values ki each drawn from a Poisson distribution with mean λ), one would set

$k = \sum_{i=1}^{n} k_i,$

calculate an interval for μ = nλ, and then derive the interval for λ.
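The exact interval transcribes directly into code. A sketch using scipy.stats, where chi2.ppf is the chi-squared quantile function appearing above:

```python
from scipy.stats import chi2

def poisson_ci(k: int, alpha: float = 0.05):
    """Exact confidence interval for a Poisson mean, given an observed count k."""
    lower = 0.0 if k == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * k)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * k + 2)
    return lower, upper

print(poisson_ci(10))  # 95% interval for mu given an observed count of 10
```

For a sample of n observations, apply the same function to k = sum(k_i) and divide both endpoints by n to obtain the interval for λ.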
Then, given the same sample of n measured values ki as before, and a prior of Gamma(α, β), the posterior distribution is

$\lambda \sim \operatorname{Gamma}\left(\alpha + \sum_{i=1}^{n} k_i,\; \beta + n\right).$
Note that the posterior mean is linear and is given by

$E[\lambda \mid k_1, \dots, k_n] = \frac{\alpha + \sum_{i=1}^{n} k_i}{\beta + n}.$

It can be shown that the gamma distribution is the only prior that induces linearity of the conditional mean. Moreover, a converse result exists which states that if the conditional mean is close to a linear function, then the prior distribution of λ must be close to a gamma distribution in Lévy distance.[53]
The posterior mean E[λ] approaches the maximum likelihood estimate $\hat{\lambda}_\mathrm{MLE}$ in the limit as $\alpha \to 0,\ \beta \to 0$, which follows immediately from the general expression of the mean of the gamma distribution.
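The conjugate update is a one-line computation. A sketch with hypothetical counts and an arbitrary Gamma(1, 1) prior (shape-rate parametrization):

```python
sample = [2, 1, 3, 0, 2, 4, 1, 2]   # hypothetical Poisson counts
alpha_prior, beta_prior = 1.0, 1.0  # Gamma(1, 1) prior, an arbitrary choice

# Posterior: Gamma(alpha + sum(k_i), beta + n)
alpha_post = alpha_prior + sum(sample)
beta_post = beta_prior + len(sample)

posterior_mean = alpha_post / beta_post
print(f"posterior mean = {posterior_mean:.3f}")  # shrinks toward the MLE as n grows
```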
Suppose $X_1, X_2, \dots, X_p$ is a set of independent random variables from a set of p Poisson distributions, each with a parameter $\lambda_i$, $i = 1, \dots, p$, and we would like to estimate these parameters. Then, Clevenson and Zidek show that under the normalized squared error loss $L(\lambda, \hat{\lambda}) = \sum_{i=1}^{p} \lambda_i^{-1}(\hat{\lambda}_i - \lambda_i)^2$, when $p > 1$, then, similarly as in Stein's example for the normal means, the MLE estimator $\hat{\lambda}_i = X_i$ is inadmissible.[55]
biology: the number of mutations on a strand of DNA per unit length,
management: customers arriving at a counter or call centre,[59]
finance and insurance: number of losses or claims occurring in a given period of time,
seismology: asymptotic Poisson model of risk for large earthquakes,[60]
radioactivity: decays in a given time interval in a radioactive sample,[61]
optics: the number of photons emitted in a single laser pulse (a major vulnerability of quantum key distribution protocols, known as photon number splitting).
More examples of counting events that may be modelled as Poisson processes include:
soldiers killed by horse-kicks each year in each corps in the Prussian cavalry. This example was used in a book by Ladislaus Bortkiewicz (1868–1931),[12]: 23–25
Comparison of the Poisson distribution (black lines) and the binomial distribution with n = 10 (red circles), n = 20 (blue circles), n = 1000 (green circles). All distributions have a mean of 5. The horizontal axis shows the number of events k. As n gets larger, the Poisson distribution becomes an increasingly better approximation for the binomial distribution with the same mean.
The rate of an event is related to the probability of an event occurring in some small subinterval (of time, space or otherwise). In the case of the Poisson distribution, one assumes that there exists a small enough subinterval for which the probability of an event occurring twice is "negligible". With this assumption one can derive the Poisson distribution from the binomial one, given only the information of expected number of total events in the whole interval.
Let the total number of events in the whole interval be denoted by $\lambda$. Divide the whole interval into $n$ subintervals $I_1, \dots, I_n$ of equal size, such that $n > \lambda$ (since we are interested in only very small portions of the interval this assumption is meaningful). This means that the expected number of events in each of the n subintervals is equal to $\lambda/n$.
Now we assume that the occurrence of an event in the whole interval can be seen as a sequence of n Bernoulli trials, where the $i$-th Bernoulli trial corresponds to looking whether an event happens at the subinterval $I_i$ with probability $\lambda/n$. The expected number of total events in $n$ such trials would be $\lambda$, the expected number of total events in the whole interval. Hence for each subdivision of the interval we have approximated the occurrence of the event as a Bernoulli process of the form $\operatorname{B}(n, \lambda/n)$. As we have noted before we want to consider only very small subintervals. Therefore, we take the limit as $n$ goes to infinity.
In several of the above examples — such as the number of mutations in a given sequence of DNA — the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the binomial distribution, that is $X \sim \operatorname{B}(n, p)$.
In such cases n is very large and p is very small (and so the expectation np is of intermediate magnitude). Then the distribution may be approximated by the less cumbersome Poisson distribution $X \sim \operatorname{Pois}(np)$.
This approximation is sometimes known as the law of rare events,[72]: 5 since each of the n individual Bernoulli events rarely occurs.
The name "law of rare events" may be misleading because the total count of success events in a Poisson process need not be rare if the parameternp is not small. For example, the number of telephone calls to a busy switchboard in one hour follows a Poisson distribution with the events appearing frequent to the operator, but they are rare from the point of view of the average member of the population who is very unlikely to make a call to that switchboard in that hour.
The variance of the binomial distribution is 1 − p times that of the Poisson distribution, so the two are almost equal when p is very small.
The word law is sometimes used as a synonym of probability distribution, and convergence in law means convergence in distribution. Accordingly, the Poisson distribution is sometimes called the "law of small numbers" because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. The Law of Small Numbers is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898.[12][73]
The Poisson distribution arises as the number of points of a Poisson point process located in some finite region. More specifically, if D is some region space, for example Euclidean space Rd, for which |D|, the area, volume or, more generally, the Lebesgue measure of the region, is finite, and if N(D) denotes the number of points in D, then

$P(N(D) = k) = \frac{(\lambda|D|)^k e^{-\lambda|D|}}{k!}.$
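Simulating a homogeneous Poisson point process follows directly from this definition: draw the count from a Poisson distribution, then place that many points uniformly in the region. A sketch using numpy, where the unit square and the intensity are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)

intensity = 50.0  # expected points per unit area (lambda)
area = 1.0        # |D| for the unit square [0,1] x [0,1]

n_points = rng.poisson(intensity * area)             # N(D) ~ Pois(lambda |D|)
points = rng.uniform(0.0, 1.0, size=(n_points, 2))   # uniform locations given N(D)
print(n_points, points[:3])
```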
Poisson regression and negative binomial regression
Poisson regression and negative binomial regression are useful for analyses where the dependent (response) variable is the count (0, 1, 2, ...) of the number of events or occurrences in an interval.
Biology
The Luria–Delbrück experiment tested against the hypothesis of Lamarckian evolution, which should result in a Poisson distribution.
Katz and Miledi measured the membrane potential with and without the presence of acetylcholine (ACh).[74] When ACh is present, ion channels on the membrane are open randomly at a small fraction of the time. As there are a large number of ion channels, each open for a small fraction of the time, the total number of ion channels open at any moment is Poisson distributed. When ACh is not present, effectively no ion channels are open. Subtracting the effect of noise, Katz and Miledi used the measured mean and variance of the membrane potential to estimate the contribution of a single ion channel. (pp. 94–95 [75])
During each cellular replication event, the number of mutations is roughly Poisson distributed.[76] For example, the HIV virus has 10,000 base pairs and a mutation rate of about 1 per 30,000 base pairs, meaning the number of mutations per replication event is distributed as $\operatorname{Pois}(1/3)$. (p. 64 [75])
Other applications in science
In a Poisson process, the number of observed occurrences fluctuates about its mean λ with a standard deviation $\sigma_k = \sqrt{\lambda}$. These fluctuations are denoted as Poisson noise or (particularly in electronics) as shot noise.
The correlation of the mean and standard deviation in counting independent discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, even if that contribution is too small to be detected directly. For example, the charge e on an electron can be estimated by correlating the magnitude of an electric current with its shot noise. If N electrons pass a point in a given time t on the average, the mean current is $I = eN/t$; since the current fluctuations should be of the order $\sigma_I = e\sqrt{N}/t$ (i.e., the standard deviation of the Poisson process), the charge e can be estimated from the ratio $t\sigma_I^2 / I$.[citation needed]
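A small simulation illustrates the estimate $e \approx t\sigma_I^2/I$. A sketch using numpy, with illustrative (not experimental) values for the counting window and electron flux:

```python
import numpy as np

rng = np.random.default_rng(1)

e_true = 1.602e-19  # electron charge in coulombs
t = 1e-6            # counting window in seconds
mean_N = 1e5        # average number of electrons per window

# Counts per window are Poisson; each current sample is e*N/t
counts = rng.poisson(mean_N, size=100_000)
currents = e_true * counts / t

I_mean = currents.mean()
sigma_I = currents.std()
e_est = t * sigma_I**2 / I_mean  # recovers e without resolving single electrons
print(f"estimated e = {e_est:.3e} C")
```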
An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reducedsilver grains, not to the individual grains themselves. Bycorrelating the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided).[citation needed]
In causal set theory the discrete elements of spacetime follow a Poisson distribution in the volume.
The Poisson distribution poses two different tasks for dedicated software libraries: evaluating the distribution, and drawing random numbers according to that distribution.
Evaluating the Poisson distribution
Computing $P(k; \lambda)$ for given $k$ and $\lambda$ is a trivial task that can be accomplished by using the standard definition of $P(k; \lambda)$ in terms of exponential, power, and factorial functions. However, the conventional definition of the Poisson distribution contains two terms that can easily overflow on computers: $\lambda^k$ and $k!$. The fraction of $\lambda^k$ to $k!$ can also produce a rounding error that is very large compared to $e^{-\lambda}$, and therefore give an erroneous result. For numerical stability the Poisson probability mass function should therefore be evaluated as

$f(k; \lambda) = \exp\left(k\ln\lambda - \lambda - \ln\Gamma(k + 1)\right),$

which is mathematically equivalent but numerically stable. The natural logarithm of the Gamma function can be obtained using the lgamma function in the C standard library (C99 version) or R, the gammaln function in MATLAB or SciPy, or the log_gamma function in Fortran 2008 and later.
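In Python this stable form is a one-liner using the standard library's math.lgamma:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Numerically stable Poisson PMF: exp(k ln(lambda) - lambda - ln Gamma(k+1))."""
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

# Works where the naive lam**k / math.factorial(k) * exp(-lam) would overflow:
print(poisson_pmf(1000, 1000.0))  # about 0.0126
```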
Some computing languages provide built-in functions to evaluate the Poisson distribution, for example dpois in R, scipy.stats.poisson.pmf in Python's SciPy, and POISSON.DIST in Microsoft Excel.
algorithm poisson random number (Knuth):
    init:
        Let L ← e^−λ, k ← 0 and p ← 1.
    do:
        k ← k + 1.
        Generate uniform random number u in [0,1] and let p ← p × u.
    while p > L.
    return k − 1.
The complexity is linear in the returned value k, which is λ on average. There are many other algorithms to improve this. Some are given in Ahrens & Dieter, see § References below.
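A direct Python transcription of Knuth's method (a sketch; random.random supplies the uniform draws):

```python
import math
import random

def knuth_poisson(lam: float) -> int:
    """Knuth's multiplicative Poisson sampler; O(lam) expected running time."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()  # multiply in a fresh uniform draw
        if p <= L:
            return k - 1

print([knuth_poisson(3.0) for _ in range(10)])
```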
For large values of λ, the value of L = e^−λ may be so small that it is hard to represent. This can be solved by a change to the algorithm which uses an additional parameter STEP such that e^−STEP does not underflow:[citation needed]
algorithm poisson random number (Junhao, based on Knuth):
    init:
        Let λLeft ← λ, k ← 0 and p ← 1.
    do:
        k ← k + 1.
        Generate uniform random number u in (0,1) and let p ← p × u.
        while p < 1 and λLeft > 0:
            if λLeft > STEP:
                p ← p × e^STEP
                λLeft ← λLeft − STEP
            else:
                p ← p × e^λLeft
                λLeft ← 0
    while p > 1.
    return k − 1.
The choice of STEP depends on the threshold of overflow. For the double precision floating point format the threshold is near e^700, so 500 should be a safe STEP.
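The same trick in Python (a sketch; STEP = 500 follows the overflow analysis above):

```python
import math
import random

STEP = 500  # keeps exp() arguments well below the double-precision limit near e^709

def junhao_poisson(lam: float) -> int:
    """Poisson sampler for large lambda; rescales p to avoid underflow of e^-lambda."""
    lam_left = lam
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()
        # Spend the remaining lambda budget in chunks whenever p drops below 1
        while p < 1.0 and lam_left > 0.0:
            if lam_left > STEP:
                p *= math.exp(STEP)
                lam_left -= STEP
            else:
                p *= math.exp(lam_left)
                lam_left = 0.0
        if p <= 1.0:
            return k - 1

print(junhao_poisson(5000.0))
```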
Other solutions for large values of λ include rejection sampling and using Gaussian approximation.
Inverse transform sampling is simple and efficient for small values of λ, and requires only one uniform random number u per sample. Cumulative probabilities are examined in turn until one exceeds u.
algorithm Poisson generator based upon the inversion by sequential search:[80]: 505
    init:
        Let x ← 0, p ← e^−λ, s ← p.
        Generate uniform random number u in [0,1].
    while u > s do:
        x ← x + 1.
        p ← p × λ / x.
        s ← s + p.
    return x.
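In Python (a sketch; suitable only for modest λ, since p starts at e^−λ):

```python
import math
import random

def inversion_poisson(lam: float) -> int:
    """Inverse-transform Poisson sampler via sequential search of the CDF."""
    x, p = 0, math.exp(-lam)
    s = p                   # running CDF value P(X <= x)
    u = random.random()     # a single uniform draw per sample
    while u > s:
        x += 1
        p *= lam / x        # recurrence: P(x) = P(x-1) * lam / x
        s += p
    return x

print([inversion_poisson(4.2) for _ in range(10)])
```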
Haight, Frank A. (1967). Handbook of the Poisson Distribution. New York, NY, US: John Wiley & Sons. ISBN 978-0-471-33932-8.
Yates, Roy D.; Goodman, David J. (2014). Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers (2nd ed.). Hoboken, NJ: Wiley. ISBN 978-0-471-45259-1.
Ross, Sheldon M. (2014). Introduction to Probability Models (11th ed.). Academic Press.
de Moivre, Abraham. "Of the Laws of Chance". In Motte, Benjamin (ed.). The Philosophical Transactions from the Year MDCC (where Mr. Lowthorp Ends) to the Year MDCCXX. Abridg'd, and Dispos'd Under General Heads (in Latin). Vol. I. London, Great Britain: R. Wilkin, R. Robinson, S. Ballard, W. and J. Innys, and J. Osborn. pp. 190–219.
Stigler, Stephen M. (1982). "Poisson on the Poisson Distribution". Statistics & Probability Letters. 1 (1): 33–35. doi:10.1016/0167-7152(82)90010-4.
Hald, Anders; de Moivre, Abraham; McClintock, Bruce (1984). "A. de Moivre: 'De Mensura Sortis' or 'On the Measurement of Chance'". International Statistical Review / Revue Internationale de Statistique. 52 (3): 229–262. doi:10.2307/1403045. JSTOR 1403045.
von Bortkiewitsch, Ladislaus (1898). Das Gesetz der kleinen Zahlen [The Law of Small Numbers] (in German). Leipzig, Germany: B.G. Teubner. pp. 1, 23–25.
On page 1, Bortkiewicz presents the Poisson distribution.
On pages 23–25, Bortkiewitsch presents his analysis of "4. Beispiel: Die durch Schlag eines Pferdes im preußischen Heere Getöteten" [4. Example: Those killed in the Prussian army by a horse's kick].
Pitman, Jim (1993). Probability. Springer Texts in Statistics. New York: Springer. p. 118. ISBN 978-0-387-94594-1.
Hsu, Hwei P. (1996). Theory and Problems of Probability, Random Variables, and Random Processes. Schaum's Outline Series. New York: McGraw Hill. p. 68. ISBN 0-07-030644-3.
Arfken, George B.; Weber, Hans J. (2005). Mathematical Methods for Physicists (6th ed.). Elsevier Academic Press. p. 1131. ISBN 0-12-059876-0.
Ahle, Thomas D. (2022). "Sharp and simple bounds for the raw moments of the Binomial and Poisson distributions". Statistics & Probability Letters. 182: 109306. arXiv:2103.17027. doi:10.1016/j.spl.2021.109306.
Lehmann, Erich Leo (1986). Testing Statistical Hypotheses (2nd ed.). New York: Springer Verlag. ISBN 978-0-387-94919-2.
Raikov, Dmitry (1937). "On the decomposition of Poisson laws". Comptes Rendus de l'Académie des Sciences de l'URSS. 14: 9–11.
Harremoes, P. (July 2001). "Binomial and Poisson distributions as maximum entropy distributions". IEEE Transactions on Information Theory. 47 (5): 2039–2041. doi:10.1109/18.930936. S2CID 16171405.
Laha, Radha G.; Rohatgi, Vijay K. (1979). Probability Theory. New York: John Wiley & Sons. ISBN 978-0-471-03262-5.
Mitzenmacher, Michael; Upfal, Eli (2017). Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis (2nd ed.). Exercise 5.14. ISBN 978-1-107-15488-9. OCLC 960841613.
Kamath, Govinda M.; Şaşoğlu, Eren; Tse, David (14–19 June 2015). Optimal haplotype assembly from high-throughput mate-pair reads. 2015 IEEE International Symposium on Information Theory (ISIT). Hong Kong, China. pp. 914–918. arXiv:1502.01975. doi:10.1109/ISIT.2015.7282588. S2CID 128634.
Feller, William. An Introduction to Probability Theory and Its Applications.
Zhang, Huiming; Liu, Yunxiao; Li, Bo (2014). "Notes on discrete compound Poisson model with applications to risk theory". Insurance: Mathematics and Economics. 59: 325–336. doi:10.1016/j.insmatheco.2014.09.012.
Zhang, Huiming; Li, Bo (2016). "Characterizations of discrete compound Poisson distributions". Communications in Statistics – Theory and Methods. 45 (22): 6789–6802. doi:10.1080/03610926.2014.901375. S2CID 125475756.
Loukas, Sotirios; Kemp, C. David (1986). "The Index of Dispersion Test for the Bivariate Poisson Distribution". Biometrics. 42 (4): 941–948. doi:10.2307/2530708. JSTOR 2530708.
Voiculescu, D.; Dykema, K.; Nica, A. (1992). Free Random Variables. CRM Monograph Series. Providence, RI: American Mathematical Society.
Gelman, Andrew; Carlin, John B.; Stern, Hal S.; Rubin, Donald B. (2003). Bayesian Data Analysis (2nd ed.). Boca Raton, FL, US: Chapman & Hall/CRC. ISBN 1-58488-388-X.
Clevenson, M. Lawrence; Zidek, James V. (1975). "Simultaneous estimation of the means of independent Poisson laws". Journal of the American Statistical Association. 70 (351): 698–705. doi:10.1080/01621459.1975.10482497. JSTOR 2285958.
Erlang, Agner K. (1909). "Sandsynlighedsregning og Telefonsamtaler" [Probability Calculation and Telephone Conversations]. Nyt Tidsskrift for Matematik (in Danish). 20 (B): 33–39. JSTOR 24528622.
Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu (2016). "Do bacterial cell numbers follow a theoretical Poisson distribution? Comparison of experimentally obtained numbers of single cells with random number generation via computer simulation". Food Microbiology. 60: 49–53. doi:10.1016/j.fm.2016.05.019. PMID 27554145.
The Senses: A Comprehensive Reference. Academic Press. 30 September 2020. ISBN 978-0-12-805409-3. "The division of light into discrete photons means that nominally constant light sources – e.g. a light bulb or a reflecting object in a scene – will produce visual inputs that vary randomly over time. These variations are described by Poisson statistics."
Ahrens, Joachim H.; Dieter, Ulrich (1974). "Computer Methods for Sampling from Gamma, Beta, Poisson and Binomial Distributions". Computing. 12 (3): 223–246. doi:10.1007/BF02293108. S2CID 37484126.