In mathematics, specifically in the theory of generalized functions, the limit of a sequence of distributions is the distribution that sequence approaches. The distance, suitably quantified, to the limiting distribution can be made arbitrarily small by selecting a distribution sufficiently far along the sequence. This notion generalizes a limit of a sequence of functions; a limit as a distribution may exist when a limit of functions does not.
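A standard example of this phenomenon is the sequence sin(nx), which has no pointwise limit as n → ∞ yet converges to 0 as a distribution (a consequence of the Riemann–Lebesgue lemma). The following sketch, with an illustrative bump test function and a simple midpoint-rule quadrature (both chosen here for concreteness, not taken from the source), checks this numerically:

```python
import math

def bump(x):
    """A smooth test function supported on (0, 2), centered at 1."""
    t = x - 1.0
    return math.exp(-1.0 / (1.0 - t * t)) if abs(t) < 1.0 else 0.0

def pairing(n, h=1e-4):
    """Midpoint-rule approximation of <sin(nx), bump> = integral of sin(nx)*bump(x) dx."""
    steps = int(2.0 / h)
    total = 0.0
    for i in range(steps):
        x = i * h + h / 2.0
        total += math.sin(n * x) * bump(x)
    return total * h

# sin(nx) oscillates ever faster and has no pointwise limit, but the
# pairings with any test function tend to 0, so sin(nx) -> 0 as a distribution.
for n in (1, 10, 100):
    print(n, pairing(n))
```

The pairing is clearly nonzero at n = 1 and shrinks rapidly as n grows, even though the functions sin(nx) themselves do not converge at any point other than multiples of π.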
The notion is a part of distributional calculus, a generalized form of calculus that is based on the notion of distributions, as opposed to classical calculus, which is based on the narrower concept of functions.
Given a sequence of distributions $T_k$, its limit $T$ is the distribution given by
$$T(\varphi) = \lim_{k \to \infty} T_k(\varphi)$$
for each test function $\varphi$, provided that distribution exists. The existence of the limit means that (1) for each $\varphi$, the limit of the sequence of numbers $T_k(\varphi)$ exists and that (2) the linear functional $T$ defined by the above formula is continuous with respect to the topology on the space of test functions.
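The definition can be tested numerically on the classic example of Gaussians concentrating at the origin, which converge to the Dirac delta. The kernel, the test function cos (smooth, with value 1 at 0; not compactly supported, but its truncation error here is negligible), and the quadrature below are illustrative choices, not prescribed by the source:

```python
import math

def gaussian(n, x):
    """f_n: Gaussian density with mean 0 and standard deviation 1/n; total integral 1."""
    return n / math.sqrt(2.0 * math.pi) * math.exp(-0.5 * (n * x) ** 2)

def pairing(n, phi, h=1e-4, radius=3.0):
    """Midpoint-rule approximation of <f_n, phi> = integral of f_n(x)*phi(x) dx."""
    steps = int(2.0 * radius / h)
    total = 0.0
    for i in range(steps):
        x = -radius + i * h + h / 2.0
        total += gaussian(n, x) * phi(x)
    return total * h

# For each test function phi, <f_n, phi> -> phi(0), so f_n converges to the
# Dirac delta as a distribution, even though f_n(0) -> infinity pointwise.
for n in (2, 10, 50):
    print(n, pairing(n, math.cos))
```

Each pairing is a plain number, and the sequence of numbers converges to cos(0) = 1; the limiting functional φ ↦ φ(0) is the delta distribution, which is not a function at all.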
More generally, as with functions, one can also consider a limit of a family of distributions.
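A family indexed by a continuous parameter can be sketched the same way. As an illustration (the kernel and test function are chosen here for concreteness), the family f_ε(x) = x/(x² + ε²) converges as ε → 0⁺ to the principal-value distribution p.v. 1/x, even though 1/x itself is not locally integrable:

```python
import math

def f_eps(eps, x):
    """Family f_eps(x) = x / (x^2 + eps^2), an ordinary smooth function for each eps > 0."""
    return x / (x * x + eps * eps)

def pairing(eps, phi, h=1e-4, radius=5.0):
    """Midpoint-rule approximation of <f_eps, phi> = integral of f_eps(x)*phi(x) dx."""
    steps = int(2.0 * radius / h)
    total = 0.0
    for i in range(steps):
        x = -radius + i * h + h / 2.0
        total += f_eps(eps, x) * phi(x)
    return total * h

def phi(x):
    """A smooth, rapidly decaying test function."""
    return x * math.exp(-x * x)

# As eps -> 0+, <f_eps, phi> approaches p.v. integral of phi(x)/x dx,
# which for this phi equals the integral of exp(-x^2) dx = sqrt(pi).
for eps in (1.0, 0.1, 0.01):
    print(eps, pairing(eps, phi))
```

The pairings approach √π ≈ 1.7725 as ε shrinks, exhibiting a distributional limit of a continuously indexed family rather than a sequence.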