Figure captions: Above, a plot of a series of 100 random numbers concealing a sine function; below, the sine function revealed in a correlogram produced by autocorrelation. A second figure gives a visual comparison of convolution, cross-correlation, and autocorrelation: for the operations involving a function $f$, and assuming the height of $f$ is 1.0, the value of the result at 5 different points is indicated by the shaded area below each point, and the symmetry of $f$ is the reason $f*g$ and $f\star g$ are identical in that example.
Autocorrelation, sometimes known as serial correlation in the discrete time case, measures the correlation of a signal with a delayed copy of itself. Essentially, it quantifies the similarity between observations of a random variable at different points in time. The analysis of autocorrelation is a mathematical tool for identifying repeating patterns or hidden periodicities within a signal obscured by noise. Autocorrelation is widely used in signal processing, time domain and time series analysis to understand the behavior of data over time.
Different fields of study define autocorrelation differently, and not all of these definitions are equivalent. In some fields, the term is used interchangeably with autocovariance.
In statistics, the autocorrelation of a real or complex random process is the Pearson correlation between values of the process at different times, as a function of the two times or of the time lag. Let $\left\{X_t\right\}$ be a random process, and $t$ be any point in time ($t$ may be an integer for a discrete-time process or a real number for a continuous-time process). Then $X_t$ is the value (or realization) produced by a given run of the process at time $t$. Suppose that the process has mean $\mu_t$ and variance $\sigma_t^2$ at time $t$, for each $t$. Then the definition of the autocorrelation function between times $t_1$ and $t_2$ is[1]: 388 [2]: 165

$$\operatorname{R}_{XX}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{X_{t_2}}\right]$$

where $\operatorname{E}$ is the expected value operator and the bar represents complex conjugation.
Subtracting the mean before multiplication yields the auto-covariance function between times $t_1$ and $t_2$:[1]: 392 [2]: 168

$$\operatorname{K}_{XX}(t_1, t_2) = \operatorname{E}\left[\left(X_{t_1} - \mu_{t_1}\right)\overline{\left(X_{t_2} - \mu_{t_2}\right)}\right] = \operatorname{E}\left[X_{t_1}\overline{X_{t_2}}\right] - \mu_{t_1}\overline{\mu}_{t_2}$$
Note that this expression is not well defined for all time series or processes, because the mean may not exist, or the variance may be zero (for a constant process) or infinite (for processes with distributions lacking well-behaved moments, such as certain types of power law).
Definition for wide-sense stationary stochastic process
If $\left\{X_t\right\}$ is a wide-sense stationary process then the mean $\mu$ and the variance $\sigma^2$ are time-independent, and further the autocovariance function depends only on the lag between $t_1$ and $t_2$: the autocovariance depends only on the time-distance between the pair of values but not on their position in time. This further implies that the autocovariance and autocorrelation can be expressed as a function of the time-lag $\tau = t_2 - t_1$, and that this would be an even function of the lag. This gives the more familiar forms for the autocorrelation function[1]: 395

$$\operatorname{R}_{XX}(\tau) = \operatorname{E}\left[X_{t+\tau} \overline{X_t}\right]$$

and the auto-covariance function:

$$\operatorname{K}_{XX}(\tau) = \operatorname{E}\left[\left(X_{t+\tau} - \mu\right)\overline{\left(X_t - \mu\right)}\right] = \operatorname{E}\left[X_{t+\tau}\overline{X_t}\right] - \mu\overline{\mu}$$
It is common practice in some disciplines (e.g. statistics and time series analysis) to normalize the autocovariance function to get a time-dependent Pearson correlation coefficient. However, in other disciplines (e.g. engineering) the normalization is usually dropped and the terms "autocorrelation" and "autocovariance" are used interchangeably.
The definition of the autocorrelation coefficient of a stochastic process is[2]: 169

$$\rho_{XX}(\tau) = \frac{\operatorname{K}_{XX}(\tau)}{\sigma^2} = \frac{\operatorname{E}\left[\left(X_{t+\tau} - \mu\right)\overline{\left(X_t - \mu\right)}\right]}{\sigma^2}.$$
If the function $\rho_{XX}$ is well defined, its value must lie in the range $[-1, 1]$, with 1 indicating perfect correlation and −1 indicating perfect anti-correlation.
The normalization is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of statistical dependence, and because the normalization has an effect on the statistical properties of the estimated autocorrelations.
The autocorrelation of a continuous-time white noise signal will have a strong peak (represented by a Dirac delta function) at $\tau = 0$ and will be exactly $0$ for all other $\tau$.
For real-valued functions, the symmetric autocorrelation function has a real symmetric transform, so the Wiener–Khinchin theorem can be re-expressed in terms of real cosines only:

$$\operatorname{R}_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f) \cos(2\pi f \tau)\, df$$

$$S_{XX}(f) = \int_{-\infty}^{\infty} \operatorname{R}_{XX}(\tau) \cos(2\pi f \tau)\, d\tau$$
The (potentially time-dependent) autocorrelation matrix (also called second moment) of a (potentially time-dependent) random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\rm T}$ is an $n \times n$ matrix containing as elements the autocorrelations of all pairs of elements of the random vector $\mathbf{X}$. The autocorrelation matrix is used in various digital signal processing algorithms.
The autocorrelation matrix is a positive semidefinite matrix,[3]: 190 i.e. $\mathbf{a}^{\rm T} \operatorname{R}_{\mathbf{X}\mathbf{X}} \mathbf{a} \geq 0$ for all $\mathbf{a} \in \mathbb{R}^n$ for a real random vector, and respectively $\mathbf{a}^{\rm H} \operatorname{R}_{\mathbf{X}\mathbf{X}} \mathbf{a} \geq 0$ for all $\mathbf{a} \in \mathbb{C}^n$ in case of a complex random vector.
All eigenvalues of the autocorrelation matrix are real and non-negative.
The auto-covariance matrix is related to the autocorrelation matrix as follows:

$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{E}\left[\left(\mathbf{X} - \operatorname{E}[\mathbf{X}]\right)\left(\mathbf{X} - \operatorname{E}[\mathbf{X}]\right)^{\rm T}\right] = \operatorname{R}_{\mathbf{X}\mathbf{X}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\rm T}$$

Respectively for complex random vectors:

$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{E}\left[\left(\mathbf{X} - \operatorname{E}[\mathbf{X}]\right)\left(\mathbf{X} - \operatorname{E}[\mathbf{X}]\right)^{\rm H}\right] = \operatorname{R}_{\mathbf{X}\mathbf{X}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\rm H}$$
In signal processing, the above definition is often used without the normalization, that is, without subtracting the mean and dividing by the variance. When the autocorrelation function is normalized by mean and variance, it is sometimes referred to as the autocorrelation coefficient[4] or autocovariance function.
Given a signal $f(t)$, the continuous autocorrelation $R_{ff}(\tau)$ is most often defined as the continuous cross-correlation integral of $f(t)$ with itself, at lag $\tau$.[1]: 411

$$R_{ff}(\tau) = \int_{-\infty}^{\infty} f(t+\tau)\overline{f(t)}\, dt = \int_{-\infty}^{\infty} f(t)\overline{f(t-\tau)}\, dt$$
where $\overline{f(t)}$ represents the complex conjugate of $f(t)$. Note that the parameter $t$ in the integral is a dummy variable and is only necessary to calculate the integral. It has no specific meaning.
The discrete autocorrelation $R$ at lag $\ell$ for a discrete-time signal $y(n)$ is

$$R_{yy}(\ell) = \sum_{n \in \mathbb{Z}} y(n)\,\overline{y(n-\ell)}$$
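As an illustration of this discrete definition (a minimal sketch, not taken from the cited sources; the function name autocorr_lag and the short test sequence are choices made here), the sum can be evaluated directly for a finite-length sequence by treating samples outside the observed window as zero:

```python
import numpy as np

def autocorr_lag(y, ell):
    """Discrete autocorrelation R_yy(ell) = sum_n y[n] * conj(y[n - ell]),
    treating samples outside the observed window as zero."""
    y = np.asarray(y)
    n = len(y)
    # Only indices where both y[n] and y[n - ell] fall inside the window contribute.
    return sum(y[i] * np.conj(y[i - ell]) for i in range(n) if 0 <= i - ell < n)

y = np.array([1.0, 2.0, 0.5])
# R_yy(0) is the signal energy; R_yy(-ell) equals the conjugate of R_yy(ell).
print([autocorr_lag(y, ell) for ell in range(-2, 3)])  # values: 0.5, 3.0, 5.25, 3.0, 0.5
```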
The above definitions work for signals that are square integrable, or square summable, that is, of finite energy. Signals that "last forever" are treated instead as random processes, in which case different definitions are needed, based on expected values. For wide-sense-stationary random processes, the autocorrelations are defined as

$$R_{ff}(\tau) = \operatorname{E}\left[f(t+\tau)\overline{f(t)}\right]$$

$$R_{yy}(\ell) = \operatorname{E}\left[y(n)\,\overline{y(n-\ell)}\right]$$
For processes that are not stationary, these will also be functions of $t$, or $n$.
For processes that are also ergodic, the expectation can be replaced by the limit of a time average. The autocorrelation of an ergodic process is sometimes defined as or equated to[4]

$$R_{ff}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T f(t+\tau)\overline{f(t)}\, dt$$

$$R_{yy}(\ell) = \lim_{N \to \infty} \frac{1}{N} \sum_{n=0}^{N-1} y(n)\,\overline{y(n-\ell)}$$
These definitions have the advantage that they give sensible well-defined single-parameter results for periodic functions, even when those functions are not the output of stationary ergodic processes.
Alternatively, signals that last forever can be treated by a short-time autocorrelation function analysis, using finite time integrals. (See short-time Fourier transform for a related process.)
In the following, we will describe properties of one-dimensional autocorrelations only, since most properties are easily transferred from the one-dimensional case to the multi-dimensional cases. These properties hold for wide-sense stationary processes.[5]
A fundamental property of the autocorrelation is symmetry, $R_{ff}(\tau) = R_{ff}(-\tau)$, which is easy to prove from the definition. In the continuous case, the autocorrelation is an even function

$$R_{ff}(-\tau) = R_{ff}(\tau)$$

when $f$ is a real function, and the autocorrelation is a Hermitian function

$$R_{ff}(-\tau) = R_{ff}^*(\tau)$$

when $f$ is a complex function.
The continuous autocorrelation function reaches its peak at the origin, where it takes a real value, i.e. for any delay $\tau$, $|R_{ff}(\tau)| \leq R_{ff}(0)$.[1]: 410 This is a consequence of the rearrangement inequality. The same result holds in the discrete case.
The autocorrelation of a periodic function is, itself, periodic with the same period.
The autocorrelation of the sum of two completely uncorrelated functions (the cross-correlation is zero for all $\tau$) is the sum of the autocorrelations of each function separately.
Since autocorrelation is a specific type of cross-correlation, it maintains all the properties of cross-correlation.
By using the symbol $*$ to represent convolution and letting $g_{-1}$ be a function which manipulates the function $f$ and is defined as $g_{-1}(f)(t) = f(-t)$, the definition for $R_{ff}(\tau)$ may be written as:

$$R_{ff}(\tau) = \left(f * \overline{g_{-1}(f)}\right)(\tau)$$
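This identity is easy to check numerically; the following sketch (illustrative only, using NumPy) convolves a short real sequence with its time-reversed conjugate and compares the result with a direct correlation:

```python
import numpy as np

f = np.array([1.0, -2.0, 0.5, 3.0])

# R_ff expressed as a convolution with the time-reversed (conjugated) signal,
# i.e. f * conj(g_{-1}(f)) with g_{-1}(f)(t) = f(-t).
acf_via_conv = np.convolve(f, np.conj(f[::-1]))

# The same quantity computed directly as a cross-correlation of f with itself.
acf_direct = np.correlate(f, f, mode="full")

print(np.allclose(acf_via_conv, acf_direct))  # True
```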
When mean values are subtracted from signals before computing an autocorrelation function, the resulting function is usually called an auto-covariance function.
For data expressed as a discrete sequence, it is frequently necessary to compute the autocorrelation with high computational efficiency. A brute force method based on the signal processing definition $R_{xx}(j) = \sum_n x_n\,\overline{x}_{n-j}$ can be used when the signal size is small. For example, to calculate the autocorrelation of the real signal sequence $x = (2, 3, -1)$ (i.e. $x_0 = 2$, $x_1 = 3$, $x_2 = -1$, and $x_i = 0$ for all other values of $i$) by hand, we first recognize that the definition just given is the same as the "usual" multiplication, but with right shifts, where each vertical addition gives the autocorrelation for particular lag values:

        2    3   -1
    ×   2    3   -1
    ------------------------------
       -2   -3    1
    +        6    9   -3
    +             4    6   -2
    ------------------------------
       -2    3   14    3   -2
Thus the required autocorrelation sequence is $R_{xx} = (-2, 3, 14, 3, -2)$, where $R_{xx}(0) = 14$, $R_{xx}(-1) = R_{xx}(1) = 3$, $R_{xx}(-2) = R_{xx}(2) = -2$, and the autocorrelation for other lag values is zero. In this calculation we do not perform the carry-over operation during addition as is usual in normal multiplication. Note that we can halve the number of operations required by exploiting the inherent symmetry of the autocorrelation. If the signal happens to be periodic, i.e. $x_n = x_{n+N}$ for all $n$ and some period $N$, then we get a circular autocorrelation (similar to circular convolution) where the left and right tails of the previous autocorrelation sequence will overlap and give $R_{xx} = (\ldots, 14, 1, 1, 14, 1, 1, \ldots)$, which has the same period as the signal sequence. The procedure can be regarded as an application of the convolution property of the Z-transform of a discrete signal.
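The worked example can be reproduced numerically; the sketch below (illustrative, assuming the sequence $x = (2, 3, -1)$ from above) computes both the linear autocorrelation and the circular autocorrelation of the periodic extension with NumPy:

```python
import numpy as np

x = np.array([2.0, 3.0, -1.0])

# Linear (aperiodic) autocorrelation for lags -2, ..., 2.
print(np.correlate(x, x, mode="full"))        # [-2.  3. 14.  3. -2.]

# Circular autocorrelation of the periodic extension (period 3): the tails of
# the linear autocorrelation wrap around and overlap.
N = len(x)
circ = np.array([np.sum(x * np.roll(x, -k)) for k in range(N)])
print(circ)                                    # [14.  1.  1.]
```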
While the brute force algorithm is order $n^2$, several efficient algorithms exist which can compute the autocorrelation in order $n \log(n)$. For example, the Wiener–Khinchin theorem allows computing the autocorrelation from the raw data $X(t)$ with two fast Fourier transforms (FFT):[6][page needed]

$$F_R(f) = \operatorname{FFT}[X(t)]$$

$$S(f) = F_R(f) F_R^*(f)$$

$$R(\tau) = \operatorname{IFFT}[S(f)]$$

where IFFT denotes the inverse fast Fourier transform and the asterisk denotes complex conjugation.
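A compact way to realize this FFT route (a sketch, not a prescribed implementation; the padding length $2n - 1$ is chosen here so that the circular convolution implied by the DFT reproduces the linear autocorrelation) is:

```python
import numpy as np

def autocorr_fft(x):
    """Autocorrelation for non-negative lags via the Wiener-Khinchin theorem:
    FFT -> power spectrum -> inverse FFT, zero-padded to avoid wrap-around."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    F = np.fft.fft(x, 2 * n - 1)     # F_R(f) = FFT[X(t)] with zero-padding
    S = F * np.conj(F)               # S(f) = F_R(f) F_R*(f)
    return np.fft.ifft(S).real[:n]   # R(tau) = IFFT[S(f)], lags 0 .. n-1

x = np.array([2.0, 3.0, -1.0])
print(np.round(autocorr_fft(x), 12))        # [14.  3. -2.]
print(np.correlate(x, x, mode="full")[2:])  # same values from the O(n^2) definition
```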
Alternatively, a multiple $\tau$ correlation can be performed by using brute force calculation for low $\tau$ values, and then progressively binning the $X(t)$ data with a logarithmic density to compute higher values, resulting in the same $n \log(n)$ efficiency, but with lower memory requirements.[7][8]
For a discrete process with known mean $\mu$ and variance $\sigma^2$ for which we observe $n$ observations $\{X_1, X_2, \ldots, X_n\}$, an estimate of the autocorrelation coefficient may be obtained as

$$\hat{R}(k) = \frac{1}{(n-k)\sigma^2} \sum_{t=1}^{n-k} \left(X_t - \mu\right)\left(X_{t+k} - \mu\right)$$
for any positive integer $k < n$. When the true mean $\mu$ and variance $\sigma^2$ are known, this estimate is unbiased. If the true mean and variance of the process are not known, there are several possibilities:
If $\mu$ and $\sigma^2$ are replaced by the standard formulae for sample mean and sample variance, then this is a biased estimate.
A periodogram-based estimate replaces $n - k$ in the above formula with $n$. This estimate is always biased; however, it usually has a smaller mean squared error.[9][10]
Other possibilities derive from treating the two portions of data $\{X_1, X_2, \ldots, X_{n-k}\}$ and $\{X_{k+1}, X_{k+2}, \ldots, X_n\}$ separately and calculating separate sample means and/or sample variances for use in defining the estimate.[citation needed]
The advantage of estimates of the last type is that the set of estimated autocorrelations, as a function of $k$, then form a function which is a valid autocorrelation in the sense that it is possible to define a theoretical process having exactly that autocorrelation. Other estimates can suffer from the problem that, if they are used to calculate the variance of a linear combination of the $X$'s, the variance calculated may turn out to be negative.[11]
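As a concrete sketch of the periodogram-type (divide-by-$n$, sample-mean-corrected) estimator discussed above (illustrative; the function name sample_acf and the white-noise test series are choices made here):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation: subtract the sample mean, divide every lag by n
    (the periodogram-based choice), and normalize by the lag-0 value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    acov = np.array([np.sum(xc[: n - k] * xc[k:]) / n for k in range(max_lag + 1)])
    return acov / acov[0]   # so that rho_hat(0) = 1

rng = np.random.default_rng(0)
x = rng.normal(size=200)              # white noise: true autocorrelation is 0 for k > 0
print(np.round(sample_acf(x, 5), 3))  # small values beyond lag 0, as expected
```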
In time series analysis, Hassani's −1/2 theorem is a finite-sample identity for the conventional estimator of the sample autocorrelation function (ACF). For a series of length $n$, using the usual sample-mean-corrected estimator $\hat{\rho}(k)$, Hassani showed that the sum of the sample autocorrelations over all positive lags is constant:

$$\sum_{k=1}^{n-1} \hat{\rho}(k) = -\frac{1}{2}.$$

The result implies that sample autocorrelations across lags are not independent and that the sample ACF cannot be "positive overall" across all lags. It is often cited in discussions of finite-sample behavior of the ACF and in cautions against using the sum of estimated autocorrelations as a diagnostic measure of total dependence or long-memory behavior.
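The identity can be checked numerically with the mean-corrected, divide-by-$n$ estimator (an illustrative sketch; any series of length $n$ gives the same sum):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50)          # an arbitrary series of length n = 50
n = len(x)
xc = x - x.mean()

# Sample autocovariances with the usual 1/n normalization, then normalize by lag 0.
acov = np.array([np.sum(xc[: n - k] * xc[k:]) / n for k in range(n)])
rho = acov / acov[0]

print(np.sum(rho[1:]))           # -0.5 up to floating-point rounding
```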
In ordinary least squares (OLS), the adequacy of a model specification can be checked in part by establishing whether there is autocorrelation of the regression residuals. Problematic autocorrelation of the errors, which themselves are unobserved, can generally be detected because it produces autocorrelation in the observable residuals. (Errors are also known as "error terms" in econometrics.) Autocorrelation of the errors violates the ordinary least squares assumption that the error terms are uncorrelated, meaning that the Gauss–Markov theorem does not apply, and that OLS estimators are no longer the Best Linear Unbiased Estimators (BLUE). While it does not bias the OLS coefficient estimates, the standard errors tend to be underestimated (and the t-scores overestimated) when the autocorrelations of the errors at low lags are positive.
The traditional test for the presence of first-order autocorrelation is the Durbin–Watson statistic or, if the explanatory variables include a lagged dependent variable, Durbin's h statistic. The Durbin–Watson statistic can, however, be linearly mapped to the Pearson correlation between values and their lags.[12] A more flexible test, covering autocorrelation of higher orders and applicable whether or not the regressors include lags of the dependent variable, is the Breusch–Godfrey test. This involves an auxiliary regression, wherein the residuals obtained from estimating the model of interest are regressed on (a) the original regressors and (b) $k$ lags of the residuals, where $k$ is the order of the test. The simplest version of the test statistic from this auxiliary regression is $TR^2$, where $T$ is the sample size and $R^2$ is the coefficient of determination. Under the null hypothesis of no autocorrelation, this statistic is asymptotically distributed as $\chi^2$ with $k$ degrees of freedom.
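A minimal sketch of the auxiliary regression behind the $TR^2$ statistic (illustrative only; it assumes a design matrix X that already contains a constant column, an array of OLS residuals resid, and a chosen lag order k, and in practice a routine from an econometrics library would be used instead):

```python
import numpy as np
from scipy import stats

def breusch_godfrey_TR2(X, resid, k):
    """Regress OLS residuals on the original regressors plus k lags of the
    residuals, then compare T * R^2 with a chi-squared(k) distribution."""
    T = len(resid)
    # k lagged-residual columns; missing initial values are padded with zeros.
    lags = np.column_stack(
        [np.concatenate([np.zeros(j), resid[:-j]]) for j in range(1, k + 1)]
    )
    Z = np.column_stack([X, lags])                    # auxiliary regressors
    beta, *_ = np.linalg.lstsq(Z, resid, rcond=None)  # auxiliary OLS fit
    fitted = Z @ beta
    R2 = 1 - np.sum((resid - fitted) ** 2) / np.sum((resid - resid.mean()) ** 2)
    stat = T * R2
    return stat, stats.chi2.sf(stat, df=k)            # statistic and its p-value
```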
In the estimation of a moving average model (MA), the autocorrelation function is used to determine the appropriate number of lagged error terms to be included. This is based on the fact that for an MA process of order $q$, we have $\rho(\tau) \neq 0$ for $\tau \leq q$, and $\rho(\tau) = 0$ for $\tau > q$.
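This cutoff behaviour can be seen in a quick simulation (a sketch; the MA(2) coefficients 0.6 and 0.3 and the sample size are arbitrary choices made here): the sample autocorrelations are clearly non-zero at lags 1 and 2 and near zero beyond.

```python
import numpy as np

rng = np.random.default_rng(2)
n, q = 20000, 2
eps = rng.normal(size=n + q)

# MA(2): X_t = eps_t + 0.6 * eps_{t-1} + 0.3 * eps_{t-2}
x = eps[q:] + 0.6 * eps[q - 1:-1] + 0.3 * eps[:-q]

xc = x - x.mean()
acov = np.array([np.sum(xc[: n - k] * xc[k:]) / n for k in range(6)])
print(np.round(acov / acov[0], 3))  # roughly 1, 0.54, 0.21, then ~0 for lags > q
```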
Autocorrelation is used to analyze dynamic light scattering data, which notably enables determination of the particle size distributions of nanometer-sized particles or micelles suspended in a fluid. A laser shining into the mixture produces a speckle pattern that results from the motion of the particles. Autocorrelation of the signal can be analyzed in terms of the diffusion of the particles. From this, knowing the viscosity of the fluid, the sizes of the particles can be calculated.
Autocorrelation is utilized in the GPS system to correct for the propagation delay, or time shift, between the point of time at the transmission of the carrier signal at the satellites, and the point of time at the receiver on the ground. This is done by the receiver generating a replica signal of the 1,023-bit C/A (Coarse/Acquisition) code, and generating lines of code chips [−1, 1] in packets of ten at a time, or 10,230 chips (1,023 × 10), shifting slightly as it goes along in order to accommodate for the Doppler shift in the incoming satellite signal, until the receiver replica signal and the satellite signal codes match up.[16]
In music, autocorrelation (when applied at time scales smaller than a second) is used as a pitch detection algorithm for both instrument tuners and "Auto Tune" (used as a distortion effect or to fix intonation).[18] When applied at time scales larger than a second, autocorrelation can identify the musical beat, for example to determine tempo.
Autocorrelation in space rather than time, via the Patterson function, is used by X-ray diffractionists to help recover the "Fourier phase information" on atom positions not available through diffraction alone.
In statistics, spatial autocorrelation between sample locations also helps one estimate mean value uncertainties when sampling a heterogeneous population.
The SEQUEST algorithm for analyzing mass spectra makes use of autocorrelation in conjunction with cross-correlation to score the similarity of an observed spectrum to an idealized spectrum representing a peptide.
In astrophysics, autocorrelation is used to study and characterize the spatial distribution of galaxies in the universe and in multi-wavelength observations of low mass X-ray binaries.
In panel data, spatial autocorrelation refers to correlation of a variable with itself through space.
In analysis of Markov chain Monte Carlo data, autocorrelation must be taken into account for correct error determination.
Serial dependence is closely linked to the notion of autocorrelation, but represents a distinct concept (see Correlation and dependence). In particular, it is possible to have serial dependence but no (linear) correlation. In some fields, however, the two terms are used as synonyms.
A time series of a random variable has serial dependence if the value at some time $t$ in the series is statistically dependent on the value at another time $s$. A series is serially independent if there is no dependence between any pair.
If a time series $\{x_t\}$ is stationary, then statistical dependence between the pair $(x_t, x_s)$ would imply that there is statistical dependence between all pairs of values at the same lag $s - t$.
Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
Kun Il Park (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
Papoulis, Athanasios (1991). Probability, Random Variables and Stochastic Processes. McGraw-Hill.
Dunn, Patrick F. (2005). Measurement and Data Analysis for Engineering and Science. New York: McGraw-Hill. ISBN 978-0-07-282538-1.
Proakis, John (2001). Communication Systems Engineering (2nd ed.). Pearson. p. 168. ISBN 978-0130617934.
Box, G. E. P.; Jenkins, G. M.; Reinsel, G. C. (1994). Time Series Analysis: Forecasting and Control (3rd ed.). Upper Saddle River, NJ: Prentice-Hall. ISBN 978-0130607744.
Percival, Donald B. (1993). "Three Curious Properties of the Sample Variance and Autocovariance for Stationary Processes with Unknown Mean". The American Statistician. 47 (4): 274–276. doi:10.1080/00031305.1993.10475997.
Van Sickle, Jan (2008). GPS for Land Surveyors (3rd ed.). CRC Press. pp. 18–19. ISBN 978-0-8493-9195-8.
Kalvani, Payam Rajabi; Jahangiri, Ali Reza; Shapouri, Samaneh; Sari, Amirhossein; Jalili, Yousef Seyed (August 2019). "Multimode AFM analysis of aluminum-doped zinc oxide thin films sputtered under various substrate temperatures for optoelectronic applications". Superlattices and Microstructures. 132: 106173. doi:10.1016/j.spmi.2019.106173.
Soltanalian, Mojtaba; Stoica, Petre (2012). "Computational Design of Sequences with Good Correlation Properties". IEEE Transactions on Signal Processing. 60 (5): 2180. doi:10.1109/TSP.2012.2186134.
Hassani, Hossein (2009). "Sum of the sample autocorrelation function". Random Operators and Stochastic Equations. 17 (2): 125–130. doi:10.1515/ROSE.2009.008.
Hassani, Hossein (2010). "A note on the sum of the sample autocorrelation function". Physica A: Statistical Mechanics and its Applications. 389 (8): 1601–1606. doi:10.1016/j.physa.2009.12.050.