Partial autocorrelation function

From Wikipedia, the free encyclopedia
Partial correlation of a time series with its lagged values
Partial autocorrelation function of Lake Huron's depth with confidence interval (in blue, plotted around 0)

In time series analysis, the partial autocorrelation function (PACF) gives the partial correlation of a stationary time series with its own lagged values, controlling for the values of the time series at all shorter lags. It contrasts with the autocorrelation function, which does not control for other lags.

This function plays an important role in data analysis aimed at identifying the extent of the lag in an autoregressive (AR) model. The use of this function was introduced as part of the Box–Jenkins approach to time series modelling, whereby, by plotting the partial autocorrelation function, one can determine the appropriate lag p in an AR(p) model or in an extended ARIMA(p, d, q) model.

Definition


Given a time series $z_t$, the partial autocorrelation of lag $k$, denoted $\phi_{k,k}$, is the autocorrelation between $z_t$ and $z_{t+k}$ with the linear dependence of $z_t$ on $z_{t+1}$ through $z_{t+k-1}$ removed. Equivalently, it is the autocorrelation between $z_t$ and $z_{t+k}$ that is not accounted for by lags $1$ through $k-1$, inclusive.[1]

$$\phi_{1,1} = \operatorname{corr}(z_{t+1}, z_t), \quad \text{for } k = 1,$$
$$\phi_{k,k} = \operatorname{corr}(z_{t+k} - \hat{z}_{t+k},\, z_t - \hat{z}_t), \quad \text{for } k \geq 2,$$

where $\hat{z}_{t+k}$ and $\hat{z}_t$ are linear combinations of $\{z_{t+1}, z_{t+2}, \ldots, z_{t+k-1}\}$ that minimize the mean squared error of $z_{t+k}$ and $z_t$ respectively. For stationary processes, the coefficients in $\hat{z}_{t+k}$ and $\hat{z}_t$ are the same, but reversed:[2]

$$\hat{z}_{t+k} = \beta_1 z_{t+k-1} + \cdots + \beta_{k-1} z_{t+1} \qquad \text{and} \qquad \hat{z}_t = \beta_1 z_{t+1} + \cdots + \beta_{k-1} z_{t+k-1}.$$

Calculation


The theoretical partial autocorrelation function of a stationary time series can be calculated by using the Durbin–Levinson algorithm:

$$\phi_{n,n} = \frac{\rho(n) - \sum_{k=1}^{n-1} \phi_{n-1,k}\,\rho(n-k)}{1 - \sum_{k=1}^{n-1} \phi_{n-1,k}\,\rho(k)}$$

where $\phi_{n,k} = \phi_{n-1,k} - \phi_{n,n}\,\phi_{n-1,n-k}$ for $1 \leq k \leq n-1$ and $\rho(n)$ is the autocorrelation function.[3][4][5]
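As a brief worked check (not part of the source text), consider a stationary AR(1) process with coefficient $\varphi$, whose autocorrelation function is $\rho(k) = \varphi^{k}$. The recursion then gives

$$\phi_{1,1} = \rho(1) = \varphi, \qquad \phi_{2,2} = \frac{\rho(2) - \phi_{1,1}\,\rho(1)}{1 - \phi_{1,1}\,\rho(1)} = \frac{\varphi^{2} - \varphi^{2}}{1 - \varphi^{2}} = 0,$$

and the same cancellation occurs at every lag greater than 1, matching the cutoff property of AR models described below.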

The formula above can be used with sample autocorrelations to find the sample partial autocorrelation function of any given time series.[6][7]
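As an illustrative sketch (an addition to the article, not taken from it), the recursion can be applied to sample autocorrelations with plain NumPy; the function and variable names below are arbitrary choices:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations rho(0), ..., rho(max_lag) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

def pacf_durbin_levinson(x, max_lag):
    """Sample PACF phi_{1,1}, ..., phi_{max_lag,max_lag} via Durbin-Levinson."""
    rho = sample_acf(x, max_lag)
    pacf, phi_prev = [], np.zeros(0)              # phi_prev holds phi_{n-1,1..n-1}
    for n in range(1, max_lag + 1):
        num = rho[n] - np.dot(phi_prev, rho[n - 1:0:-1])   # rho(n-k), k = 1..n-1
        den = 1.0 - np.dot(phi_prev, rho[1:n])             # rho(k),   k = 1..n-1
        phi_nn = num / den
        # phi_{n,k} = phi_{n-1,k} - phi_{n,n} * phi_{n-1,n-k}
        phi_prev = np.concatenate([phi_prev - phi_nn * phi_prev[::-1], [phi_nn]])
        pacf.append(phi_nn)
    return np.array(pacf)
```

The sketch simply mirrors the recursion above; statistical software packages provide equivalent routines.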

Examples


The following table summarizes the partial autocorrelation function of different models:[5][8]

Model | PACF
White noise | The partial autocorrelation is 0 for all lags.
Autoregressive model | The partial autocorrelation for an AR(p) model is nonzero for lags less than or equal to p and 0 for lags greater than p.
Moving-average model | If $\phi_{1,1} > 0$, the partial autocorrelation oscillates to 0. If $\phi_{1,1} < 0$, the partial autocorrelation geometrically decays to 0.
Autoregressive–moving-average model | An ARMA(p, q) model's partial autocorrelation geometrically decays to 0, but only after lags greater than p.

The behavior of the partial autocorrelation function mirrors that of the autocorrelation function for autoregressive and moving-average models. For example, the partial autocorrelation function of an AR(p) series cuts off after lag p, just as the autocorrelation function of an MA(q) series cuts off after lag q. In addition, the autocorrelation function of an AR(p) process tails off, just like the partial autocorrelation function of an MA(q) process.[2]
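The cutoff-versus-decay contrast can be seen numerically with a small simulation. The sketch below is an illustrative addition that reuses the hypothetical pacf_durbin_levinson helper from the Calculation section; the AR and MA coefficients are chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
e = rng.standard_normal(n + 2)

# AR(2): z_t = 0.6 z_{t-1} - 0.3 z_{t-2} + e_t   (coefficients chosen arbitrarily)
ar = np.zeros(n)
for t in range(2, n):
    ar[t] = 0.6 * ar[t - 1] - 0.3 * ar[t - 2] + e[t]

# MA(2): z_t = e_t + 0.7 e_{t-1} + 0.3 e_{t-2}
ma = e[2:] + 0.7 * e[1:-1] + 0.3 * e[:-2]

print("AR(2) PACF:", np.round(pacf_durbin_levinson(ar, 6), 2))  # near 0 after lag 2
print("MA(2) PACF:", np.round(pacf_durbin_levinson(ma, 6), 2))  # tails off gradually
```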

Autoregressive model identification

Sample partial autocorrelation function with confidence interval of a simulated AR(3) time series; the first three lags show clear spikes and the remaining values are close to 0.

Partial autocorrelation is a commonly used tool for identifying the order of an autoregressive model.[6] As previously mentioned, the partial autocorrelation of an AR(p) process is zero at lags greater than p.[5][8] If an AR model is determined to be appropriate, then the sample partial autocorrelation plot is examined to help identify the order.

The partial autocorrelations of an AR(p) time series at lags greater than p are approximately independent and normally distributed with a mean of 0.[9] Therefore, a confidence interval can be constructed by dividing a selected z-score by $\sqrt{n}$, where $n$ is the length of the series. Lags with partial autocorrelations outside of the confidence interval indicate that the AR model's order is likely greater than or equal to that lag. Plotting the partial autocorrelation function and drawing the lines of the confidence interval is a common way to analyze the order of an AR model. To evaluate the order, one examines the plot to find the lag after which the partial autocorrelations are all within the confidence interval; this lag is likely the AR model's order.[1]
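A rough sketch of this order-selection rule is given below (an illustration only, again reusing the hypothetical pacf_durbin_levinson helper from the Calculation section; 1.96 is the z-score for an approximate 95% confidence interval):

```python
import numpy as np

def estimate_ar_order(x, max_lag=20, z=1.96):
    """Return the largest lag whose sample PACF lies outside +/- z / sqrt(n);
    all later lags fall inside the band, so it is the suggested AR order."""
    x = np.asarray(x, dtype=float)
    bound = z / np.sqrt(len(x))
    pacf = pacf_durbin_levinson(x, max_lag)            # sketch from the Calculation section
    outside = np.flatnonzero(np.abs(pacf) > bound) + 1  # pacf[0] corresponds to lag 1
    return int(outside.max()) if outside.size else 0

# Example: with the simulated AR(2) series from the previous sketch,
# estimate_ar_order(ar) should typically return 2.
```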

References

  1. ^ab"6.4.4.6.3. Partial Autocorrelation Plot".www.itl.nist.gov. Retrieved2022-07-14.
  2. ^abShumway, Robert H.; Stoffer, David S. (2017).Time Series Analysis and Its Applications: With R Examples. Springer Texts in Statistics. Cham: Springer International Publishing. pp. 97–99.doi:10.1007/978-3-319-52452-8.ISBN 978-3-319-52451-1.
  3. ^Durbin, J. (1960)."The Fitting of Time-Series Models".Revue de l'Institut International de Statistique / Review of the International Statistical Institute.28 (3):233–244.doi:10.2307/1401322.ISSN 0373-1138.JSTOR 1401322.
  4. ^Shumway, Robert H.; Stoffer, David S. (2017).Time Series Analysis and Its Applications: With R Examples. Springer Texts in Statistics. Cham: Springer International Publishing. pp. 103–104.doi:10.1007/978-3-319-52452-8.ISBN 978-3-319-52451-1.
  5. ^abcEnders, Walter (2004).Applied econometric time series (2nd ed.). Hoboken, NJ: J. Wiley. pp. 65–67.ISBN 0-471-23065-0.OCLC 52387978.
  6. ^abBox, George E. P.; Reinsel, Gregory C.; Jenkins, Gwilym M. (2008).Time Series Analysis: Forecasting and Control (4th ed.). Hoboken, New Jersey: John Wiley.ISBN 9780470272848.
  7. ^Brockwell, Peter J.; Davis, Richard A. (1991).Time Series: Theory and Methods (2nd ed.). New York, NY: Springer. pp. 102,243–245.ISBN 9781441903198.
  8. ^abDas, Panchanan (2019).Econometrics in Theory and Practice : Analysis of Cross Section, Time Series and Panel Data with Stata 15. 1. Singapore: Springer. pp. 294–299.ISBN 978-981-329-019-8.OCLC 1119630068.
  9. ^Quenouille, M. H. (1949)."Approximate Tests of Correlation in Time-Series".Journal of the Royal Statistical Society, Series B (Methodological).11 (1):68–84.doi:10.1111/j.2517-6161.1949.tb00023.x.