Chow test

Mathematical test proposed by Gregory Chow

The Chow test (Chinese: 鄒檢定), proposed by econometrician Gregory Chow in 1960, is a statistical test of whether the true coefficients in two linear regressions on different data sets are equal. In econometrics, it is most commonly used in time series analysis to test for the presence of a structural break at a period which can be assumed to be known a priori (for instance, a major historical event such as a war). In program evaluation, the Chow test is often used to determine whether the independent variables have different impacts on different subgroups of the population.

Illustrations

[Figure: Applications of the Chow test]
Structural break (slopes differ): at $x = 1.7$ there is a structural break; separate regressions on the subintervals $[0, 1.7]$ and $[1.7, 4]$ deliver a better model than the combined regression (dashed) over the whole interval.
Program evaluation (intercepts differ): comparison of two different programs (red, green) in a common data set; separate regressions for both programs deliver a better model than a combined regression (black).

First Chow Test


Suppose that we model our data as

$$y_t = a + b x_{1t} + c x_{2t} + \varepsilon.$$

If we split our data into two groups, then we have

$$y_t = a_1 + b_1 x_{1t} + c_1 x_{2t} + \varepsilon$$

and

$$y_t = a_2 + b_2 x_{1t} + c_2 x_{2t} + \varepsilon.$$

The null hypothesis of the Chow test asserts that $a_1 = a_2$, $b_1 = b_2$, and $c_1 = c_2$, and there is the assumption that the model errors $\varepsilon$ are independent and identically distributed from a normal distribution with unknown variance.

Let $S_C$ be the sum of squared residuals from the combined data, $S_1$ be the sum of squared residuals from the first group, and $S_2$ be the sum of squared residuals from the second group. $N_1$ and $N_2$ are the numbers of observations in each group and $k$ is the total number of parameters (in this case 3, i.e. 2 independent variable coefficients + intercept). Then the Chow test statistic is

$$\frac{(S_C - (S_1 + S_2))/k}{(S_1 + S_2)/(N_1 + N_2 - 2k)}.$$

The test statistic follows the F-distribution with $k$ and $N_1 + N_2 - 2k$ degrees of freedom.
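
As a concrete illustration, the statistic can be computed from three OLS fits: one on the combined sample and one on each group. The following is a minimal sketch in Python with NumPy and SciPy, assuming simulated data with an artificial break between the two groups; the variable names, group sizes, and the simulated break are illustrative assumptions, not part of the test itself.

```python
# Minimal sketch of the first Chow test on simulated data (all names and
# the simulated break below are illustrative assumptions).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n1, n2, k = 40, 35, 3                       # group sizes; k = intercept + 2 slope coefficients
n = n1 + n2
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)
y[n1:] += 1.5                               # structural break: shift the second group's intercept

def ssr(y, X):
    """Sum of squared residuals of an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

X = np.column_stack([np.ones(n), x1, x2])
S_C = ssr(y, X)                             # combined regression
S_1 = ssr(y[:n1], X[:n1])                   # first group only
S_2 = ssr(y[n1:], X[n1:])                   # second group only

F = ((S_C - (S_1 + S_2)) / k) / ((S_1 + S_2) / (n - 2 * k))
p = stats.f.sf(F, k, n - 2 * k)             # upper-tail p-value of F(k, N1 + N2 - 2k)
print(f"Chow F = {F:.3f}, p-value = {p:.4f}")
```

A value of $F$ that is large relative to the $F(k, N_1 + N_2 - 2k)$ distribution leads to rejecting the hypothesis that the two groups share the same coefficients.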

The same result can be achieved via dummy variables.

Consider the two data sets which are being compared. Firstly there is the 'primary' data set $i = \{1, \ldots, n_1\}$ and the 'secondary' data set $i = \{n_1 + 1, \ldots, n\}$. Then there is the union of these two sets: $i = \{1, \ldots, n\}$. If there is no structural change between the primary and secondary data sets, a regression can be run over the union without the issue of biased estimators arising.

Consider the regression:

$$y_t = \beta_0 + \beta_1 x_{1t} + \beta_2 x_{2t} + \ldots + \beta_k x_{kt} + \gamma_0 D_t + \sum_{i=1}^{k} \gamma_i x_{it} D_t + \varepsilon_t,$$

which is run over $i = \{1, \ldots, n\}$.

$D_t$ is a dummy variable taking a value of 1 for $i \in \{n_1 + 1, \ldots, n\}$ and 0 otherwise.

If both data sets can be explained fully by $(\beta_0, \beta_1, \ldots, \beta_k)$, then there is no use for the dummy variable, as the data set is explained fully by the restricted equation. That is, under the assumption of no structural change we have the null and alternative hypotheses:

$$H_0: \gamma_0 = 0,\ \gamma_1 = 0,\ \ldots,\ \gamma_k = 0$$

$$H_1: \text{otherwise}$$

The null hypothesis of joint insignificance of $D$ can be tested with an F-test with $n - 2(k+1)$ degrees of freedom (DoF). That is,

$$F = \frac{(RSS^R - RSS^U)/(k+1)}{RSS^U/\text{DoF}},$$

where $RSS^R$ and $RSS^U$ are the residual sums of squares of the restricted (pooled) and unrestricted (dummy-augmented) regressions, respectively.
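
The dummy-variable form can be sketched in the same style as the earlier example. The code below, again an illustrative example under assumed simulated data rather than a fixed recipe, builds the restricted (pooled) and unrestricted (dummy-augmented) design matrices and computes the F-statistic for the joint restriction $\gamma_0 = \gamma_1 = \ldots = \gamma_k = 0$.

```python
# Minimal sketch of the dummy-variable form of the Chow test on simulated
# data (names, group sizes and the simulated break are illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n1, n2, k = 40, 35, 2                       # k regressors plus an intercept
n = n1 + n2
x1, x2 = rng.normal(size=n), rng.normal(size=n)
D = np.r_[np.zeros(n1), np.ones(n2)]        # dummy: 0 on the primary set, 1 on the secondary set
y = 1.0 + 2.0 * x1 - 0.5 * x2 + 1.5 * D + rng.normal(scale=0.3, size=n)

def ssr(y, X):
    """Sum of squared residuals of an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

X_R = np.column_stack([np.ones(n), x1, x2])          # restricted model: no dummy terms
X_U = np.column_stack([X_R, D, x1 * D, x2 * D])      # unrestricted: dummy + interaction terms
RSS_R, RSS_U = ssr(y, X_R), ssr(y, X_U)

dof = n - 2 * (k + 1)                       # n - 2(k + 1) residual degrees of freedom
F = ((RSS_R - RSS_U) / (k + 1)) / (RSS_U / dof)
p = stats.f.sf(F, k + 1, dof)
print(f"F = {F:.3f}, p-value = {p:.4f}")    # equals the first Chow test statistic on this data
```

Because the unrestricted regression with a full set of interaction terms reproduces the two separate group regressions exactly, $RSS^U = S_1 + S_2$ and $RSS^R = S_C$, so the two formulations yield the same statistic.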

Remarks

  • The global sum of squares (SSE) is often called the Restricted Sum of Squares (RSSM) as we basically test a constrained model where we have $2k$ assumptions (with $k$ the number of regressors).
  • Some software like SAS will use a predictive Chow test when the size of a subsample is less than the number of regressors.
