Berndt–Hall–Hall–Hausman algorithm

From Wikipedia, the free encyclopedia

The Berndt–Hall–Hall–Hausman (BHHH) algorithm is a numerical optimization algorithm similar to the Newton–Raphson algorithm, but it replaces the observed negative Hessian matrix with the outer product of the gradient. This approximation is based on the information matrix equality and is therefore only valid while maximizing a likelihood function.[1] The BHHH algorithm is named after the four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.[2]

Usage


If a nonlinear model is fitted to the data, one often needs to estimate coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is Q(β). Then the algorithms are iterative, defining a sequence of approximations, β_k, given by

\beta_{k+1} = \beta_k - \lambda_k A_k \frac{\partial Q}{\partial \beta}(\beta_k),

where \beta_k is the parameter estimate at step k, and \lambda_k is a parameter (called step size) which partly determines the particular algorithm. For the BHHH algorithm, \lambda_k is determined by calculations within a given iterative step, involving a line search until a point \beta_{k+1} is found satisfying certain criteria. In addition, for the BHHH algorithm, Q has the form

Q = \sum_{i=1}^{N} Q_i

and A_k is calculated using

A_k = \left[ \sum_{i=1}^{N} \frac{\partial \ln Q_i}{\partial \beta}(\beta_k) \, \frac{\partial \ln Q_i}{\partial \beta}(\beta_k)' \right]^{-1}.

In other cases, e.g. Newton–Raphson, A_k can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.[citation needed]
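As an illustrative sketch, the iteration above can be implemented in NumPy for a toy maximum-likelihood problem. The model here (i.i.d. normal data with unknown mean and log-standard-deviation) and the fixed step size are assumptions for the example: a real BHHH implementation performs the line search over \lambda_k described above. Since we maximize the log-likelihood directly, the step is added; this is equivalent to applying the formula in the text to -Q.

```python
import numpy as np

# Toy model for illustration: x_i ~ N(mu, sigma^2) with beta = (mu, s),
# sigma = exp(s), and per-observation log-likelihood
#   ln Q_i = -s - 0.5*ln(2*pi) - 0.5*((x_i - mu)/exp(s))**2.

def per_obs_score(beta, x):
    """Per-observation gradient of ln Q_i with respect to beta = (mu, s)."""
    mu, s = beta
    sigma = np.exp(s)
    z = (x - mu) / sigma
    d_mu = z / sigma           # d lnQ_i / d mu
    d_s = z ** 2 - 1.0         # d lnQ_i / d s
    return np.column_stack([d_mu, d_s])  # shape (N, 2)

def bhhh(x, beta0, lambda_=1.0, tol=1e-8, max_iter=500):
    """BHHH iteration with a fixed step size (no line search, for simplicity)."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        G = per_obs_score(beta, x)      # score of each observation, one row each
        g = G.sum(axis=0)               # gradient of the full log-likelihood
        A = np.linalg.inv(G.T @ G)      # outer-product (BHHH) approximation
        step = lambda_ * A @ g          # ascent step: we maximize the likelihood
        beta = beta + step
        if np.linalg.norm(step) < tol:  # stop when the update is negligible
            break
    return beta

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=5000)
mu_hat, s_hat = bhhh(x, beta0=[0.0, 0.0])
print(mu_hat, np.exp(s_hat))  # estimates close to the true values 2.0 and 1.5
```

Because G.T @ G is a sum of outer products, A is positive definite whenever the scores span the parameter space, so the step is always an ascent direction; this is the practical appeal of BHHH over a Newton step, whose Hessian need not be negative definite away from the optimum.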


References

  1. ^ Henningsen, A.; Toomet, O. (2011). "maxLik: A package for maximum likelihood estimation in R". Computational Statistics. 26 (3): 443–458 [p. 450]. doi:10.1007/s00180-010-0217-1.
  2. ^ Berndt, E.; Hall, B.; Hall, R.; Hausman, J. (1974). "Estimation and Inference in Nonlinear Structural Models" (PDF). Annals of Economic and Social Measurement. 3 (4): 653–665.

Further reading

  • Martin, V.; Hurn, S.; Harris, D. (2015). Econometric Modelling with Time Series, Chapter 3, "Numerical Estimation Methods". Cambridge University Press.
  • Amemiya, Takeshi (1985). Advanced Econometrics. Cambridge: Harvard University Press. pp. 137–138. ISBN 0-674-00560-0.
  • Gill, P.; Murray, W.; Wright, M. (1981). Practical Optimization. London: Harcourt Brace.
  • Gourieroux, Christian; Monfort, Alain (1995). "Gradient Methods and ML Estimation". Statistics and Econometric Models. New York: Cambridge University Press. pp. 452–458. ISBN 0-521-40551-3.
  • Harvey, A. C. (1990). The Econometric Analysis of Time Series (Second ed.). Cambridge: MIT Press. pp. 137–138. ISBN 0-262-08189-X.
