Berndt–Hall–Hall–Hausman algorithm

The Berndt–Hall–Hall–Hausman (BHHH) algorithm is a numerical optimization algorithm similar to the Newton–Raphson algorithm, but it replaces the observed negative Hessian matrix with the outer product of the gradient. This approximation is based on the information matrix equality and is therefore only valid while maximizing a likelihood function. The BHHH algorithm is named after the four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.
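The information matrix equality behind this substitution can be checked numerically. The sketch below (illustrative, not from the original article) fits a Poisson rate by maximum likelihood and compares the observed negative Hessian of the log-likelihood with the BHHH outer-product-of-gradients approximation; for a large sample evaluated at the MLE, the two agree closely.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=100_000)
lam_hat = x.mean()  # MLE of the Poisson rate

# Per-observation log-likelihood: l_i = x_i*log(lam) - lam - log(x_i!)
# Per-observation score (gradient in lam): x_i/lam - 1
score = x / lam_hat - 1.0

# Observed negative Hessian: sum_i x_i / lam^2
neg_hessian = np.sum(x) / lam_hat**2

# BHHH outer-product-of-gradients approximation: sum_i score_i^2
opg = np.sum(score**2)

print(neg_hessian, opg)  # nearly equal for large n at the MLE
```

This equivalence holds only in expectation at the true parameter of a correctly specified likelihood, which is why the BHHH approximation is tied to maximum-likelihood problems.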


Usage

If a nonlinear model is fitted to the data one often needs to estimate coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is ''Q''(''β''). Then the algorithms are iterative, defining a sequence of approximations, ''β''''k'', given by

:\beta_{k+1} = \beta_k - \lambda_k A_k \frac{\partial Q}{\partial \beta}(\beta_k),

where \beta_k is the parameter estimate at step k, and \lambda_k is a parameter (called step size) which partly determines the particular algorithm. For the BHHH algorithm ''λ''''k'' is determined by calculations within a given iterative step, involving a line-search until a point ''β''''k''+1 is found satisfying certain criteria. In addition, for the BHHH algorithm, ''Q'' has the form

:Q = \sum_{i=1}^{N} Q_i

and ''A'' is calculated using

:A_k = \left[ \sum_{i=1}^{N} \frac{\partial \ln Q_i}{\partial \beta}(\beta_k) \frac{\partial \ln Q_i}{\partial \beta}(\beta_k)' \right]^{-1}.

In other cases, e.g. Newton–Raphson, A_k can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.
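The iteration above can be sketched in code. The following minimal Python example (an illustration under assumptions, not the originators' implementation) applies a BHHH-style step with a simple backtracking line search to logistic regression, minimizing the negative log-likelihood so that the minus sign in the update matches the formula above; the function name and tolerances are hypothetical choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bhhh_logit(X, y, max_iter=100, tol=1e-8):
    """BHHH sketch for logistic regression: minimize the negative
    log-likelihood Q(beta) = -sum_i [y_i log p_i + (1-y_i) log(1-p_i)]."""
    def Q(b):
        q = sigmoid(X @ b)
        eps = 1e-12  # guard against log(0)
        return -np.sum(y * np.log(q + eps) + (1 - y) * np.log(1 - q + eps))

    n, k = X.shape
    beta = np.zeros(k)
    for _ in range(max_iter):
        p = sigmoid(X @ beta)
        G = (p - y)[:, None] * X        # per-observation gradients g_i of Q
        grad = G.sum(axis=0)            # dQ/dbeta
        A = np.linalg.inv(G.T @ G)      # outer-product approximation A_k
        step = A @ grad
        # backtracking line search for the step size lambda_k
        lam, q0 = 1.0, Q(beta)
        while Q(beta - lam * step) > q0 and lam > 1e-8:
            lam *= 0.5
        beta_new = beta - lam * step
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Usage sketch: recover coefficients from simulated data
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(5000), rng.normal(size=5000)])
beta_true = np.array([0.5, -1.0])
y = (rng.random(5000) < sigmoid(X @ beta_true)).astype(float)
beta_hat = bhhh_logit(X, y)
print(beta_hat)  # approximately beta_true
```

Because the outer product of gradients is positive semi-definite by construction, the BHHH step is always a descent direction for the negative log-likelihood, which is what makes the line search well behaved.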


See also

* Davidon–Fletcher–Powell (DFP) algorithm
* Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm


Further reading

* V. Martin, S. Hurn, and D. Harris, ''Econometric Modelling with Time Series'', Chapter 3 'Numerical Estimation Methods'. Cambridge University Press, 2015.