Powell's dog leg method

Powell's dog leg method, also called Powell's hybrid method, is an iterative optimisation algorithm for the solution of non-linear least squares problems, introduced in 1970 by Michael J. D. Powell (Powell 1970). Similarly to the Levenberg–Marquardt algorithm, it combines the Gauss–Newton algorithm with gradient descent, but it uses an explicit trust region. At each iteration, if the step from the Gauss–Newton algorithm is within the trust region, it is used to update the current solution. If not, the algorithm searches for the minimum of the objective function along the steepest descent direction, known as the Cauchy point. If the Cauchy point is outside of the trust region, it is truncated to the boundary of the latter and taken as the new solution. If the Cauchy point is inside the trust region, the new solution is taken at the intersection between the trust region boundary and the line joining the Cauchy point and the Gauss–Newton step (dog leg step) (Yuan 2000). The name of the method derives from the resemblance between the construction of the dog leg step and the shape of a dogleg hole in golf.


Formulation

Given a least squares problem in the form

: F(\boldsymbol{x}) = \frac{1}{2} \left\| \boldsymbol{f}(\boldsymbol{x}) \right\|^2 = \frac{1}{2} \sum_{i=1}^m \left( f_i(\boldsymbol{x}) \right)^2

with f_i: \mathbb{R}^n \to \mathbb{R}, Powell's dog leg method finds the optimal point \boldsymbol{x}^* = \operatorname{argmin}_{\boldsymbol{x}} F(\boldsymbol{x}) by constructing a sequence \boldsymbol{x}_k = \boldsymbol{x}_{k-1} + \boldsymbol{\delta}_k that converges to \boldsymbol{x}^*. At a given iteration, the Gauss–Newton step is given by

: \boldsymbol{\delta}_{gn} = - \left( \boldsymbol{J}^\top \boldsymbol{J} \right)^{-1} \boldsymbol{J}^\top \boldsymbol{f}(\boldsymbol{x})

where \boldsymbol{J} = \left( \frac{\partial f_i}{\partial x_j} \right) is the Jacobian matrix.
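As an illustration, the following sketch (not part of the original article) computes the Gauss–Newton step for a small hypothetical residual problem using NumPy; the residual function f and its Jacobian jac are example stand-ins, and the step is obtained by solving the normal equations rather than forming the inverse explicitly.

```python
import numpy as np

def f(x):
    # Hypothetical residual vector f(x) with m = 2, n = 2 (Rosenbrock-style residuals)
    return np.array([10.0 * (x[1] - x[0] ** 2),
                     1.0 - x[0]])

def jac(x):
    # Jacobian J = (df_i / dx_j) of the residuals above
    return np.array([[-20.0 * x[0], 10.0],
                     [-1.0,          0.0]])

x = np.array([-1.2, 1.0])
J, fx = jac(x), f(x)

# Gauss-Newton step: delta_gn = -(J^T J)^{-1} J^T f(x), computed as a linear solve
delta_gn = np.linalg.solve(J.T @ J, -J.T @ fx)
```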
The steepest descent direction is given by

: \boldsymbol{\delta}_{sd} = - \boldsymbol{J}^\top \boldsymbol{f}(\boldsymbol{x}) .

The objective function is linearised along the steepest descent direction:

: \begin{align} F(\boldsymbol{x} + t \boldsymbol{\delta}_{sd}) &\approx \frac{1}{2} \left\| \boldsymbol{f}(\boldsymbol{x}) + t \boldsymbol{J}(\boldsymbol{x}) \boldsymbol{\delta}_{sd} \right\|^2 \\ &= F(\boldsymbol{x}) + t \boldsymbol{\delta}_{sd}^\top \boldsymbol{J}^\top \boldsymbol{f}(\boldsymbol{x}) + \frac{1}{2} t^2 \left\| \boldsymbol{J} \boldsymbol{\delta}_{sd} \right\|^2 . \end{align}

To compute the value of the parameter t at the Cauchy point, the derivative of the last expression with respect to t is set equal to zero, giving

: t = - \frac{\boldsymbol{\delta}_{sd}^\top \boldsymbol{J}^\top \boldsymbol{f}(\boldsymbol{x})}{\left\| \boldsymbol{J} \boldsymbol{\delta}_{sd} \right\|^2} = \frac{\left\| \boldsymbol{\delta}_{sd} \right\|^2}{\left\| \boldsymbol{J} \boldsymbol{\delta}_{sd} \right\|^2} .

Given a trust region of radius \Delta, Powell's dog leg method selects the update step \boldsymbol{\delta}_k as equal to:
* \boldsymbol{\delta}_{gn}, if the Gauss–Newton step is within the trust region (\left\| \boldsymbol{\delta}_{gn} \right\| \le \Delta);
* \frac{\Delta}{\left\| \boldsymbol{\delta}_{sd} \right\|} \boldsymbol{\delta}_{sd}, if both the Gauss–Newton and the steepest descent steps are outside the trust region (t \left\| \boldsymbol{\delta}_{sd} \right\| \ge \Delta);
* t \boldsymbol{\delta}_{sd} + s \left( \boldsymbol{\delta}_{gn} - t \boldsymbol{\delta}_{sd} \right), with s such that \left\| \boldsymbol{\delta}_k \right\| = \Delta, if the Gauss–Newton step is outside the trust region but the steepest descent step is inside (dog leg step).
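The step selection above can be summarised in code. The following is a minimal sketch, assuming the Jacobian J, residual vector fx and Gauss–Newton step delta_gn from the previous snippet; the function name dogleg_step and the choice of trust region radius are illustrative, not prescribed by the method.

```python
import numpy as np

def dogleg_step(J, fx, delta_gn, radius):
    # Steepest descent direction and Cauchy-point parameter t
    delta_sd = -J.T @ fx
    t = np.dot(delta_sd, delta_sd) / np.dot(J @ delta_sd, J @ delta_sd)

    if np.linalg.norm(delta_gn) <= radius:
        # Gauss-Newton step lies inside the trust region: use it directly
        return delta_gn
    if t * np.linalg.norm(delta_sd) >= radius:
        # Cauchy point is also outside: truncate the steepest descent step to the boundary
        return (radius / np.linalg.norm(delta_sd)) * delta_sd
    # Dog leg step: move from the Cauchy point towards the Gauss-Newton step
    # until the trust region boundary is reached, i.e. ||a + s*b|| = radius
    a = t * delta_sd
    b = delta_gn - a
    aa, ab, bb = np.dot(a, a), np.dot(a, b), np.dot(b, b)
    s = (-ab + np.sqrt(ab ** 2 + bb * (radius ** 2 - aa))) / bb  # positive root of the quadratic in s
    return a + s * b

# Example usage (with J, fx, delta_gn computed as in the previous snippet):
# delta = dogleg_step(J, fx, delta_gn, radius=0.5)
```

In a full implementation the trust region radius would also be adjusted between iterations according to how well the linearised model predicts the actual reduction in F.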


References


Sources

* Powell (1970)
* Yuan (2000)

