In the unconstrained minimization problem, the Wolfe conditions are a set of inequalities for performing inexact line search, especially in quasi-Newton methods, first published by Philip Wolfe in 1969.
In these methods the idea is to find
:: \min_{\mathbf{x}} f(\mathbf{x})
for some
smooth f\colon\mathbb{R}^n\to\mathbb{R}. Each step often involves approximately solving the subproblem
:: \min_{\alpha} f(\mathbf{x}_k + \alpha\,\mathbf{p}_k)
where \mathbf{x}_k is the current best guess, \mathbf{p}_k \in \mathbb{R}^n is a search direction, and \alpha \in \mathbb{R} is the step length.
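Concretely, one outer iteration of this scheme can be sketched in Python. The quadratic objective, the steepest-descent choice of \mathbf{p}_k, and the crude grid search over \alpha are all illustrative assumptions, not taken from the text:

```python
# Sketch of one line-search iteration: approximately solve
# min_alpha f(x_k + alpha * p_k), then step to x_{k+1}.
# f, grad, and the grid search are hypothetical stand-ins.

def f(v):
    """Illustrative objective f(x, y) = x^2 + 4 y^2."""
    x, y = v
    return x**2 + 4.0 * y**2

def grad(v):
    """Gradient of the illustrative objective."""
    x, y = v
    return (2.0 * x, 8.0 * y)

def phi(xk, pk, alpha):
    """One-dimensional restriction phi(alpha) = f(x_k + alpha * p_k)."""
    return f((xk[0] + alpha * pk[0], xk[1] + alpha * pk[1]))

xk = (2.0, 1.0)                # current best guess x_k
gk = grad(xk)
pk = (-gk[0], -gk[1])          # search direction p_k (steepest descent here)

# A crude grid search stands in for "approximately solving" the subproblem.
alpha = min((a / 100 for a in range(1, 101)), key=lambda a: phi(xk, pk, a))
x_next = (xk[0] + alpha * pk[0], xk[1] + alpha * pk[1])
```

In practice the subproblem is never solved this exhaustively; the point of the inexact line searches discussed next is to accept a cheap \alpha that merely decreases f enough.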
The inexact line searches provide an efficient way of computing an acceptable step length \alpha that reduces the objective function 'sufficiently', rather than minimizing the objective function over \alpha \in \mathbb{R}^+ exactly. A line search algorithm can use the Wolfe conditions as a requirement for any guessed \alpha, before finding a new search direction \mathbf{p}_k.
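A standard way to realize such an inexact search is backtracking against the sufficient-decrease (Armijo) inequality. The following is a minimal sketch; the defaults c1 = 1e-4, the halving factor, and the initial step 1.0 are conventional choices, assumed here rather than taken from the text:

```python
def backtracking_armijo(f, grad, xk, pk, c1=1e-4, shrink=0.5, alpha0=1.0):
    """Shrink alpha until f(x_k + alpha p_k) <= f(x_k) + c1 alpha p_k^T grad f(x_k).

    Vectors are plain lists of floats; c1, shrink, and alpha0 are
    conventional but hypothetical defaults.
    """
    fk = f(xk)
    # Directional derivative p_k^T grad f(x_k); must be negative for descent.
    slope = sum(p * g for p, g in zip(pk, grad(xk)))
    if slope >= 0:
        raise ValueError("p_k must be a descent direction")
    alpha = alpha0
    # Backtrack until the sufficient-decrease test passes.
    while f([x + alpha * p for x, p in zip(xk, pk)]) > fk + c1 * alpha * slope:
        alpha *= shrink
    return alpha
```

For example, on f(x) = x^2 with x_k = 3 and p_k = -6 (steepest descent), the first halving already satisfies the test, so the search returns alpha = 0.5.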
Armijo rule and curvature
A step length \alpha_k is said to satisfy the ''Wolfe conditions'', restricted to the direction \mathbf{p}_k, if the following two inequalities hold:
: i) f(\mathbf{x}_k + \alpha_k\mathbf{p}_k) \le f(\mathbf{x}_k) + c_1\alpha_k\,\mathbf{p}_k^{\mathrm T}\nabla f(\mathbf{x}_k),
: ii) -\mathbf{p}_k^{\mathrm T}\nabla f(\mathbf{x}_k + \alpha_k\mathbf{p}_k) \le -c_2\,\mathbf{p}_k^{\mathrm T}\nabla f(\mathbf{x}_k),
with 0 < c_1 < c_2 < 1. Inequality i) is the Armijo rule (sufficient decrease) and ii) is the curvature condition.
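A direct check of the two Wolfe inequalities, sufficient decrease and curvature, can be sketched as follows; the defaults c1 = 1e-4 and c2 = 0.9 are conventional choices for quasi-Newton methods, used here only as illustrative assumptions:

```python
def satisfies_wolfe(f, grad, xk, pk, alpha, c1=1e-4, c2=0.9):
    """Return True if step length alpha meets both Wolfe conditions.

    Vectors are plain lists of floats; c1 and c2 (0 < c1 < c2 < 1)
    are hypothetical defaults, not prescribed by the text.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    step = [x + alpha * p for x, p in zip(xk, pk)]
    # i) Sufficient decrease (Armijo rule).
    armijo = f(step) <= f(xk) + c1 * alpha * dot(pk, grad(xk))
    # ii) Curvature condition: the slope along p_k has flattened enough.
    curvature = -dot(pk, grad(step)) <= -c2 * dot(pk, grad(xk))
    return armijo and curvature
```

On f(x) = x^2 with x_k = 1 and p_k = -2, the exact minimizing step alpha = 0.5 passes both tests, while a tiny step such as alpha = 1e-6 decreases f but fails the curvature condition, which is precisely the kind of too-short step that condition ii) rules out.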