In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that points towards a local minimum $\mathbf{x}^*$ of an objective function $f : \mathbb{R}^n \to \mathbb{R}$.
Computing $\mathbf{x}^*$ by an iterative method, such as line search, defines a descent direction $\mathbf{p}_k \in \mathbb{R}^n$ at the $k$th iterate to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot , \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
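The defining inequality is straightforward to check in code. Below is a minimal Python sketch (the function name is_descent_direction and the use of NumPy are illustrative assumptions, not part of the article):

    import numpy as np

    def is_descent_direction(grad_fk, p_k):
        """Check the defining condition <p_k, grad f(x_k)> < 0."""
        return float(np.dot(p_k, grad_fk)) < 0.0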
Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle < 0$.
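For a concrete instance of this fact, the sketch below (assuming the illustrative objective $f(x) = x_1^2 + 3x_2^2$, which is not from the article) verifies that the negative gradient satisfies the descent condition and that a small step along it reduces $f$:

    import numpy as np

    f = lambda x: x[0]**2 + 3 * x[1]**2               # illustrative objective
    grad_f = lambda x: np.array([2 * x[0], 6 * x[1]])

    x_k = np.array([1.0, -2.0])
    p_k = -grad_f(x_k)                                # negative gradient
    assert np.dot(p_k, grad_f(x_k)) < 0               # descent condition holds
    assert f(x_k + 1e-3 * p_k) < f(x_k)               # a small step reduces f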
Numerous methods exist to compute descent directions, all with differing merits, such as gradient descent or the conjugate gradient method.
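As a minimal sketch of the first of these, the loop below takes the negative gradient as the descent direction at every iterate (the fixed step size is an assumption made for brevity; practical implementations typically pair the direction with a line search):

    import numpy as np

    def gradient_descent(grad_f, x0, step=0.1, iters=100):
        """Fixed-step gradient descent: p_k = -grad f(x_k) at each iterate."""
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            x = x + step * (-grad_f(x))   # move along the descent direction
        return x

    # For f(x) = x^2 (so grad f(x) = 2x), the iterates approach the minimum at 0.
    print(gradient_descent(lambda x: 2 * x, np.array([3.0])))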
More generally, if $P$ is a positive definite matrix, then $\mathbf{p}_k = -P \nabla f(\mathbf{x}_k)$ is a descent direction at $\mathbf{x}_k$, since $\langle -P \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\nabla f(\mathbf{x}_k)^{\mathsf{T}} P \nabla f(\mathbf{x}_k) < 0$ whenever $\nabla f(\mathbf{x}_k) \neq 0$. This generality is used in preconditioned gradient descent methods.
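A short numerical check of this claim (the matrix $P$ and the gradient vector here are illustrative choices):

    import numpy as np

    grad_fk = np.array([2.0, -12.0])    # illustrative gradient at x_k
    P = np.array([[2.0, 0.5],
                  [0.5, 1.0]])          # symmetric positive definite (illustrative)
    p_k = -P @ grad_fk                  # preconditioned descent direction
    assert np.dot(p_k, grad_fk) < 0     # -grad^T P grad < 0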
See also
* Directional derivative