In optimization, a descent direction is a vector \mathbf{p}\in\mathbb R^n that points towards a local minimum \mathbf{x}^* of an objective function f:\mathbb R^n\to\mathbb R.

Computing \mathbf{x}^* by an iterative method, such as line search, defines a descent direction \mathbf{p}_k\in\mathbb R^n at the kth iterate to be any \mathbf{p}_k such that \langle\mathbf{p}_k,\nabla f(\mathbf{x}_k)\rangle < 0, where \langle\cdot,\cdot\rangle denotes the inner product. The motivation for such an approach is that small steps along \mathbf{p}_k guarantee that f is reduced, by Taylor's theorem.

Using this definition, the negative of a non-zero gradient is always a descent direction, as \langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k)\rangle = -\langle\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k)\rangle < 0.

Numerous methods exist to compute descent directions, all with differing merits, such as gradient descent or the conjugate gradient method.
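To make the definition concrete, here is a minimal Python sketch (built around an assumed quadratic test function, not anything specified in this article) that checks the descent condition \langle\mathbf{p}_k,\nabla f(\mathbf{x}_k)\rangle < 0 for the negative gradient and confirms that a small step along it reduces f:

import numpy as np

# Assumed test problem: f(x) = 0.5 x^T A x - b^T x with A symmetric
# positive definite, so that grad f(x) = A x - b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad_f(x):
    return A @ x - b

def is_descent_direction(p, x):
    # p is a descent direction at x iff <p, grad f(x)> < 0.
    return p @ grad_f(x) < 0

x_k = np.array([2.0, 2.0])
p_k = -grad_f(x_k)                  # negative gradient
assert is_descent_direction(p_k, x_k)

# By Taylor's theorem, a sufficiently small step along p_k reduces f.
alpha = 1e-3
assert f(x_k + alpha * p_k) < f(x_k)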
More generally, if P is a positive definite matrix, then \mathbf{p}_k = -P\nabla f(\mathbf{x}_k) is a descent direction at \mathbf{x}_k, since \langle -P\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k)\rangle = -\nabla f(\mathbf{x}_k)^\top P\,\nabla f(\mathbf{x}_k) < 0 whenever the gradient is non-zero. This generality is used in preconditioned gradient descent methods.
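As a sketch of this more general fact, the snippet below builds an arbitrary positive definite P (the construction M M^T + I is an assumed illustrative choice) and checks that p = -P g satisfies the descent condition for a non-zero gradient g:

import numpy as np

rng = np.random.default_rng(0)

# P = M M^T + I is symmetric positive definite by construction.
M = rng.standard_normal((2, 2))
P = M @ M.T + np.eye(2)

g = np.array([0.5, -1.5])   # stands in for a non-zero gradient vector
p = -P @ g                  # preconditioned descent direction

# <p, g> = -g^T P g < 0 because P is positive definite.
assert p @ g < 0

Taking P as the identity recovers the plain negative-gradient direction, while Newton-type methods correspond to choosing P as (an approximation of) the inverse Hessian when that matrix is positive definite.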


See also

* Directional derivative

