In optimization, a gradient method is an algorithm to solve problems of the form
$$\min_{x \in \mathbb{R}^{n}} f(x)$$

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
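As a minimal illustration, the following sketch implements the simplest gradient method, fixed-step gradient descent. The step size, tolerance, and test function are illustrative choices, not prescribed by any particular source.

```python
import numpy as np

def gradient_method(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Generic gradient method: repeatedly step along the negative
    gradient of f at the current point until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                  # search direction is -g
        if np.linalg.norm(g) < tol:  # (approximate) stationary point
            break
        x = x - step * g             # fixed step size, for simplicity
    return x

# Example: minimize f(x) = ||x||^2, whose gradient is 2x.
x_min = gradient_method(lambda x: 2 * x, x0=[3.0, -4.0])
print(x_min)  # approaches the minimizer [0, 0]
```

In practice the fixed step is usually replaced by a line search or an adaptive schedule; methods such as conjugate gradient differ only in how the search direction is built from the gradient.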
See also
- Gradient descent
- Stochastic gradient descent
- Coordinate descent
- Frank–Wolfe algorithm
- Landweber iteration
- Random coordinate descent
- Conjugate gradient method
- Derivation of the conjugate gradient method
- Nonlinear conjugate gradient method
- Biconjugate gradient method
- Biconjugate gradient stabilized method