Gradient descent is an iterative first-order optimization method for finding a local minimum (or maximum) of a differentiable multivariable function F : ℝ^{n} → ℝ. At each step, the algorithm moves in the direction opposite to the gradient (or an approximate gradient) of the function at the current point. Note:

The algorithm finds a local minimum (maximum), which is not necessarily the global minimum (maximum).

The process is iterative, and a solution may not be found if the number of iterations needed exceeds the maximum number of steps.

Convergence is not guaranteed if the function F is not differentiable or not convex (concave for a maximum).

Convergence is not guaranteed if ∇F is not Lipschitz continuous.

The choice of the starting point in the iterative process is random. This can affect whether a solution is found within the predetermined maximum number of steps, and it also determines which local minimum (maximum) is found if there is more than one.
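The iteration described above can be sketched as follows. This is a minimal illustration, not the calculator's implementation; the learning rate, step limit, and tolerance values are arbitrary example choices.

```python
import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, max_steps=1000, tol=1e-8):
    """Minimize a differentiable F given its gradient function.

    grad: callable returning the gradient of F at a point.
    x0: starting point; as noted above, its choice affects which
        local minimum is reached and whether the step limit suffices.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_steps):
        step = learning_rate * grad(x)
        x = x - step          # move opposite the gradient
        if np.linalg.norm(step) < tol:  # step size ~ 0: treat as converged
            break
    return x

# Example: F(x, y) = (x - 1)^2 + (y + 2)^2 with gradient (2(x-1), 2(y+2));
# this convex F has its single minimum at (1, -2).
minimum = gradient_descent(lambda p: np.array([2 * (p[0] - 1), 2 * (p[1] + 2)]),
                           x0=[0.0, 0.0])
```

To search for a maximum instead, step in the direction of the gradient (add the step rather than subtract it).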
