  • Machine Learning
    Optimization: Gradient Descent
    Gradient descent is an iterative first-order optimization method for finding a local minimum (or, as gradient ascent, a local maximum) of a differentiable multivariable function F : ℝⁿ → ℝ.
    The algorithm takes repeated steps in the direction opposite to the gradient (or an approximate gradient) of the function at the current point; for a maximum, the steps follow the gradient instead.
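    Written as an update rule (a standard formulation; the step-size symbol γ is not named on this page):

        xₖ₊₁ = xₖ − γ ∇F(xₖ),   γ > 0

    For a maximum the sign flips: xₖ₊₁ = xₖ + γ ∇F(xₖ).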
    Note:
    • The algorithm finds a local minimum (maximum), which is not necessarily the global minimum (maximum).
    • The process is iterative, and a solution might not be found if the number of iterations needed exceeds the maximum number of steps (a minimal sketch of the procedure follows this list).
    • The result is not guaranteed if the function F is not differentiable or not convex (concave for a maximum).
    • The result is not guaranteed if ∇F is not Lipschitz continuous.
    • The starting point of the iterative process is chosen at random.
      This may affect whether a solution is found within the predetermined maximum number of steps.
      It also affects which local minimum (maximum) is found if there is more than one.
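    A minimal sketch of the procedure in Python, assuming a fixed step size gamma, a stopping tolerance eps, and a step cap max_steps; these parameter names and the example function are illustrative, not taken from this page:

        def gradient_descent(grad_f, x0, gamma=0.1, eps=1e-8, max_steps=10_000):
            """Iterate x <- x - gamma * grad_f(x) until the update is smaller than eps."""
            x = list(x0)
            err = float("inf")
            for step in range(1, max_steps + 1):
                g = grad_f(x)
                x_new = [xi - gamma * gi for xi, gi in zip(x, g)]
                # Stop when the update (and hence the scaled gradient) is small enough.
                err = max(abs(a - b) for a, b in zip(x_new, x))
                x = x_new
                if err < eps:
                    return x, step, err
            return x, max_steps, err  # may not have converged (see notes above)

        # Example: F(x, y) = (x - 1)^2 + (y + 2)^2, with gradient
        # (2(x - 1), 2(y + 2)); the unique minimum is at (1, -2).
        grad = lambda p: [2 * (p[0] - 1), 2 * (p[1] + 2)]
        point, steps, err = gradient_descent(grad, x0=[0.0, 0.0])
        print(point, steps, err)

    With these settings the example converges in well under a hundred steps; a larger gamma can overshoot and diverge, which is why the step cap matters.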
    [Interactive solver: enter the function F whose minimum you are looking for; the page returns the minimum of F, the point at which it is reached, the number of steps taken, the error bound ε, and the gradient used.]
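    When only F itself is entered, the gradient a solver reports can be approximated numerically instead of derived analytically; a hedged sketch using central finite differences (the step h and the sample function are assumptions for illustration):

        def numerical_gradient(f, x, h=1e-6):
            """Central-difference approximation of the gradient of f at x."""
            grad = []
            for i in range(len(x)):
                xp = list(x); xp[i] += h  # nudge coordinate i up
                xm = list(x); xm[i] -= h  # nudge coordinate i down
                grad.append((f(xp) - f(xm)) / (2 * h))
            return grad

        # Example with F(x, y) = (x - 1)^2 + (y + 2)^2:
        f = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2
        print(numerical_gradient(f, [0.0, 0.0]))  # approximately [-2.0, 4.0]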


    See also: 1st and 2nd order derivatives, Directional Derivative