Optimization techniques for Gradient Descent

Gradient descent is an iterative optimization algorithm used to find the minimum of a function. The general idea is to initialize the parameters to random values and then take small steps in the direction of the "slope" (the negative gradient) at each iteration. Gradient descent is widely used in supervised learning to minimize the error function and find the optimal values for the parameters.
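
As a minimal, self-contained sketch, plain gradient descent on a one-dimensional quadratic loss can be written as follows; the loss function, learning rate, and iteration count here are illustrative assumptions, not part of any particular method.

    import numpy as np

    # Plain gradient descent on the illustrative loss f(w) = (w - 3)^2,
    # whose gradient is 2 (w - 3). The minimizer is w = 3.
    def grad(w):
        return 2.0 * (w - 3.0)

    w = np.random.randn()         # initialize the parameter randomly
    alpha = 0.1                   # learning rate (step size)
    for i in range(100):
        w = w - alpha * grad(w)   # small step against the slope

    print(w)                      # converges towards 3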

Various extensions of the gradient descent algorithm have been designed. Some of them are discussed below:

  • Momentum method: This method accelerates gradient descent by taking into consideration an exponentially weighted average of the gradients. Using averages makes the algorithm converge towards the minimum faster, since gradient components along inconsistent directions cancel out. The pseudocode for the momentum method is given below; a runnable sketch appears after this list.
    V = 0
    for each iteration i:
        compute dW
        V = β V + (1 - β) dW
        W = W - α V
    

    dW and V are analogous to acceleration and velocity respectively. α is the learning rate, and β is normally kept at 0.9.

  • RMSprop: RMSprop was proposed by the University of Toronto's Geoffrey Hinton. The intuition is to apply an exponentially weighted average to the second moment of the gradients (dW²) and to divide the update by its square root, where ε is a small constant added for numerical stability. The pseudocode for this is as follows; a runnable sketch appears after this list.
    S = 0
    for each iteration i:
        compute dW
        S = β S + (1 - β) dW²
        W = W - α dW / (√S + ε)
    
  • Adam Optimization: The Adam optimization algorithm combines the momentum method and RMSprop, along with bias correction. The pseudocode for this approach is as follows; a runnable sketch appears after this list.
    V = 0
    S = 0
    for each iteration i:
        compute dW
        V = β1 V + (1 - β1) dW
        S = β2 S + (1 - β2) dW²
        V_corrected = V / (1 - β1^i)
        S_corrected = S / (1 - β2^i)
        W = W - α V_corrected / (√S_corrected + ε)
    

    Kingma and Ba, the proposers of Adam, recommended the following values for the hyperparameters.

    α = 0.001
    β1 = 0.9
    β2 = 0.999
    ε = 10⁻⁸
    
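The pseudocode above translates almost directly into NumPy. Below is a minimal sketch of the momentum method; the quadratic loss, its gradient, the parameter dimensionality, and the iteration count are illustrative assumptions.

    import numpy as np

    # Illustrative quadratic loss f(W) = ||W - target||^2; its gradient
    # is 2 (W - target). Any differentiable loss would work the same way.
    target = np.array([3.0, -2.0])

    def compute_dW(W):
        return 2.0 * (W - target)

    # Momentum method
    W = np.random.randn(2)               # random initialization
    V = np.zeros_like(W)                 # running average of gradients
    alpha, beta = 0.1, 0.9               # learning rate, momentum term
    for i in range(200):
        dW = compute_dW(W)
        V = beta * V + (1 - beta) * dW   # exponentially weighted average
        W = W - alpha * V                # step along the averaged gradient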
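
A corresponding sketch of RMSprop, reusing compute_dW from the momentum example above; the hyperparameter values are illustrative choices.

    # RMSprop
    W = np.random.randn(2)
    S = np.zeros_like(W)                         # average of squared gradients
    alpha, beta, eps = 0.01, 0.9, 1e-8
    for i in range(200):
        dW = compute_dW(W)
        S = beta * S + (1 - beta) * dW**2        # second-moment average
        W = W - alpha * dW / (np.sqrt(S) + eps)  # per-parameter scaled step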
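
Finally, a sketch of Adam with bias correction, again reusing compute_dW and using the hyperparameter values recommended by Kingma and Ba; the loop length is an arbitrary choice.

    # Adam: momentum + RMSprop with bias correction
    W = np.random.randn(2)
    V = np.zeros_like(W)
    S = np.zeros_like(W)
    alpha, beta1, beta2, eps = 0.001, 0.9, 0.999, 1e-8
    for i in range(1, 201):                  # i starts at 1 for bias correction
        dW = compute_dW(W)
        V = beta1 * V + (1 - beta1) * dW     # first moment (momentum)
        S = beta2 * S + (1 - beta2) * dW**2  # second moment (RMSprop)
        V_hat = V / (1 - beta1**i)           # bias-corrected first moment
        S_hat = S / (1 - beta2**i)           # bias-corrected second moment
        W = W - alpha * V_hat / (np.sqrt(S_hat) + eps)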

