Gradient descent is the process of finding a local minimum of a function by repeatedly stepping in the direction of the negative gradient at the current point. Notationally, this is described in the following way:

a_{i+1} = a_i - \eta \nabla f(a_i).

Here, a_i is the i-th iterate, \eta is the step size (often called the learning rate), and f is the function we want to minimize.
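The update rule above can be sketched in a few lines of Python. The function names, the test function f(x) = (x - 3)^2, and the parameter values here are illustrative choices, not part of the original note.

```python
def gradient_descent(grad, a0, eta=0.1, steps=100):
    """Iterate a_{i+1} = a_i - eta * grad(a_i) and return the final point."""
    a = a0
    for _ in range(steps):
        a = a - eta * grad(a)
    return a

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
# The minimum is at x = 3, so the iterates should converge there.
minimum = gradient_descent(lambda x: 2 * (x - 3), a0=0.0)
```

Note that \eta must be small enough for the iteration to converge; too large a step size can overshoot the minimum and diverge.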

09:35 Saturday 15 May 2021