The previous observation is the basis of the method of steepest descent, also referred to simply as gradient descent (GD). One starts with an initial guess $x_0$ for a minimum of $F$ and computes new approximations according to
\[
  x_{k+1} = x_k - \gamma_k \nabla F(x_k), \qquad k \ge 0.
\]
The parameter $\gamma_k$ is referred to as the step length or, in the context of machine learning, the learning rate.
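To make the iteration concrete, here is a minimal sketch of the update for a fixed step length $\gamma_k = \gamma$, stopping once the gradient norm falls below a tolerance. The function name `gradient_descent`, the parameters `gamma`, `tol`, `max_iter`, and the quadratic objective in the example are illustrative assumptions, not taken from the text.

```python
import numpy as np

def gradient_descent(grad_F, x0, gamma=1e-3, tol=1e-8, max_iter=100_000):
    """Iterate x_{k+1} = x_k - gamma * grad_F(x_k) until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_F(x)
        if np.linalg.norm(g) < tol:  # stop near a stationary point
            break
        x = x - gamma * g            # the steepest-descent update
    return x

# Illustrative example (not from the text): F(x, y) = (x - 1)^2 + 10 * y^2
grad_F = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * x[1]])
x_min = gradient_descent(grad_F, x0=[5.0, 3.0], gamma=0.05)
print(x_min)  # approximately [1, 0], the minimizer of this F
```

In practice the step length is often chosen adaptively (for instance by a line search) rather than held fixed; the constant-$\gamma$ version above is only the simplest instance of the scheme.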