What Exactly Is the Step Size in the Gradient Descent Method?

SUMMARY

The step size α in the gradient descent method is a crucial tuning parameter that sets the magnitude of the update to the position vector. It appears in the update formula $$ x_{n+1} = x_n - \alpha \nabla f(x_n), $$ where a well-chosen α gives efficient convergence to a local or global minimum. If α is too small, the algorithm takes excessively long to converge; if it is too large, the iteration risks overshooting the minimum. The step size can also be adjusted dynamically during optimization, for example based on the gradient's magnitude.

PREREQUISITES
  • Understanding of gradient descent algorithm
  • Familiarity with vector calculus
  • Knowledge of optimization techniques in machine learning
  • Basic comprehension of tuning parameters in algorithms
NEXT STEPS
  • Research "Adaptive Learning Rate Methods in Gradient Descent"
  • Study "Learning Rate Schedules for Gradient Descent"
  • Explore "Gradient Descent Variants: Stochastic and Mini-batch"
  • Read "Understanding the Convergence of Gradient Descent"
USEFUL FOR

Data scientists, machine learning engineers, and anyone involved in optimizing algorithms using gradient descent will benefit from this discussion.

Dario56
Gradient descent is a numerical optimization method for finding a local or global minimum of a function. It is given by the following formula: $$ x_{n+1} = x_n - \alpha \nabla f(x_n) $$ There is countless content on the internet about the use of this method in machine learning. However, there is one thing I don't understand and couldn't find an answer to, even though it is basic.

What exactly is the step size ##\alpha##?

Wikipedia states that it is a tuning parameter in the optimization algorithm, which I understand, but not enough is said about it to count as a definition. Dimensional analysis (the gradient ##\nabla f## has dimensions ##\frac{\Delta y}{\Delta x}##, and the update ##\alpha \nabla f(x_n)## must have the dimensions of ##x##) shows that the dimensions of ##\alpha## should be ##\frac{(\Delta x)^2}{\Delta y}##, which I am not sure how to interpret.
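
For concreteness, here is a minimal sketch of the update rule above in Python (the function ##f##, the starting point, and the value of ##\alpha## are all made-up examples):

```python
import numpy as np

# Minimal sketch of the update x_{n+1} = x_n - alpha * grad f(x_n).
# f(x, y) = x^2 + 2*y^2 is a toy example with its minimum at (0, 0).
def grad_f(x):
    return np.array([2.0 * x[0], 4.0 * x[1]])

x = np.array([3.0, -2.0])   # arbitrary starting point
alpha = 0.1                 # the step size in question

for _ in range(100):
    x = x - alpha * grad_f(x)   # move against the gradient, scaled by alpha

print(x)  # approaches the minimum at (0, 0)
```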
 
jedishrfu
Because it’s a tuning parameter, you must choose one wisely. Too small and your algorithm will run too long; too large and you will miss the valley or hill you’re trying to locate.

Here’s some discussion on it:

https://blog.datumbox.com/tuning-the-learning-rate-in-gradient-descent/

The blogger says it may be obsolete theory but I think it may still apply to what you’re asking about.
 
jedishrfu said:
Because it’s a tuning parameter, you must choose one wisely. Too small and your algorithm will run too long; too large and you will miss the valley or hill you’re trying to locate.

Here’s some discussion on it:

https://blog.datumbox.com/tuning-the-learning-rate-in-gradient-descent/

The blogger says it may be obsolete theory but I think it may still apply to what you’re asking about.
Thank you. I've figured it out in the meantime.

I had a problem understanding this parameter because I didn't look at the gradient descent equation in vector form, and it should be seen in that light since the gradient is a vector-valued function.

The parameter ##\alpha## basically defines how far along the direction of the gradient vector we want to go. If the parameter has the value 0.5, for example, it means we move opposite to the gradient vector (opposite because we subtract it from the position vector ##x_n##) by a length equal to 0.5 times the magnitude of the gradient vector at the point ##x_n##.
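
For concreteness, a worked example with made-up numbers: if ##x_n = (1, 2)##, ##\nabla f(x_n) = (4, 2)## and ##\alpha = 0.5##, then $$ x_{n+1} = (1, 2) - 0.5\,(4, 2) = (-1, 1). $$ The step taken has length ##0.5 \lVert (4, 2) \rVert = \sqrt{5}##, i.e. half the gradient's length, in the direction opposite to the gradient.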

Its value can be changed during optimization. If it is too big, we can miss the minimum; if it is too small, it can take too many iterations to converge.
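
A toy illustration of that trade-off, using ##f(x) = x^2## with gradient ##2x## (all the ##\alpha## values below are arbitrary choices):

```python
# Toy illustration on f(x) = x^2; the update is x <- x - alpha * 2x.
def run(alpha, steps=20, x=5.0):
    for _ in range(steps):
        x = x - alpha * 2.0 * x   # gradient descent update
    return x

print(run(0.01))  # too small: still far from the minimum at 0 after 20 steps
print(run(0.4))   # well chosen: essentially at the minimum
print(run(1.1))   # too big: the iterates blow up and miss the minimum entirely
```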

I would say that if the magnitude of the gradient is big, the step size can be bigger, and if the gradient is small, that means we are close, so we need to make the step size smaller in order not to miss the minimum we are close to.
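
Note that even for a fixed ##\alpha##, the actual step length ##\alpha \lVert \nabla f(x_n) \rVert## already shrinks as the gradient shrinks near a minimum. One standard way to adapt ##\alpha## itself during optimization is a backtracking line search; below is a minimal sketch (the initial step, shrink factor, and acceptance constant are conventional but arbitrary choices, not something from this thread):

```python
import numpy as np

# Backtracking line search: start from a large trial alpha and shrink it
# until the step decreases f "enough" (the Armijo condition).
def backtracking_step(f, grad_f, x, alpha0=1.0, shrink=0.5, c=1e-4):
    g = grad_f(x)
    alpha = alpha0
    # Shrink alpha until f decreases by at least c * alpha * ||g||^2.
    while f(x - alpha * g) > f(x) - c * alpha * np.dot(g, g):
        alpha *= shrink
    return x - alpha * g

f = lambda x: x[0] ** 2 + 2.0 * x[1] ** 2          # same toy function as above
grad_f = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])

x = np.array([3.0, -2.0])
for _ in range(50):
    x = backtracking_step(f, grad_f, x)
print(x)  # close to the minimum at (0, 0)
```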
 
