Why Is the First Derivative Zero in Least Squares Optimization?

  • Context: Undergrad
  • Thread starter: Amany Gouda
  • Tags: Method, Square

Discussion Overview

The discussion revolves around the concept of least squares optimization, specifically addressing why the first derivative of the error summation is equal to zero at the minimum point. Participants explore the theoretical underpinnings of this phenomenon.

Discussion Character

  • Exploratory, Technical explanation, Conceptual clarification, Debate/contested

Main Points Raised

  • Some participants suggest that the first derivative is zero because the method seeks to minimize the sum of squared errors, and at a minimum or maximum of a smooth function the derivative is zero (a small numerical check of this appears after this list).
  • Others express a desire for proof of this assertion, questioning whether it is established that the derivative is zero at an extremum.
  • There is a mention of a proof existing, but some participants indicate they have not fully understood it.
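As a quick check of the first point, here is a minimal Python sketch (an illustration with synthetic data, not taken from the thread) that fits a line by least squares and confirms that the gradient of the sum of squared errors is numerically zero at the fitted coefficients:

```python
import numpy as np

# Hypothetical synthetic data: points scattered around y = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Least-squares fit of a line y ~ a*x + b (np.polyfit returns slope first)
a, b = np.polyfit(x, y, deg=1)

def sse_gradient(a, b):
    """Gradient of S(a, b) = sum((y - (a*x + b))**2) with respect to (a, b)."""
    residuals = y - (a * x + b)
    return np.array([
        -2.0 * np.sum(residuals * x),  # dS/da
        -2.0 * np.sum(residuals),      # dS/db
    ])

# Both components come out at floating-point round-off scale: the
# "first derivative equals zero" condition, seen numerically.
print(sse_gradient(a, b))
```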

Areas of Agreement / Disagreement

Participants generally agree on the reasoning for the derivative being zero at a minimum, but the question of whether a formal proof exists, and whether it has been understood, is left unresolved in the thread.

Contextual Notes

Some participants express uncertainty regarding the proof and its logical steps, indicating a potential gap in understanding the theoretical framework.

Amany Gouda:
Hello Sir,

I have been studying the theory of least squares, and I found that the derivative of the error sum between the predicted line points and the true data is set equal to zero. Why is the first derivative equal to zero?
 
Amany Gouda said:
Hello Sir,

I have been studying the theory of least squares, and I found that the derivative of the error sum between the predicted line points and the true data is set equal to zero. Why is the first derivative equal to zero?
I'm partly guessing at exactly what you did, but I suggest it is because the method finds the line that minimises the sum of squared errors, and when a smooth function is at a maximum or minimum the slope (derivative) of the function is zero. A worked version of this for a straight-line fit is sketched below.
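To make that concrete, here is a minimal sketch (an illustration, not from the thread itself) for fitting a straight line $y \approx ax + b$ to data points $(x_i, y_i)$. The method minimises the sum of squared errors

$$S(a, b) = \sum_{i=1}^{n} \left( y_i - a x_i - b \right)^2 .$$

Since $S$ is a smooth function of $a$ and $b$, any minimum must occur where both first partial derivatives vanish:

$$\frac{\partial S}{\partial a} = -2 \sum_{i=1}^{n} x_i \left( y_i - a x_i - b \right) = 0, \qquad \frac{\partial S}{\partial b} = -2 \sum_{i=1}^{n} \left( y_i - a x_i - b \right) = 0 .$$

These are the normal equations; solving them for $a$ and $b$ gives the least-squares line. Setting the first derivative of the error sum to zero is therefore not an extra assumption but the condition that locates the minimum.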
 
You are right; I share your opinion on the answer. But is there a proof of this fact?
 
Amany Gouda said:
You are right; I share your opinion on the answer. But is there a proof of this fact?
A proof of which fact? That at an extremum the derivative is zero?
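For reference, here is a sketch of the standard argument (Fermat's theorem on stationary points), stated for a minimum at an interior point $x_0$ of the domain of a function $f$ that is differentiable at $x_0$. Since $f(x_0 + h) \ge f(x_0)$ for all sufficiently small $h$,

$$\lim_{h \to 0^{+}} \frac{f(x_0 + h) - f(x_0)}{h} \ge 0 \qquad \text{and} \qquad \lim_{h \to 0^{-}} \frac{f(x_0 + h) - f(x_0)}{h} \le 0 .$$

Differentiability at $x_0$ means both one-sided limits equal $f'(x_0)$, so $f'(x_0) = 0$. The argument for a maximum is the same with the inequalities reversed.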
 
Unfortunately, there is a proof, but I didn't manage to get to it.
 
Amany Gouda said:
Unfortunately, there is a proof, but I didn't manage to get to it.
What does this mean? Did you find a proof but were unable to follow the logic of it?
 
