Linear Regression: Pros and cons of Normal vs. simplified methods?

  • #1
I'm currently looking at a linear regression handout from uni, and it gives two methods for calculating the regression equation. The normal one finds a and b for y = a + bx; the equations for a and b are given in the handout, but I'll assume you're familiar with them. The simplified one uses

[itex]y = Bx + (\overline{y} - B\overline{x})[/itex]​

The two produce slightly different values, and I assume the normal one is more precise than the simplified one.

What I'd like to know is which one would be best to use when? Is the difference negligible?

Thanks, and sorry if I've posted this in the wrong forum.
 

Answers and Replies

  • #2
The two are mathematically identical, so the difference is likely roundoff error due to the difference in computation.

[itex]y = Bx + (\overline{y} - B\overline{x})[/itex]
[itex]y = Bx + A[/itex]
[itex]y = A + Bx[/itex]

where the second step uses the definition of the intercept, [itex]A = \overline{y} - B\overline{x}[/itex].

Note that the standard least squares estimates are the best linear unbiased estimates (as well as the maximum likelihood estimates), provided that the model assumptions are met. In this case, you would rarely want to use anything else.
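To see the equivalence numerically, here is a minimal sketch (with made-up data, since the handout's data isn't shown) that computes the intercept both ways: once directly from the normal equations, and once via the simplified form a = ȳ − Bx̄. Any difference is pure floating-point roundoff.

```python
# Made-up sample data for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(xs)

sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# "Normal" method: solve the normal equations directly for both
# coefficients:  b = (n·Σxy − Σx·Σy) / (n·Σx² − (Σx)²)
#                a = (Σy·Σx² − Σx·Σxy) / (n·Σx² − (Σx)²)
denom = n * sxx - sx * sx
b_normal = (n * sxy - sx * sy) / denom
a_normal = (sy * sxx - sx * sxy) / denom

# "Simplified" method: slope from centered sums, then the intercept
# from the sample means:  a = ȳ − B·x̄
x_bar = sx / n
y_bar = sy / n
b_simple = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
            / sum((x - x_bar) ** 2 for x in xs))
a_simple = y_bar - b_simple * x_bar

# The two routes are algebraically identical; the printed differences
# are at the level of floating-point roundoff.
print(abs(a_normal - a_simple))
print(abs(b_normal - b_simple))
```

Either route is fine in practice; the centered (simplified) form tends to be slightly better behaved numerically because it avoids subtracting large, nearly equal sums.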
 
