Linear Regression: Pros and cons of Normal vs. simplified methods?

  • #1
BrownishMonst
I'm currently looking at a linear regression handout from Uni, and it gives two methods to calculate the regression equation. The normal one is to find a and b for y = a + bx; the equations for a and b are given in the handout, but I'll assume you're familiar with them. The simplified one uses

[itex]y = Bx + (\overline{y} - B\overline{x})[/itex]​

The two produce slightly different values, and I assume the normal one is more precise than the simplified one.

What I'd like to know is which one would be best to use when? Is the difference negligible?

Thanks, and sorry if I've posted this in the wrong forum.
 

Answers and Replies

  • #2
Number Nine
The two are mathematically identical, so the difference is likely roundoff error due to the difference in computation.

[itex]y = Bx + (\overline{y} - B\overline{x})[/itex]
[itex]y = Bx + A[/itex]
[itex]y = A + Bx[/itex]

Note that the standard least squares estimates are the best linear unbiased estimates (as well as the maximum likelihood estimates), provided that the model assumptions are met. In this case, you would rarely want to use anything else.
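To see this numerically, here is a minimal sketch (Python, with made-up data) that computes the intercept both ways: once from the textbook normal-equation formula a = (Σy·Σx² − Σx·Σxy) / (nΣx² − (Σx)²), and once as A = ȳ − Bx̄. Any difference between the two is pure floating-point roundoff.

```python
# Hypothetical data for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Least-squares slope B = sum((x - x̄)(y - ȳ)) / sum((x - x̄)^2).
B = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
    / sum((x - x_bar) ** 2 for x in xs)

# "Normal" intercept formula, using the raw sums.
Sx = sum(xs)
Sy = sum(ys)
Sxx = sum(x * x for x in xs)
Sxy = sum(x * y for x, y in zip(xs, ys))
a_normal = (Sy * Sxx - Sx * Sxy) / (n * Sxx - Sx ** 2)

# "Simplified" intercept: A = ȳ - B·x̄.
a_simple = y_bar - B * x_bar

print("B        =", B)
print("a_normal =", a_normal)
print("a_simple =", a_simple)
```

The two intercepts agree to machine precision, confirming that the "simplified" formula is just the normal one rearranged rather than an approximation.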
 
