Forums
Homework Help
Calculus and Beyond Homework Help
How Does the Least Squares Estimator Minimize Error in Linear Regression?
[QUOTE="andrewkirk, post: 5494932, member: 265790"]
There seems to be something odd about how this problem is stated. It asks the student to assume that ##\hat\beta## is the least squares [I]estimator[/I] of ##\beta## and then to use that to prove that it is the least squares [I]estimate[/I]. Are they trying to draw a distinction between estimator and estimate? If not, the problem is trivial. To be precise about the terminology, an [I]estimator[/I] is a function of the data, whereas an [I]estimate[/I] is the value that function returns for a particular data set. Is there some particular meaning of 'estimator' and 'estimate' that they are using in your course?

As to how to prove their formula: yes, substitution along the lines you mention sounds like a good way to start. You can rewrite ##Y-X\beta## as ##(Y-X\hat\beta)+X(\hat\beta-\beta)##. Expanding ##\|Y-X\beta\|^2## then gives the right-hand side they show above, plus the cross term
$$2(X(\hat\beta-\beta))^T(Y-X\hat\beta).$$
So this cross term needs to be shown to be zero. At first sight that looks impossible, since it is a function of the unknown parameter vector ##\beta##, which can be changed without changing any of the other elements in the formula (##X, Y, \hat\beta##). But in fact it vanishes identically: the cross term equals ##2(\hat\beta-\beta)^TX^T(Y-X\hat\beta)##, and the least squares estimator satisfies the normal equations ##X^T(Y-X\hat\beta)=0##, i.e. the residual ##Y-X\hat\beta## is orthogonal to the column space of ##X##. So the term is zero for [I]every[/I] ##\beta##, with no expectation operator or other constraining condition needed.
[/QUOTE]
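The orthogonality argument in the post can be checked numerically. Below is a minimal sketch (the design matrix ##X##, response ##Y##, and the arbitrary parameter vector `beta` are made-up random data, not anything from the thread): it computes the least squares estimator from the normal equations and confirms that the cross term ##2(X(\hat\beta-\beta))^T(Y-X\hat\beta)## is zero for an arbitrary ##\beta##.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))   # arbitrary design matrix (assumption: full column rank)
Y = rng.normal(size=10)        # arbitrary response vector

# Least squares estimator from the normal equations: (X^T X) beta_hat = X^T Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# The residual Y - X beta_hat is orthogonal to the columns of X
residual = Y - X @ beta_hat
print(np.allclose(X.T @ residual, 0))  # True

# Hence the cross term 2 (X(beta_hat - beta))^T (Y - X beta_hat)
# vanishes for ANY beta, not just in expectation
beta = rng.normal(size=3)              # arbitrary "true" parameter vector
cross = 2 * (X @ (beta_hat - beta)) @ residual
print(np.isclose(cross, 0.0))          # True
```

Repeating the last three lines with any other choice of `beta` gives the same result, which is exactly why the identity holds without an expectation operator.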