Forums
Mathematics
General Math
Optimizing Regression Degree with Weighted Cost Function
[QUOTE="andrewcheong, post: 1679195, member: 34083"]
Hello, all. I know what I want, but I don't know what it's called. This has to do with regression (polynomial fits).

Given a set of N (x,y) points, we can compute a regression of degree K. For example, we could have a hundred (x,y) points and compute a linear regression (degree 1). Of course, there would be residual error, because the line of best fit won't pass through every point perfectly. We could also compute a quadratic (degree 2) or higher-degree regression. This should reduce the residual error, or at least be no worse a fit than the lower-degree regressions.

Now, what I want is a regression that determines the "best degree". If I have N points, I can always get a perfect fit by computing a regression of degree N-1. For example, if I have only 2 points, a degree-1 regression (linear) can fit both points perfectly. If I have only 3 points, a degree-2 regression (quadratic) can fit all three points perfectly, and so on. So if I have 100 points, one might say that a degree-99 regression is the "best degree". However, I view higher degrees as a cost. I want a method that finds a regression balancing low residual error against low degree. I imagine there must be some sort of "cost" parameter that I have to set, because the computer alone cannot say what the "right" balance between residual error and degree is. Can anyone point me to the name of such a technique? Perhaps the most commonly used form of it?

I want to apply this to stock market prices. As human beings, we can look at a plot of stock prices and mentally "fit" a smooth curve across the points that makes sense. But how does a computer do this? We can't just tell it to do a perfect fit, because then it'll do a degree-(N-1) fit (e.g. cubic B-splines).

Thanks in advance!
[/QUOTE]
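A minimal sketch of the penalized selection the post describes: fit every degree up to some maximum and pick the one minimizing residual error plus a degree penalty. The weight `lam` is the assumed user-set "cost" parameter the poster mentions, not anything standard; this is just one way to realize the idea.

```python
import numpy as np

def pick_degree(x, y, max_degree, lam):
    """Fit polynomials of degree 1..max_degree and return the degree
    minimizing (mean squared residual error) + lam * degree."""
    best_deg, best_cost = None, float("inf")
    for deg in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, deg)            # least-squares fit of this degree
        residuals = y - np.polyval(coeffs, x)     # pointwise fit errors
        mse = np.mean(residuals ** 2)
        cost = mse + lam * deg                    # weighted trade-off: error vs. complexity
        if cost < best_cost:
            best_deg, best_cost = deg, cost
    return best_deg

# Noisy quadratic data: with a moderate penalty, degree 2 should be chosen,
# even though higher degrees would fit the noise slightly better.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 100)
y = 3 * x**2 - x + 0.5 + rng.normal(0, 0.05, x.size)
print(pick_degree(x, y, max_degree=10, lam=0.01))
```

As `lam` shrinks toward zero the selection drifts toward the perfect-fit degree N-1, and as it grows the selection drifts toward a straight line, which is exactly the balance being asked about. (Standard names for principled versions of this penalty include AIC, BIC, and cross-validation.)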