Hi there,

I'm working on a problem right now that relates to least squares error estimate for polynomial fitting.

I'm aware of techniques (iterative formulas) for finding the coefficients of a polynomial that minimize the squared error over a data set. For example, for a data set that I believe follows a first-order polynomial, y(x) = a0 + a1*x, I have a method for finding the coefficients a0 and a1.

However, what if I were to make an assumption about one of the coefficients? Specifically, let's say I want a0 = 0. The question then becomes: find y(x) = a1*x such that a1 minimizes the squared error.

How would I go about solving this? The only method I know solves for both a0 and a1, yet here I'm fixing a0 = 0 and want to solve only for a1.
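For the specific a0 = 0 case, setting the derivative of the squared error E(a1) = Σ(yᵢ − a1·xᵢ)² to zero gives a closed form, a1 = Σxᵢyᵢ / Σxᵢ². A minimal sketch (the function name is illustrative, not from the thread):

```python
def fit_through_origin(xs, ys):
    """Least-squares slope for y = a1*x with the intercept fixed at 0.

    Setting dE/da1 = 0 for E = sum((y_i - a1*x_i)**2) yields
    a1 = sum(x_i * y_i) / sum(x_i**2).
    """
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / sxx
```

For a perfectly linear data set through the origin, e.g. xs = [1, 2, 3] and ys = [2, 4, 6], this returns the exact slope 2.0.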

Does anyone know how to solve this problem, or whether there is a general method for it? I believe this is really a calculus of variations problem, and I simply don't know the math.

TLDR - What is the method for solving for polynomial coefficients (in a least-squares sense) while being able to fix certain coefficients in advance?
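One standard approach (a sketch under the assumption that NumPy is acceptable; the function name and `fixed` dict are illustrative) is to move the fixed terms to the left-hand side and solve the reduced least-squares problem for the remaining coefficients:

```python
import numpy as np

def fit_with_fixed(x, y, degree, fixed):
    """Least-squares polynomial fit with some coefficients fixed.

    fixed: dict mapping power -> value, e.g. {0: 0.0} forces a0 = 0.
    The contribution of the fixed terms is subtracted from y, and the
    free coefficients are found from the reduced least-squares problem.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    free = [k for k in range(degree + 1) if k not in fixed]
    # Move the known terms to the left-hand side.
    r = y - sum(v * x**k for k, v in fixed.items())
    # Design matrix restricted to the free powers of x.
    A = np.column_stack([x**k for k in free])
    coef_free, *_ = np.linalg.lstsq(A, r, rcond=None)
    coefs = np.zeros(degree + 1)
    for k, v in fixed.items():
        coefs[k] = v
    coefs[free] = coef_free
    return coefs  # coefs[k] is the coefficient of x**k
```

With `fixed={0: 0.0}` and degree 1 this reduces to the a1 = Σxᵢyᵢ/Σxᵢ² case; the same routine handles any polynomial degree and any subset of fixed coefficients.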

**Physics Forums - The Fusion of Science and Community**


# A Solving polynomial coefficients to minimize square error


