Solving Linear Least Squares

  1. Jul 11, 2007 #1
    Hi there,

    Could someone please tell me how I can solve this linear least squares problem.

    [tex]x^*, w^* = \arg\min_{x,\,w} \| Gx - Mw \|_2^2[/tex]

    subject to [tex]v_k = c_k, \quad k = 1, \dots, p[/tex]

    [tex]v_k[/tex] is a vertex (an entry of x)
    [tex]c_k[/tex] is a given constant

    Here x and w are the unknowns.

    G - a 9m x 3n matrix; the coefficients here are known
    x - a 3n x 1 vector of unknown vertices
    M - a 9m x 4 matrix; the values are known here as well
    w - a 4 x 1 vector of unknown values

    I have to solve for x and w. Any help would be really appreciated.
  3. Jul 11, 2007 #2
    How are v and c related to the problem?
  4. Jul 11, 2007 #3
    [tex]x = (v^1, v^2, \dots, v^k)[/tex]
    c = some constant values; basically, some entries of x that we don't want to solve for, in other words we want them to remain constant.
    Last edited: Jul 11, 2007
  5. Jul 11, 2007 #4
    As I see it, you can partition G based on the components of x that are fixed, so

    [tex]Gx = [G_1 \;\; G_2]\begin{bmatrix} v \\ x_2 \end{bmatrix} = G_1 v + G_2 x_2.[/tex]

    Now [tex]p = G_1 v[/tex] is a constant vector and the problem becomes

    [tex]\min_{x_2,\, w} \left\| [G_2 \;\; -M]\begin{bmatrix} x_2 \\ w \end{bmatrix} + p \right\|_2,[/tex]

    which is a standard least squares problem with right-hand side [tex]-p[/tex].

    Hope the above makes sense. The multiplying vectors in all the above expressions are column vectors, though I have left out the transposes etc.
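
    If it helps, here is a minimal numpy sketch of this reduction; the names fixed_idx and c (the indices and values of the constrained entries of x) are just placeholders for however you store them:

    [code]
import numpy as np

def solve_constrained_ls(G, M, fixed_idx, c):
    """Minimize ||G x - M w||_2^2 with x[fixed_idx] = c held fixed."""
    # fixed_idx, c: placeholder names for the constrained indices/values of x.
    n_x = G.shape[1]
    free_idx = np.setdiff1d(np.arange(n_x), fixed_idx)

    G1 = G[:, fixed_idx]     # columns multiplying the fixed entries of x
    G2 = G[:, free_idx]      # columns multiplying the free entries of x

    # G x - M w = G1 c + G2 x_free - M w, so move the constant part
    # to the right-hand side and stack the remaining unknowns.
    A = np.hstack([G2, -M])  # unknown z = [x_free; w]
    b = -G1 @ c

    z, *_ = np.linalg.lstsq(A, b, rcond=None)

    x = np.empty(n_x)
    x[fixed_idx] = c
    x[free_idx] = z[:free_idx.size]
    w = z[free_idx.size:]
    return x, w
    [/code]

    np.linalg.lstsq also returns the minimum-norm solution if the stacked matrix happens to be rank deficient.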
  6. Jul 12, 2007 #5
    Hmmm, I don't understand what you mean here. The thing is, forget about the constants and then solve the equation. Assuming, let's say, there are no constants, then how do you solve the equation? Technically it should give me two normal equations.
  7. Jul 12, 2007 #6
    If you forget about constants then there is nothing stopping me from choosing x=w=0 and the minimum of the function is 0.

    Why do you say we should get two normal equations?

    Also, what in particular didn't make sense in the previous post?
  8. Jul 12, 2007 #7
    Why choose zero for w and x? We have to minimize the expression over w and x, and as far as I know the way to do that is to take the derivative with respect to each unknown vector, set it equal to zero, and then solve the resulting equations. This should give two normal equations: first take the derivative with respect to x and set it equal to zero, then take the derivative with respect to w and set it equal to zero. That's what I think.

    I was actually looking for an answer for the generalized case. That is why I am saying that there could be arbitrary values for G and M, and the values of w and x are to be found, so you cannot just assume that they have zero value.
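
    Written out, the two normal equations I mean would come from setting the gradients of the objective to zero (ignoring the constraints for the moment):

    [tex]\nabla_x \| Gx - Mw \|_2^2 = 2 G^T (Gx - Mw) = 0[/tex]

    [tex]\nabla_w \| Gx - Mw \|_2^2 = -2 M^T (Gx - Mw) = 0[/tex]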
  9. Jul 12, 2007 #8
    What I was trying to say was that the function to be minimized is the 2-norm of a vector, so its smallest possible value is zero. Now if we do not have the constant-component constraints, then we can choose x = w = 0 and achieve this minimum. So (0, 0) is our solution.

    Now if we do have the constant constraints, then the solution posted above seems to be completely general. The only thing I have done is take the free components of x (which I called x_2) together with w and make them a single vector of a higher dimension. The problem then reduces to a standard least squares in this higher-dimensional space.
    Last edited: Jul 12, 2007
  10. Jul 12, 2007 #9
    Basically, I am saying that the standard least squares problem

    min || Ax - b || can also be written as

    min || A1 x1 + A2 x2 - b ||, where x1 and x2 are obtained by partitioning the vector x, and A1 and A2 are the corresponding partitions of the columns of A.
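
    Just to illustrate, here is a quick numerical check of that column-partition identity in numpy (made-up sizes, split point chosen arbitrarily):

    [code]
import numpy as np

# Made-up sizes, just for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))
x = rng.standard_normal(5)

# Partition the columns of A and the entries of x at the same split point.
A1, A2 = A[:, :2], A[:, 2:]
x1, x2 = x[:2], x[2:]

# A x = A1 x1 + A2 x2, so fixing x1 just moves A1 x1 into the right-hand side.
assert np.allclose(A @ x, A1 @ x1 + A2 @ x2)
    [/code]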