Optimizing Linear Least Squares with Multiple Unknowns: A Partitioned Approach

slylock
Hi there,

Could someone please tell me how I can solve this linear least squares problem:

x*,w* = argmin_{x,w}|| Gx - Mw ||^{2}_{2}

subject to v_{k} = c_{k}; k = 1 ... p

v_{k} is the k-th vertex and c_{k} is a given constant.

Here x and w are the unknowns.

G - a 9m x 3n matrix; the coefficients here are known
x - a 3n x 1 vector of unknown vertices
M - a 9m x 4 matrix; the values here are known as well
w - a 4 x 1 vector of unknown values

I have to solve for x and w. Any help would be really appreciated.
Thanks
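
For concreteness, here is a minimal NumPy sketch of just the objective; the sizes m, n and the random G, M, x, w below are illustrative placeholders, not values from my actual application:

import numpy as np

m, n = 5, 4                               # made-up sizes for illustration
rng = np.random.default_rng(0)

G = rng.standard_normal((9 * m, 3 * n))   # known coefficients, 9m x 3n
M = rng.standard_normal((9 * m, 4))       # known values, 9m x 4
x = rng.standard_normal(3 * n)            # trial value for the unknown vertices
w = rng.standard_normal(4)                # trial value for the unknown w

# The objective to be minimized over x and w: || Gx - Mw ||^{2}_{2}
print(np.linalg.norm(G @ x - M @ w) ** 2)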
 
How are v and c related to the problem?
 
x = [v^{1}; v^{2}; ...; v^{n}], i.e. the vertices stacked into one column vector.
c = some constant values: basically components of x that we don't want to solve for; in other words, we want them to remain constant.
 
As I see it, you can partition G based on the components of x that are fixed, so

Gx = [G_1 G_2][v; x_2] = G_1 v + G_2 x_2,

where v collects the fixed components of x and x_2 the free ones. Now p = G_1 v is a constant vector and the problem becomes

min || [G_2 -M][x_2; w] + p || ... which is a standard least squares problem with right-hand side -p.

Hope the above makes sense. The multiplying vectors in all the above expressions are column vectors, though I have left out the transposes, etc.
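
If it helps, here is a minimal NumPy sketch of that partitioning; the function name solve_partitioned and its inputs fixed_idx (indices of the fixed entries of x) and c (their prescribed values) are my own placeholders, not anything from the original problem:

import numpy as np

def solve_partitioned(G, M, fixed_idx, c):
    """Minimize ||G x - M w||_2 with x[fixed_idx] = c held fixed."""
    n = G.shape[1]
    fixed_idx = np.asarray(fixed_idx)
    free_idx = np.setdiff1d(np.arange(n), fixed_idx)

    G1, G2 = G[:, fixed_idx], G[:, free_idx]   # column partition of G
    A = np.hstack([G2, -M])                    # stacked coefficient matrix
    b = -G1 @ c                                # constant part moved to the right-hand side

    z, *_ = np.linalg.lstsq(A, b, rcond=None)  # one standard least squares solve
    x2, w = z[:free_idx.size], z[free_idx.size:]

    x = np.empty(n)                            # reassemble the full x
    x[fixed_idx], x[free_idx] = c, x2
    return x, w

Note that with fixed_idx empty, b is the zero vector and lstsq returns z = 0, which is exactly the degenerate case discussed below.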
 
Hmmm, I don't understand what you mean here. The thing is, forget about the constants and then solve the equation. Assuming, let's say, there are no constants, then how do you solve the equation? Technically it should give me two normal equations.
 
If you forget about the constants, then there is nothing stopping me from choosing x = w = 0, and the minimum of the function is 0.

Why do you say we should get two normal equations?

Also, what in particular didn't make sense in the previous post?
 
Why choose zero for w and x? We have to minimize the objective over w and x, and as far as I know the way to do that is to take the derivative with respect to each unknown vector, set it equal to zero, and then solve. This should give two normal equations: first take the derivative with respect to x and set it equal to zero, then take the derivative with respect to w and set it equal to zero. That's what I think.
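
Written out, the two conditions I mean are

d/dx || Gx - Mw ||^{2}_{2} = 0  =>  G^{T}(Gx - Mw) = 0

d/dw || Gx - Mw ||^{2}_{2} = 0  =>  -M^{T}(Gx - Mw) = 0

(Stacked together, with A = [G -M] and z = [x; w], these are just A^{T}A z = 0, the normal equations of a single system.)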

I was actually looking for an answer to the general case. That is why I am saying that G and M could take arbitrary values, and the values of w and x are to be found, so you cannot just assume that they are zero.
 
What I was trying to say is that the function to be minimized is the 2-norm of a vector, so its smallest possible value is zero. If we do not have the constant-component constraints, then we can choose x = w = 0 and achieve this minimum, so (0, 0) is our solution.

Now, if we do have the constant constraints, then the solution posted above seems to be completely general. The only thing I have done is take the variable components of x (call them x_2) together with w and make them a single vector of higher dimension. The problem then reduces to a standard least squares problem in this higher-dimensional space.
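
As a quick numerical check of that reduction (made-up sizes and random data again), least squares optimality says the residual of the stacked solve must be orthogonal to the columns of both G_2 and M, i.e. both gradients vanish:

import numpy as np

rng = np.random.default_rng(1)
m, n = 5, 4
G = rng.standard_normal((9 * m, 3 * n))
M = rng.standard_normal((9 * m, 4))

fixed_idx = np.array([0, 3, 7])              # hypothetical fixed entries of x
free_idx = np.setdiff1d(np.arange(3 * n), fixed_idx)
c = rng.standard_normal(fixed_idx.size)      # their prescribed constant values

# One standard least squares solve in the stacked (higher-dimensional) space.
A = np.hstack([G[:, free_idx], -M])
z, *_ = np.linalg.lstsq(A, -G[:, fixed_idx] @ c, rcond=None)

x = np.empty(3 * n)
x[fixed_idx], x[free_idx] = c, z[:free_idx.size]
w = z[free_idx.size:]

r = G @ x - M @ w                            # residual of the original objective
print(np.allclose(G[:, free_idx].T @ r, 0))  # True: gradient w.r.t. x_2 vanishes
print(np.allclose(M.T @ r, 0))               # True: gradient w.r.t. w vanishes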
 
Basically I am saying that the standard least squares problem

min || Ax - b ||

can also be written as

min || A_1 x_1 + A_2 x_2 - b ||,

where x_1 and x_2 are obtained by partitioning the vector x, and A_1 and A_2 are the corresponding partitions of the columns of A.
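
A tiny NumPy check of that identity (the sizes and the split point are my own illustrative choices):

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 5))
x = rng.standard_normal(5)

A1, A2 = A[:, :2], A[:, 2:]   # partition the columns of A
x1, x2 = x[:2], x[2:]         # partition x the same way

print(np.allclose(A @ x, A1 @ x1 + A2 @ x2))  # True: A x = A_1 x_1 + A_2 x_2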
 