
#1
Mar 28, 2012, 08:50 PM

P: 312

I have two matrices, A and D, with the same number of rows and different numbers of columns (A has many more columns than D). I want to find x and y such that [itex]\|Ax - Dy\|_2[/itex] is minimized, i.e., I want to find the closest vectors in span{A} and span{D}. It seems like a simple problem, but I couldn't figure it out. Any suggestions? (The columns of A and D are together linearly independent, so span{A} and span{D} intersect only in the zero vector.)




#2
Mar 28, 2012, 10:25 PM

Sci Advisor
P: 3,177

First, let's try to state the problem clearly. Your statement about finding 'x' and 'y' isn't clear because it isn't clear whether "Ax" is supposed to represent a column vector or the matrix "A" times a vector "x".
We could try it this way first: I have two sets of n-dimensional vectors, A and D. Set A has greater cardinality than set D. The span of set A and the span of set D are vector spaces whose only intersection is the zero vector. How do I find vectors x and y such that x is in the span of A, y is in the span of D, and the distance between them (i.e. [itex]\|x - y\|_2[/itex]) is minimal? The answer, of course, is to set both x and y equal to the zero vector. Assuming that's not what you want to do, how do we modify the statement of the problem to say what you want?



#3
Mar 29, 2012, 12:33 PM

P: 312

Thank you for correcting the problem statement. Following your formulation, what I want to minimize is the angle between x and y, i.e., maximize [itex]\frac{\langle x,y\rangle}{\|x\|_2\|y\|_2}[/itex], and I don't want the trivial zero solution. Where do I go from here?




#4
Mar 30, 2012, 03:28 PM

Sci Advisor
P: 3,177

My intuition is that if you have two vector subspaces that intersect only at the zero vector, then you should be able to find a set of vectors [itex]\{e_1, e_2, \ldots, e_n, f_1, f_2, \ldots, f_m\}[/itex] such that this set is a (non-orthogonal) basis for the parent (n+m)-dimensional space, the [itex]e_i[/itex] are an orthonormal basis for the first subspace, and the [itex]f_j[/itex] are an orthonormal basis for the second subspace. If that intuition is correct, let [itex]\hat{x} = \sum_1^n \alpha_i e_i[/itex] and [itex]\hat{y} = \sum_1^m \beta_j f_j[/itex], and let [itex]c_{i,j} = \langle e_i, f_j \rangle[/itex]. The problem is then to maximize the function [itex]\sum_{i=1}^n \sum_{j=1}^m c_{i,j} \alpha_i \beta_j[/itex] subject to the constraints [itex]\sum_1^n \alpha_i^2 = 1[/itex] and [itex]\sum_1^m \beta_j^2 = 1[/itex]. I wonder if there is a simpler formulation.
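This formulation can be checked numerically; the sketch below (assuming NumPy; the dimensions, seed, and variable names are illustrative, not from the thread) builds the bases via QR and shows that the constrained maximum is the largest singular value of the matrix C of inner products, which is where the later posts end up:

```python
# A numerical sketch of the constrained maximization in post #4.
import numpy as np

rng = np.random.default_rng(0)
n, m, dim = 2, 3, 6

# Orthonormal bases e_i, f_j for two random subspaces of R^dim.
E, _ = np.linalg.qr(rng.standard_normal((dim, n)))   # columns are the e_i
F, _ = np.linalg.qr(rng.standard_normal((dim, m)))   # columns are the f_j

C = E.T @ F                                          # c_ij = <e_i, f_j>

# The maximum of sum_ij c_ij alpha_i beta_j over unit alpha, beta is
# attained at the top singular vectors of C and equals its top singular value.
U, S, Vt = np.linalg.svd(C)
alpha, beta = U[:, 0], Vt[0]
attained = alpha @ C @ beta

# Sanity check: no random unit pair (alpha, beta) beats it.
best_random = max(
    abs((a / np.linalg.norm(a)) @ C @ (b / np.linalg.norm(b)))
    for a, b in ((rng.standard_normal(n), rng.standard_normal(m))
                 for _ in range(1000))
)
print(attained, S[0], best_random)
```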



#5
Mar 30, 2012, 09:35 PM

P: 312

It seems that
[tex] \langle x, y \rangle = (Aa)'(Db) = a'A'Db = a'U'SVb = (Ua)'S(Vb) [/tex]
where [itex]A'D = U'SV[/itex] is the SVD. Since both U and V are orthogonal, the minimum angle occurs at the largest singular value in S. Does that sound right?
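A small numerical check of this claim (assuming NumPy; note it requires A and D to first be reduced to orthonormal column bases, so that [itex]\|Aa\|_2 = \|a\|_2[/itex] and the unit-norm constraints on x and y translate to unit-norm a and b):

```python
# Verify: the largest singular value of Qa'Qd is the maximal cosine,
# i.e. the cosine of the smallest principal angle between the spans.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 4))
D = rng.standard_normal((8, 2))

Qa, _ = np.linalg.qr(A)   # orthonormal basis for span{A}
Qd, _ = np.linalg.qr(D)   # orthonormal basis for span{D}

U, S, Vt = np.linalg.svd(Qa.T @ Qd)
cos_min_angle = S[0]

# The maximizing unit vectors come from the top singular vectors.
x = Qa @ U[:, 0]
y = Qd @ Vt[0]
print(cos_min_angle, x @ y)  # these agree
```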




#10
Mar 31, 2012, 03:58 AM

Sci Advisor
P: 3,177

If [itex] A = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} [/itex] and [itex] D = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix} [/itex], then [itex] A'D = \begin{pmatrix} 1 & 1 \end{pmatrix} [/itex]. [itex] A'D [/itex] is equal to the same thing if [itex] A = \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix} [/itex] and [itex] D = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix} [/itex].
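The point here, checked numerically below (a minimal sketch assuming NumPy): two different A's give the same A'D, so A'D alone cannot determine the angle unless the columns are normalized first.

```python
# Two different matrices A with identical A'D.
import numpy as np

A1 = np.array([[1.0], [1.0], [1.0]])
A2 = np.array([[1.0], [1.0], [2.0]])
D  = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])

print(A1.T @ D)  # [[1. 1.]]
print(A2.T @ D)  # [[1. 1.]]
```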




#12
Mar 31, 2012, 10:46 PM

Sci Advisor
P: 3,177

In these two examples, do we have the same matrix for A'D but different answers for the maximum angle? (My 4D intuition isn't good, so I'm not sure.) Example 1: [itex] A = \begin{pmatrix} \frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}}\\ \frac{2}{\sqrt{15}}\\ \frac{1}{\sqrt{15}} \end{pmatrix} ,\ D = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} [/itex] Example 2: [itex] A = \begin{pmatrix} \frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}}\\ \frac{1}{\sqrt{6}}\\ \frac{1}{\sqrt{6}} \end{pmatrix} ,\ D = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} [/itex] 
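For what it's worth, this can be checked numerically (a sketch assuming NumPy; `max_cos` is a hypothetical helper name, not from the thread). Since each A here already has unit norm and D has orthonormal columns, both examples give the same A'D and hence the same maximal cosine, [itex]\sqrt{2/3}[/itex]:

```python
# Compare the maximal cosine (smallest principal angle) for the two examples.
import math
import numpy as np

s3, s15, s6 = math.sqrt(3), math.sqrt(15), math.sqrt(6)
A1 = np.array([1/s3, 1/s3, 2/s15, 1/s15]).reshape(4, 1)
A2 = np.array([1/s3, 1/s3, 1/s6, 1/s6]).reshape(4, 1)
D  = np.array([[1.0, 0], [0, 1], [0, 0], [0, 0]])

def max_cos(A, D):
    """Largest singular value of Qa'Qd = cosine of smallest principal angle."""
    Qa, _ = np.linalg.qr(A)
    Qd, _ = np.linalg.qr(D)
    return np.linalg.svd(Qa.T @ Qd, compute_uv=False)[0]

print(max_cos(A1, D), max_cos(A2, D))  # both sqrt(2/3)
```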




#14
Apr 3, 2012, 11:23 PM

Sci Advisor
P: 3,177

[itex] A'D = \begin{pmatrix} 1 \end{pmatrix} \begin{pmatrix} \frac{\sqrt{2}}{\sqrt{3}} & 0 \end{pmatrix} \begin{pmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \end{pmatrix} [/itex] You have shown that [itex]\langle x,y\rangle[/itex] is determined by A'D. This isn't a result I have seen before. You haven't explained why the maximum possible [itex]\langle x,y\rangle[/itex] is equal to the largest singular value, or why vectors a and b must exist that produce this value. (Are we maximizing [itex]\langle x,y\rangle[/itex] or maximizing the absolute value of [itex]\langle x,y\rangle[/itex]?)
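One way to see the singular-value claim, assuming the columns of A and D are orthonormal so that [itex]\|Aa\|_2 = \|a\|_2 = \|Ua\|_2[/itex] and likewise for D: writing [itex]u = Ua,\ v = Vb[/itex],

[tex] \left| \langle x, y \rangle \right| = \left| (Ua)'S(Vb) \right| = \Big| \sum_i \sigma_i u_i v_i \Big| \le \sigma_1 \sum_i |u_i||v_i| \le \sigma_1 \|u\|_2 \|v\|_2 = \sigma_1 \|a\|_2 \|b\|_2 [/tex]

by Cauchy–Schwarz, with equality when u and v are both the first standard basis vector, i.e. [itex]a = U'e_1,\ b = V'e_1[/itex]. So the maximum of [itex]|\langle x,y\rangle|[/itex] over unit-norm x and y is the largest singular value [itex]\sigma_1[/itex], and it is attained.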



#15
Apr 3, 2012, 11:43 PM

P: 4,570

Since you want to minimize [itex]\|Ax - Dy\|_2[/itex], just minimize [itex]\langle Ax - Dy, Ax - Dy \rangle[/itex], which is [itex]\langle Ax, Ax \rangle - 2\langle Ax, Dy \rangle + \langle Dy, Dy \rangle[/itex]. Now if x and y are vectors, Ax is linear in the components of x, as Dy is in those of y, so the whole thing is a quadratic expression in those components. Also, minimizing the square of the norm is equivalent to minimizing the norm itself, since squaring is strictly increasing on the nonnegative reals and the norm is always greater than or equal to zero.
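The expansion above is just the polarization of the squared norm; a quick numerical check (a sketch assuming NumPy, with arbitrary illustrative dimensions):

```python
# Verify <Ax-Dy, Ax-Dy> = <Ax,Ax> - 2<Ax,Dy> + <Dy,Dy> on random data.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 3))
D = rng.standard_normal((6, 2))
x = rng.standard_normal(3)
y = rng.standard_normal(2)

r = A @ x - D @ y
lhs = r @ r
rhs = (A @ x) @ (A @ x) - 2 * (A @ x) @ (D @ y) + (D @ y) @ (D @ y)
print(lhs, rhs)  # equal up to rounding
```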

