How to minimize a simple quadratic function of multiple variables?

In summary, the thread is about minimizing a function of multiple variables, specifically approximating a matrix by the outer product of two vectors. The goal is to find a vector y and a vector x that minimize the sum of squared differences between their outer product and the given matrix. The poster can solve this with gradient descent, but is looking for an analytical solution and is unsure how to approach the problem because of the number of variables. They compare it to linear regression and suggest looking into that method for guidance.
  • #1
darwid
Hi everybody,

I'm trying to minimize a function of multiple variables. My goal is to approximate, in the L2 norm, a matrix by the outer product of two vectors (or is that called a tensor product?).

So I have to determine a vector y = (y_1, ..., y_n) and a vector x = (x_1, ..., x_m) such that their outer product approximates a given matrix A = (a_{i,j}), i = 1..n, j = 1..m.

What I want to minimize is thus:
[tex]s = \sum_{i,j} \left( y_i x_j - a_{i,j} \right)^2[/tex]

Obviously I can solve this using gradient descent, and it works.
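For reference, here is a rough NumPy sketch of the kind of gradient descent I mean (the starting point, step size, and iteration count are arbitrary, untuned choices):

[code]
import numpy as np

def rank1_gradient_descent(A, lr=1e-3, steps=5000):
    """Approximate A (n x m) by the outer product of y (length n) and x (length m)."""
    n, m = A.shape
    rng = np.random.default_rng(0)
    y = rng.normal(size=n)
    x = rng.normal(size=m)
    for _ in range(steps):
        R = np.outer(y, x) - A      # residual: R[i, j] = y_i x_j - a_{i,j}
        grad_y = 2.0 * R @ x        # ds/dy_i = 2 * sum_j (y_i x_j - a_{i,j}) x_j
        grad_x = 2.0 * R.T @ y      # ds/dx_j = 2 * sum_i (y_i x_j - a_{i,j}) y_i
        y -= lr * grad_y
        x -= lr * grad_x
    return y, x

# Example usage on a random matrix:
A = np.random.default_rng(1).normal(size=(5, 4))
y, x = rank1_gradient_descent(A)
print(np.sum((np.outer(y, x) - A) ** 2))  # value of s at the result
[/code]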

But what I'm looking for is an analytical solution. The formulation looks simple, so I expect there must be some analytical way of solving it; I just don't really know how to approach the problem with so many variables.
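One thing I did notice (maybe this is where the comparison to linear regression comes in): if x is held fixed, the objective is an ordinary least-squares problem in y, so setting the partial derivatives to zero gives a closed form for each y_i, and symmetrically for x with y fixed:

[tex]\frac{\partial s}{\partial y_i} = 2\sum_j \left( y_i x_j - a_{i,j} \right) x_j = 0 \;\Rightarrow\; y_i = \frac{\sum_j a_{i,j} x_j}{\sum_j x_j^2}, \qquad x_j = \frac{\sum_i a_{i,j} y_i}{\sum_i y_i^2}[/tex]

But these two conditions still couple x and y, so this alone isn't a full closed-form answer.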

--
Darwid
 