Least Squares Estimator for Matrices: Bill's Problem

I recently came across the following interesting problem.

Suppose A = BC, where A, B, and C are matrices. We know a ton of A's and their corresponding C's. We want the least squares estimator of B.

When A and C are vectors the solution is well known.

But what is the solution when they are matrices?

Thanks
Bill
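
For reference, a sketch of the well-known vector-case solution alluded to above, written with ##a_i## and ##c_i## as column vectors and ##N## observed pairs (the notation is mine, and it assumes the second sum below is invertible):
$$\hat B = \Big(\sum_{i=1}^{N} a_i c_i^{\mathsf T}\Big)\Big(\sum_{i=1}^{N} c_i c_i^{\mathsf T}\Big)^{-1}.$$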
 
How do you intend to define the "error" between the observed and predicted values? Until that is defined, "least squares" doesn't describe a specific criterion.
 
The matrix L2 norm, i.e. the ##B## that minimises ##\sum_i \|A_i - BC_i\|^2##, where the ##(A_i, C_i)## are the known ##A##, ##C## matrix pairs. By matrix L2 norm I mean the generalisation of the usual vector norm, i.e. the square root of the sum of the squares of the matrix elements (the Frobenius norm).

It grew out of the following paper:
http://www.cv-foundation.org/openac...t_Direct_Super-Resolution_2013_ICCV_paper.pdf

See equation 2.

Thanks
Bill
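
A minimal sketch of that objective in NumPy (the function name and the list-of-arrays representation are my own, not from the thread):

```python
import numpy as np

def objective(B, A_list, C_list):
    # sum_i ||A_i - B C_i||_F^2 with the Frobenius (entrywise L2) norm
    return sum(np.linalg.norm(A - B @ C, 'fro') ** 2
               for A, C in zip(A_list, C_list))
```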
 
Isn't there an equivalent of a perpendicular projection operator in your space of matrices? If this space is a Hilbert space, then, AFAIK, the general solution to this problem is the orthogonal projection of B onto the subspace spanned by A, C.
 
WWGD said:
Isn't there an equivalent of a perpendicular projection operator in your space of matrices? If this space is a Hilbert space, then, AFAIK, the general solution to this problem is the orthogonal projection of B onto the subspace spanned by A, C.

Yes, there is - it's the trace. I will think about that one.

Thanks
Bill
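
For completeness, the inner product being referred to is presumably the Frobenius (trace) inner product, which does make the space of real matrices a Hilbert space:
$$\langle X, Y \rangle = \operatorname{tr}(X^{\mathsf T} Y) = \sum_{j,k} X_{jk} Y_{jk}, \qquad \|X\|^2 = \langle X, X \rangle.$$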
 
Hi Guys

Thanks for all the help.

Finally nutted it out. As usual I was on the wrong track. It's simply a matter of blocking the problem and reducing it to a number of ordinary least squares problems. Break ##B## into rows ##B_j##; then, with ##A_{ji}## the ##j##-th row of ##A_i##, you get the usual least squares problems ##\sum_i \|A_{ji} - B_j C_i\|^2##, one for each row. The overall minimum is obtained by minimising each of these separate problems independently.

Thanks
Bill
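
A minimal sketch of this row-wise reduction in NumPy (the names, shapes, and synthetic check are my own; since every row of ##B## sees the same design matrix, all rows can be solved in a single stacked least squares call):

```python
import numpy as np

def estimate_B(A_list, C_list):
    """Minimise sum_i ||A_i - B C_i||_F^2 over B.

    Each A_i has shape (n, m), each C_i has shape (k, m); B has shape (n, k).
    Row j of B only enters the terms containing row j of each A_i, so the
    problem splits into ordinary least squares problems that share one design
    matrix: stack the C_i^T and A_i^T vertically and solve once.
    """
    X = np.vstack([C.T for C in C_list])   # (N*m, k) design matrix
    Y = np.vstack([A.T for A in A_list])   # (N*m, n) responses, one column per row of B
    Bt, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return Bt.T                            # (n, k)

# Quick sanity check on noise-free synthetic data (shapes chosen arbitrarily)
rng = np.random.default_rng(0)
B_true = rng.standard_normal((4, 3))
C_list = [rng.standard_normal((3, 5)) for _ in range(20)]
A_list = [B_true @ C for C in C_list]
print(np.allclose(estimate_B(A_list, C_list), B_true))  # True
```

Equivalently, setting the gradient of the objective to zero gives the closed form ##\hat B = \big(\sum_i A_i C_i^{\mathsf T}\big)\big(\sum_i C_i C_i^{\mathsf T}\big)^{-1}##, provided the second sum is invertible.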
 