jostpuur
This is the problem: suppose A\in\mathbb{R}^{n\times k} is a matrix whose columns are linearly independent, with 1\leq k\leq n. I want to find a matrix B\in\mathbb{R}^{m\times n}, with n-k\leq m\leq n, whose elements can be computed nicely from those of A, and whose rows span the orthogonal complement of the k-dimensional space spanned by the columns of A. This means that
BA = 0 \in\mathbb{R}^{m\times k}
too.
There always exists a B\in\mathbb{R}^{(n-k)\times n} such that BA=0 and such that the rows of B are linearly independent, but its elements don't always seem to have nice formulas.
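For concreteness, here is a minimal numerical sketch of that existence claim (assuming NumPy is available): the rows of B can be taken to be the left singular vectors of A belonging to the zero singular values. This always produces a valid B, but it is a computation, not a closed formula.

```python
import numpy as np

def left_null_basis(A):
    """Return B of shape (n - k, n) with orthonormal rows and B @ A == 0.

    In the full SVD A = U @ diag(s) @ Vt, the last n - k columns of U
    are orthogonal to the column space of A, so their transposes form
    the rows of a valid B."""
    n, k = A.shape
    U, s, Vt = np.linalg.svd(A)   # full SVD: U is n x n
    return U[:, k:].T

A = np.random.randn(5, 2)         # columns independent with probability 1
B = left_null_basis(A)
print(np.allclose(B @ A, 0))      # True
```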
Sorry, my problem is not well defined, because I don't know what "computing nicely" means. But this is interesting problem anyway, I insist.
Easy example: n=2, k=1. Set
B = (-A_{21},\; A_{11}).
Then
B\left(\begin{array}{c} A_{11} \\ A_{21} \end{array}\right) = 0
Another easy example: n=3, k=2. Set
B = (A_{21}A_{32} - A_{31}A_{22},\; A_{31}A_{12} - A_{11}A_{32},\; A_{11}A_{22} - A_{21}A_{12}).
Then
B\left(\begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \\ A_{31} & A_{32} \end{array}\right) = (0,\,0)
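As a side note, this B is exactly the cross product of the two columns of A, so the n=3, k=2 case can be checked in one line (a sketch, assuming NumPy):

```python
import numpy as np

A = np.random.randn(3, 2)
B = np.cross(A[:, 0], A[:, 1])   # the component formulas above
print(np.allclose(B @ A, 0))     # True: B is orthogonal to both columns
```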
A difficult example! n=3, k=1. What do you do now? We would like to get this:
\left(\begin{array}{ccc} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{array}\right) \left(\begin{array}{c} A_{11} \\ A_{21} \\ A_{31} \end{array}\right) = \left(\begin{array}{c} 0 \\ 0 \end{array}\right)
If you find formulas for the elements of B that are not awfully complicated, I'll be surprised.
Here's an interesting matrix:
B = \left(\begin{array}{ccc} A_{21} & -A_{11} & 0 \\ 0 & A_{31} & -A_{21} \\ -A_{31} & 0 & A_{11} \end{array}\right)
This matrix has the property that its three rows always span the two-dimensional orthogonal complement of the column A_{*1}. That can happen in two different ways: either all three rows B_{i*} are non-zero and linearly dependent, or one of the rows B_{i*} is zero and the other two are linearly independent.
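That dichotomy can be checked numerically (a sketch, assuming NumPy): for any non-zero column the three rows should have rank exactly 2.

```python
import numpy as np

def B_of(a):
    """The 3 x 3 matrix above, built from a = (A11, A21, A31)."""
    A11, A21, A31 = a
    return np.array([[ A21, -A11,   0.],
                     [  0.,  A31, -A21],
                     [-A31,   0.,  A11]])

for _ in range(1000):
    a = np.random.randn(3)
    B = B_of(a)
    assert np.allclose(B @ a, 0)            # every row is orthogonal to a
    assert np.linalg.matrix_rank(B) == 2    # rows span the 2-dim complement

# Degenerate branch: with A21 = A31 = 0 one row vanishes, rank is still 2.
print(np.linalg.matrix_rank(B_of(np.array([1., 0., 0.]))))   # 2
```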
That's an interesting remark! It is difficult to come up with a formula for two vectors that span the two-dimensional orthogonal complement, but it is easy to come up with a formula for three vectors that do!
What happens with larger matrices A? Are we going to get some interesting function (n,k)\mapsto m(n,k) that tells how many vectors we need to span the (n-k)-dimensional complement "nicely"?
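One candidate generalization to experiment with (my own suggestion, not something established above): for every set S of k+1 row indices, build a vector supported on S whose non-zero entries are signed k\times k minors of the submatrix A[S,:]. Its inner product with any column of A is the Laplace expansion of a (k+1)\times(k+1) determinant with a repeated column, hence zero. That gives m = \binom{n}{k+1} vectors, which matches the cross product at (n,k)=(3,2) and the three-row matrix at (3,1). A minimal NumPy sketch:

```python
import numpy as np
from itertools import combinations

def minor_vectors(A):
    """One row per (k+1)-subset S of {0,...,n-1}: a vector supported on S
    whose entries are signed k x k minors of A[S, :].  Its inner product
    with any column of A is, by Laplace expansion, the determinant of a
    (k+1) x (k+1) matrix with a repeated column, hence zero."""
    n, k = A.shape
    rows = []
    for S in combinations(range(n), k + 1):
        v = np.zeros(n)
        for j, i in enumerate(S):
            sub = A[[r for r in S if r != i], :]   # drop row i from A[S, :]
            v[i] = (-1) ** j * np.linalg.det(sub)
        rows.append(v)
    return np.vstack(rows)                          # shape: C(n, k+1) x n

A = np.random.randn(5, 2)            # n = 5, k = 2
B = minor_vectors(A)                 # C(5, 3) = 10 rows
print(np.allclose(B @ A, 0))         # True
print(np.linalg.matrix_rank(B))      # 3 = n - k: the rows span the complement
```

Whether \binom{n}{k+1} vectors are actually needed, or whether some smaller m(n,k) suffices "nicely", I don't know.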