Mysterious orthogonal complement formulas

jostpuur
This is the problem: Suppose ##A\in\mathbb{R}^{n\times k}## is some matrix whose columns are linearly independent, and ##1\leq k\leq n##. I want to find a matrix ##B\in\mathbb{R}^{m\times n}## such that ##n-k\leq m\leq n##, its elements can be computed nicely from ##A##, and the rows of ##B## span the orthogonal complement of the ##k##-dimensional space spanned by the columns of ##A##. This means that
$$BA = 0 \in\mathbb{R}^{m\times k}$$
too.

There always exists a ##B\in\mathbb{R}^{(n-k)\times n}## such that ##BA=0## and the rows of ##B## are linearly independent, but its elements don't always seem to have nice formulas.

Sorry, my problem is not well defined, because I don't know what "computing nicely" means. But this is an interesting problem anyway, I insist.

Easy example: ##n=2##, ##k=1##. Set
$$B = (-A_{21},\; A_{11}).$$
Then
$$B\left(\begin{array}{c} A_{11} \\ A_{21} \end{array}\right) = 0$$
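A quick numerical sanity check of this formula, sketched in NumPy with an arbitrary example column:

```python
import numpy as np

# n = 2, k = 1: for a column A = (A11, A21)^T, the row B = (-A21, A11)
# is orthogonal to it by construction.
A = np.array([[3.0], [5.0]])          # arbitrary example column
B = np.array([[-A[1, 0], A[0, 0]]])   # B = (-A21, A11)
print(B @ A)                          # -> [[0.]]
```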

Another easy example: ##n=3##, ##k=2##. Set
$$B = (A_{21}A_{32} - A_{31}A_{22},\; A_{31}A_{12} - A_{11}A_{32},\; A_{11}A_{22} - A_{21}A_{12}).$$
Then
$$B\left(\begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \\ A_{31} & A_{32} \end{array}\right) = (0,0)$$
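This ##B## is just the cross product of the two columns, which a short NumPy sketch (with an arbitrary example ##A##) confirms:

```python
import numpy as np

# n = 3, k = 2: the row B is exactly the cross product of the two columns
# of A, so B A = 0 follows from a x b being orthogonal to both a and b.
A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 7.0]])            # arbitrary 3x2 example
B = np.cross(A[:, 0], A[:, 1])        # (A21*A32 - A31*A22, ...)
print(B @ A)                          # -> [0. 0.]
```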

A difficult example! ##n=3##, ##k=1##. What do you do now? We would like to get this:
$$\left(\begin{array}{ccc} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{array}\right) \left(\begin{array}{c} A_{11} \\ A_{21} \\ A_{31} \end{array}\right) = \left(\begin{array}{c} 0 \\ 0 \end{array}\right)$$

If you find formulas for the elements of ##B## that are not awfully complicated, I'll be surprised.

Here's an interesting matrix:

$$B = \left(\begin{array}{ccc} A_{21} & -A_{11} & 0 \\ 0 & A_{31} & -A_{21} \\ -A_{31} & 0 & A_{11} \end{array}\right)$$

This matrix has the property that its three rows always span the two-dimensional orthogonal complement of ##A_{*1}##. This can happen in two different ways: either all the ##B_{i*}## are non-zero and linearly dependent, or one of the ##B_{i*}## is zero and the other two are linearly independent.
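A NumPy sketch of this property, using an arbitrary example column: each row annihilates the column, and the row space still has dimension 2 even though there are three rows.

```python
import numpy as np

# n = 3, k = 1: the three rows of this B are each orthogonal to the
# column a = A_{*1}, and together they span its 2-dimensional orthogonal
# complement (here the rows are linearly dependent, so rank(B) = 2).
a = np.array([1.0, 2.0, 3.0])         # arbitrary example column
B = np.array([[ a[1], -a[0],  0.0 ],
              [ 0.0,   a[2], -a[1]],
              [-a[2],  0.0,   a[0]]])
print(B @ a)                          # -> [0. 0. 0.]
print(np.linalg.matrix_rank(B))       # -> 2
```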

That's an interesting remark! It is difficult to come up with a formula for two vectors that span the two-dimensional orthogonal complement, but it is easy to come up with a formula for three vectors that span it!

What happens with larger matrices ##A##? Are we going to get some interesting function ##(n,k)\mapsto m(n,k)## that tells how many vectors we need to span the ##(n-k)##-dimensional complement "nicely"?
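For ##k=1## at least, one hedged guess at the pattern: take the ##n(n-1)/2## vectors ##v^{(ij)} = A_{j1}e_i - A_{i1}e_j## for ##i<j## (the ##n=2## and ##n=3## examples above are the cases ##m=1## and ##m=3##). This is only a numerical sketch of the conjectured pattern, not a proof; the names `complement_rows` and the test column are my own choices.

```python
import numpy as np
from itertools import combinations

# Hedged sketch for k = 1 and general n: the n(n-1)/2 vectors
# v^(ij) = a_j e_i - a_i e_j (i < j) are all orthogonal to a, and for
# a nonzero column a they appear to span its (n-1)-dim complement.
def complement_rows(a):
    n = len(a)
    rows = []
    for i, j in combinations(range(n), 2):
        v = np.zeros(n)
        v[i], v[j] = a[j], -a[i]
        rows.append(v)
    return np.array(rows)

a = np.array([1.0, 2.0, 3.0, 4.0])    # arbitrary example column, n = 4
B = complement_rows(a)
print(B.shape)                        # -> (6, 4)
print(np.allclose(B @ a, 0))          # -> True
print(np.linalg.matrix_rank(B))       # -> 3
```

So for ##k=1## this suggests ##m(n,1) = n(n-1)/2## redundant vectors suffice, though far fewer would be needed if linear independence could be arranged "nicely".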
 
How do you generalize the cross product?

People always say that the exterior product generalizes the cross product, but to me it seems it only generalizes half of the cross product.

The cross product tells two things: the first is the area of the spanned parallelogram, and the second is the orthogonal complement of the two-dimensional spanned space.

Can somebody tell me how to use exterior algebra to find an ##(n-k)##-dimensional orthogonal complement of a ##k##-dimensional space?
 
jostpuur said:
Can somebody tell me how to use exterior algebra to find an ##(n-k)##-dimensional orthogonal complement of a ##k##-dimensional space?

The Hodge dual does exactly that, by mapping ##k##-forms to ##(n-k)##-forms. Hence if ##\alpha, \beta## are 1-forms, then

$$\gamma = \star \, (\alpha \wedge \beta)$$

is an ##(n-2)##-form orthogonal to the parallelogram defined by ##\alpha, \beta##, whose magnitude is the area enclosed by that parallelogram.
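In ##\mathbb{R}^3## with the standard metric this recovers the cross product exactly, which a small NumPy sketch shows (building the Levi-Civita symbol by hand; the component vectors are arbitrary examples):

```python
import numpy as np

# In R^3 with the standard metric, the Hodge star of a wedge of two
# 1-forms reproduces the cross product: *(alpha ^ beta) = alpha x beta.
eps = np.zeros((3, 3, 3))             # Levi-Civita symbol
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[j, i, k] = 1.0, -1.0

alpha = np.array([1.0, 2.0, 3.0])
beta  = np.array([4.0, 5.0, 6.0])
wedge = np.outer(alpha, beta) - np.outer(beta, alpha)   # (a ^ b)_ij
star  = 0.5 * np.einsum('ijk,ij->k', eps, wedge)        # *(a ^ b)_k
print(np.allclose(star, np.cross(alpha, beta)))         # -> True
```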
 
I have seen the definition of Hodge dual, but I have never understood what it is all about.

If I define a linear form ##\omega:\mathbb{R}^3\to\mathbb{R}## by the formula

$$\omega = \omega_i e_i^T = \omega_1 e_1^T + \omega_2 e_2^T + \omega_3 e_3^T$$

then its Hodge dual with respect to the metric ##g=\textrm{id}_{3\times 3}## is

$$*\omega = (*\omega)_{ij}\, e_i^T\otimes e_j^T$$

with coefficients

$$((*\omega)_{ij})_{1\leq i,j\leq 3} = \left(\begin{array}{ccc} 0 & \omega_3 & -\omega_2 \\ -\omega_3 & 0 & \omega_1 \\ \omega_2 & -\omega_1 & 0 \end{array}\right)$$

So calculating the coefficients of the Hodge dual did not give me two vectors that would have spanned the orthogonal complement of the given one-dimensional space. But apparently it did give me a set of three vectors which span the two-dimensional orthogonal complement... :rolleyes:
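That observation can be checked numerically: building ##(*\omega)_{ij} = \varepsilon_{ijk}\omega_k## via the Levi-Civita symbol (with an arbitrary example ##\omega##) reproduces the coefficient matrix above, and its rows annihilate ##\omega## while spanning a 2-dimensional space.

```python
import numpy as np

# Coefficient matrix of *omega via (*omega)_ij = eps_ijk omega_k; its
# rows are each orthogonal to omega and span the 2-dim complement.
eps = np.zeros((3, 3, 3))             # Levi-Civita symbol
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[j, i, k] = 1.0, -1.0

w = np.array([1.0, 2.0, 3.0])         # arbitrary example omega
star_w = np.einsum('ijk,k->ij', eps, w)
print(star_w)                         # [[0, w3, -w2], [-w3, 0, w1], [w2, -w1, 0]] pattern
print(np.allclose(star_w @ w, 0))     # -> True
print(np.linalg.matrix_rank(star_w))  # -> 2
```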
 