Mysterious orthogonal complement formulas

  • Context: Graduate
  • Thread starter: jostpuur
  • Tags: Formulas, Orthogonal

Discussion Overview

The discussion revolves around finding matrices whose rows span the orthogonal complement of the k-dimensional space spanned by the columns of a given matrix A. Participants explore various examples and formulations, including the use of exterior algebra and the Hodge dual, while grappling with the difficulty of deriving "nice" formulas for such matrices.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant presents a problem involving a matrix A with linearly independent columns and seeks a matrix B whose rows span the orthogonal complement of the space spanned by A's columns, noting the challenge of finding "nice" formulas for B's elements.
  • Examples are provided to illustrate specific cases, including matrices for n=2, k=1 and n=3, k=2, demonstrating how B can be constructed to satisfy the orthogonality condition.
  • A participant questions how to generalize the cross product and relates it to the exterior product, expressing a desire to understand how to find an (n-k)-dimensional orthogonal complement using exterior algebra.
  • Another participant mentions the Hodge dual as a method to find the orthogonal complement, providing a formula involving k-forms and their relation to (n-k)-forms.
  • Discussion includes a participant's struggle to understand the Hodge dual and its application, noting that it produces a set of vectors that span the orthogonal complement rather than a single vector.

Areas of Agreement / Disagreement

Participants express differing views on the effectiveness of various methods for finding orthogonal complements, particularly regarding the use of the Hodge dual and exterior algebra. The discussion remains unresolved with multiple competing approaches being explored.

Contextual Notes

Participants acknowledge the complexity of defining "nice" formulas for the matrices and the challenges associated with generalizing results to larger matrices. There is an emphasis on the dependence of results on specific definitions and assumptions related to linear independence and dimensionality.

jostpuur
This is the problem: Suppose $A\in\mathbb{R}^{n\times k}$ is a matrix whose columns are linearly independent, with $1\leq k\leq n$. I want to find a matrix $B\in\mathbb{R}^{m\times n}$ such that $n-k\leq m\leq n$, its elements can be computed nicely from $A$, and the rows of $B$ span the orthogonal complement of the $k$-dimensional space spanned by the columns of $A$. This means that
$$ BA = 0 \in\mathbb{R}^{m\times k} $$
too.

There always exists a $B\in\mathbb{R}^{(n-k)\times n}$ such that $BA=0$ and whose rows are linearly independent, but its elements don't always seem to have nice formulas.

Sorry, my problem is not well defined, because I don't know what "computing nicely" means. But it is an interesting problem anyway, I insist.
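One standard numerical construction of such a $B$ (not a closed-form "nice formula", but it always works) takes the left singular vectors of $A$ beyond its rank. A minimal NumPy sketch, where the function name `complement_basis` and the sample matrix are my own illustration:

```python
import numpy as np

def complement_basis(A):
    """Return B with orthonormal rows spanning the orthogonal
    complement of the column space of A."""
    n, k = A.shape
    U, s, Vt = np.linalg.svd(A)     # full SVD: U is n x n
    r = int(np.sum(s > 1e-12))      # numerical rank (= k for independent columns)
    return U[:, r:].T               # remaining left singular vectors span the complement

A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [3.0, 4.0]])          # n = 3, k = 2
B = complement_basis(A)
print(np.allclose(B @ A, 0))        # True: rows of B annihilate A
```

This always yields exactly $m = n-k$ linearly independent rows, at the price of giving no polynomial formula in the entries of $A$.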

Easy example: $n=2$, $k=1$. Set
$$ B = (-A_{21}, A_{11}). $$
Then
$$ B\left(\begin{array}{c} A_{11} \\ A_{21} \end{array}\right) = 0. $$

Another easy example: $n=3$, $k=2$. Set
$$ B = (A_{21}A_{32} - A_{31}A_{22},\; A_{31}A_{12} - A_{11}A_{32},\; A_{11}A_{22} - A_{21}A_{12}). $$
Then
$$ B\left(\begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \\ A_{31} & A_{32} \end{array}\right) = (0,0). $$
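This $n=3$, $k=2$ formula is exactly the cross product of the two columns of $A$, which is easy to check numerically (the sample matrix is my own illustration):

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 7.0]])          # n = 3, k = 2

# The displayed formula is the cross product of A's two columns:
B = np.cross(A[:, 0], A[:, 1])
print(B @ A)                        # [0. 0.]: B is orthogonal to both columns
```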

A difficult example! $n=3$, $k=1$. What do you do now? We would like to get this:
$$ \left(\begin{array}{ccc} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{array}\right) \left(\begin{array}{c} A_{11} \\ A_{21} \\ A_{31} \end{array}\right) = \left(\begin{array}{c} 0 \\ 0 \end{array}\right) $$

If you find formulas for the elements of $B$ that are not awfully complicated, I'll be surprised.

Here's an interesting matrix:

$$ B = \left(\begin{array}{ccc} A_{21} & -A_{11} & 0 \\ 0 & A_{31} & -A_{21} \\ -A_{31} & 0 & A_{11} \end{array}\right) $$

This matrix has the property that its three rows always span the two-dimensional orthogonal complement of $A_{*1}$. This can happen in two different ways: either all rows $B_{i*}$ are non-zero and linearly dependent, or one of the $B_{i*}$ is zero and the other two are linearly independent.

That's an interesting remark! It is difficult to come up with a formula for two vectors that span the two-dimensional orthogonal complement, but it is easy to come up with a formula for three vectors that span it!
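A quick numerical check of this claim (a sketch; the helper name `B_of` and the sample column are my own):

```python
import numpy as np

def B_of(a):
    """The 3x3 matrix from the post, built from the column a = A[:, 0]."""
    a1, a2, a3 = a
    return np.array([[ a2, -a1, 0.0],
                     [0.0,  a3, -a2],
                     [-a3, 0.0,  a1]])

a = np.array([1.0, 2.0, 3.0])
B = B_of(a)
print(np.allclose(B @ a, 0))        # True: every row is orthogonal to a
print(np.linalg.matrix_rank(B))     # 2: the rows span the 2D complement
```

Trying degenerate columns such as `a = np.array([1.0, 0.0, 0.0])` shows the second mode: one row becomes zero while the other two stay independent.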

What happens with larger matrices $A$? Do we get some interesting function $(n,k)\mapsto m(n,k)$ that tells how many vectors we need to span the $(n-k)$-dimensional complement "nicely"?
 
How do you generalize the cross product?

People always say that the exterior product generalizes the cross product, but to me it seems it only generalizes half of the cross product.

The cross product tells you two things: first, the area of the spanned parallelogram; second, the orthogonal complement of the two-dimensional spanned space.

Can somebody tell me how to use exterior algebra to find the $(n-k)$-dimensional orthogonal complement of a $k$-dimensional space?
 
jostpuur said:
Can somebody tell me how to use exterior algebra to find the $(n-k)$-dimensional orthogonal complement of a $k$-dimensional space?

The Hodge dual does exactly that, by mapping $k$-forms to $(n-k)$-forms. Hence if $\alpha, \beta$ are 1-forms, then

$$ \gamma = \star\,(\alpha \wedge \beta) $$

is an $(n-2)$-form orthogonal to the parallelogram defined by $\alpha, \beta$, whose magnitude is the area of that parallelogram.
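In $\mathbb{R}^3$ with the Euclidean metric, $\star(\alpha\wedge\beta)$ reduces to the familiar cross product of the coefficient vectors, so the orthogonality and area claims are easy to check numerically (the sample coefficient vectors are my own illustration):

```python
import numpy as np

# Coefficient vectors of two 1-forms alpha, beta on R^3:
alpha = np.array([1.0, 0.0, 2.0])
beta  = np.array([0.0, 1.0, 1.0])

# In R^3, the components of *(alpha ^ beta) are the cross product:
gamma = np.cross(alpha, beta)
print(gamma @ alpha, gamma @ beta)  # both 0: gamma is orthogonal to the plane
print(np.linalg.norm(gamma))        # area of the parallelogram of alpha, beta
```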
 
I have seen the definition of Hodge dual, but I have never understood what it is all about.

If I define a linear form $\omega:\mathbb{R}^3\to\mathbb{R}$ by the formula

$$ \omega = \omega_i e_i^T = \omega_1 e_1^T + \omega_2 e_2^T + \omega_3 e_3^T, $$

then its Hodge dual with respect to the metric $g=\textrm{id}_{3\times 3}$ is

$$ *\omega = (*\omega)_{ij}\, e_i^T\otimes e_j^T $$

with coefficients

$$ ((*\omega)_{ij})_{1\leq i,j\leq 3} = \left(\begin{array}{ccc} 0 & \omega_3 & -\omega_2 \\ -\omega_3 & 0 & \omega_1 \\ \omega_2 & -\omega_1 & 0 \end{array}\right) $$

So calculating the coefficients of the Hodge dual did not give me two vectors spanning the orthogonal complement of the given one-dimensional space. But apparently it did give me a set of three vectors, which span the two-dimensional orthogonal complement... :rolleyes:
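This observation can be checked by building the coefficient matrix directly from the Levi-Civita symbol, $(*\omega)_{ij} = \epsilon_{ijk}\,\omega_k$ (a sketch; the helper name and the sample vector are my own):

```python
import numpy as np

def hodge_dual_1form(w):
    """Coefficients (*w)_{ij} = eps_{ijk} w_k of the Hodge dual of a
    1-form w on R^3 with the Euclidean metric, as in the matrix above."""
    eps = np.zeros((3, 3, 3))
    for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
        eps[i, j, k] = 1.0          # even permutations
        eps[i, k, j] = -1.0         # odd permutations
    return np.einsum('ijk,k->ij', eps, w)

w = np.array([1.0, 2.0, 3.0])
M = hodge_dual_1form(w)
print(np.allclose(M @ w, 0))        # True: each row annihilates w
print(np.linalg.matrix_rank(M))     # 2: the three rows span the 2D complement
```

The rank is always 2 for non-zero $w$, since a non-zero antisymmetric $3\times 3$ matrix has rank exactly 2, which matches the three-vectors-spanning-a-plane behaviour noted for the earlier $n=3$, $k=1$ matrix.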
 
