Mysterious orthogonal complement formulas

In summary, the thread discusses finding a matrix B whose rows span the orthogonal complement of the column space of a given matrix A. Several examples illustrate how B can be computed explicitly, and exterior algebra and the Hodge dual are mentioned as tools for describing orthogonal complements. The conversation ends with the observation that it is difficult to find a formula for two vectors that span the orthogonal complement of a line in three dimensions, but easy to find a formula for three vectors that span it, and with the question of whether some function of the dimensions of A describes how many spanning vectors are needed in general.
  • #1
jostpuur
This is the problem: Suppose [itex]A\in\mathbb{R}^{n\times k}[/itex] is a matrix whose columns are linearly independent, with [itex]1\leq k\leq n[/itex]. I want to find a matrix [itex]B\in\mathbb{R}^{m\times n}[/itex] with [itex]n-k\leq m\leq n[/itex], whose elements can be computed nicely from those of [itex]A[/itex], and whose rows span the orthogonal complement of the [itex]k[/itex]-dimensional space spanned by the columns of [itex]A[/itex]. This means that
[tex]
BA = 0 \in\mathbb{R}^{m\times k}
[/tex]
too.

There always exists a [itex]B\in\mathbb{R}^{(n-k)\times n}[/itex] such that [itex]BA=0[/itex] and the rows of [itex]B[/itex] are linearly independent, but its elements don't always seem to have nice formulas.

Sorry, my problem is not well defined, because I don't know precisely what "computing nicely" means. But it is an interesting problem anyway, I insist.

Easy example: [itex]n=2, k=1[/itex]. Set
[tex]
B = (-A_{21}, A_{11}).
[/tex]
Then
[tex]
B\left(\begin{array}{c}
A_{11} \\ A_{21} \\
\end{array}\right) = 0
[/tex]

Another easy example: [itex]n=3, k=2[/itex]. Set
[tex]
B = (A_{21}A_{32} - A_{31}A_{22},\; A_{31}A_{12} - A_{11}A_{32},\; A_{11}A_{22} - A_{21}A_{12}).
[/tex]
Then
[tex]
B\left(\begin{array}{cc}
A_{11} & A_{12} \\
A_{21} & A_{22} \\
A_{31} & A_{32} \\
\end{array}\right) = (0,0)
[/tex]
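The row vector above is exactly the cross product of the two columns of [itex]A[/itex]. As a quick numerical sanity check (a minimal sketch assuming NumPy is available, added for illustration):

[code]
import numpy as np

# A random 3x2 matrix; its columns are (almost surely) linearly independent.
A = np.random.randn(3, 2)

# B is the cross product of the two columns of A, which is exactly
# the formula written out above.
B = np.cross(A[:, 0], A[:, 1])

print(B @ A)   # ~ [0, 0] up to floating-point rounding
[/code]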

A difficult example! [itex]n=3, k=1[/itex]. What do you do now? We would like to get this:
[tex]
\left(\begin{array}{ccc}
B_{11} & B_{12} & B_{13} \\
B_{21} & B_{22} & B_{23} \\
\end{array}\right)
\left(\begin{array}{c}
A_{11} \\ A_{21} \\ A_{31} \\
\end{array}\right)
= \left(\begin{array}{c}
0 \\ 0 \\
\end{array}\right)
[/tex]

If you find formulas for the elements of [itex]B[/itex] that are not awfully complicated, I'll be surprised.

Here's an interesting matrix:

[tex]
B = \left(\begin{array}{ccc}
A_{21} & -A_{11} & 0 \\
0 & A_{31} & -A_{21} \\
-A_{31} & 0 & A_{11} \\
\end{array}\right)
[/tex]

This matrix has the property that its three rows always span the two-dimensional orthogonal complement of [itex]A_{*1}[/itex] (as long as [itex]A_{*1}\neq 0[/itex]). This can happen in two different ways: either all three rows [itex]B_{i*}[/itex] are non-zero and linearly dependent, or one of the rows [itex]B_{i*}[/itex] is zero and the other two are linearly independent.

That's an interesting remark! It is difficult to come up with a formula for two vectors that would span the two-dimensional orthogonal complement, but it is easy to come up with a formula for three vectors that span it!
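As a numerical illustration of the claim about the three rows (a minimal NumPy sketch added for illustration, not a derivation):

[code]
import numpy as np

a = np.random.randn(3)            # a generic non-zero column A_{*1}

# The matrix from the formula above; each row is built from the entries of a.
B = np.array([[ a[1], -a[0],  0.0],
              [ 0.0,   a[2], -a[1]],
              [-a[2],  0.0,   a[0]]])

print(B @ a)                      # ~ [0, 0, 0]: every row is orthogonal to a
print(np.linalg.matrix_rank(B))   # 2: the rows span the orthogonal complement
[/code]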

What happens with larger matrices [itex]A[/itex]? Are we going to get some interesting function [itex](n,k)\mapsto m(n,k)[/itex] that tells how many vectors we need to span the [itex](n-k)[/itex]-dimensional complement "nicely"?
 
  • #2
How do you generalize the cross product?

People always say that the exterior product generalizes the cross product, but to me it seems it only generalizes half of the cross product.

The cross product tells you two things: first, the area of the parallelogram spanned by the two vectors; second, the orthogonal complement of the two-dimensional space they span.

Can somebody tell how to use exterior algebra to find the (n-k)-dimensional orthogonal complement of a k-dimensional space?
 
  • #3
jostpuur said:
Can somebody tell how to use exterior algebra to find the (n-k)-dimensional orthogonal complement of a k-dimensional space?

The Hodge dual does exactly that, by mapping k-forms to (n-k)-forms. Hence if [itex]\alpha, \beta[/itex] are 1-forms, then

[tex]\gamma = \star \, (\alpha \wedge \beta)[/tex]
is an (n-2)-form orthogonal to the parallelogram defined by [itex]\alpha, \beta[/itex], whose magnitude is the area enclosed by that parallelogram.
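In components (Euclidean metric, and with one common convention for the normalization and index placement of [itex]\star[/itex]; signs can differ between conventions) this reads

[tex]
(\star(\alpha\wedge\beta))_{i_1\dots i_{n-2}} = \epsilon_{i_1\dots i_{n-2}jk}\,\alpha_j\beta_k,
[/tex]

which for [itex]n=3[/itex] reduces to [itex]\gamma_i = \epsilon_{ijk}\alpha_j\beta_k[/itex], the ordinary cross product.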
 
  • #4
I have seen the definition of the Hodge dual, but I have never understood what it is all about.

If I define a linear form [itex]\omega:\mathbb{R}^3\to\mathbb{R}[/itex] by the formula

[tex]
\omega = \omega_i e_i^T = \omega_1 e_1^T + \omega_2 e_2^T + \omega_3 e_3^T
[/tex]

then its Hodge dual with respect to the metric [itex]g=\textrm{id}_{3\times 3}[/itex] is

[tex]
*\omega = (*\omega)_{ij} e_i^T\otimes e_j^T
[/tex]

with coefficients

[tex]
((*\omega)_{ij})_{1\leq i,j\leq 3} = \left(\begin{array}{ccc}
0 & \omega_3 & -\omega_2 \\
-\omega_3 & 0 & \omega_1 \\
\omega_2 & -\omega_1 & 0 \\
\end{array}\right)
[/tex]

So calculating the coefficients of the Hodge dual did not give me two vectors spanning the orthogonal complement of the given one-dimensional space. But apparently it did give me a set of three vectors which span the two-dimensional orthogonal complement... :rolleyes:
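To make that observation concrete, here is a minimal NumPy sketch (added for illustration) that builds the coefficient matrix [itex](*\omega)_{ij} = \epsilon_{ijk}\omega_k[/itex] shown above and checks that its rows span the orthogonal complement of [itex]\omega[/itex]:

[code]
import numpy as np

# Levi-Civita symbol in three dimensions.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

w = np.random.randn(3)                 # coefficients of the 1-form omega

M = np.einsum('ijk,k->ij', eps, w)     # (*omega)_{ij} = eps_{ijk} w_k

print(np.allclose(M @ w, 0))           # True: every row is orthogonal to omega
print(np.linalg.matrix_rank(M))        # 2: the rows span the orthogonal complement
[/code]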
 
  • #5


This is a fascinating problem with potential applications in several areas of mathematics and engineering. There does not seem to be a single simple formula for the matrix B that works in every case, but some interesting patterns are emerging. Further experimentation with different matrices A could lead to a general formula or algorithm for computing B for any given A, and studying the properties of B could give insight into the structure and behavior of orthogonal complements in higher dimensions.
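If a closed-form formula is not required, one standard way to compute such a [itex]B[/itex] for any [itex]A[/itex] is to take the rows of [itex]B[/itex] from a basis of the null space of [itex]A^T[/itex], for example via the singular value decomposition. A minimal sketch (assuming SciPy is available; this produces an orthonormal, but not "nicely formulaic", [itex]B[/itex]):

[code]
import numpy as np
from scipy.linalg import null_space

n, k = 5, 2
A = np.random.randn(n, k)        # a generic A with linearly independent columns

# Columns of null_space(A.T) form an orthonormal basis of
# {x : A^T x = 0}, i.e. of the orthogonal complement of col(A).
B = null_space(A.T).T            # shape (n - k, n)

print(np.allclose(B @ A, 0))     # True
print(np.linalg.matrix_rank(B))  # n - k
[/code]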
 

1. What is an orthogonal complement?

The orthogonal complement of a subspace is itself a vector space: it consists of all vectors that are orthogonal (perpendicular) to every vector in the original space.

2. How is the orthogonal complement represented mathematically?

The orthogonal complement of a vector space V is denoted [itex]V^{\perp}[/itex] (read "V perp").

3. What is the significance of finding the orthogonal complement?

Finding the orthogonal complement is useful in many areas of mathematics, including linear algebra, functional analysis, and signal processing. It allows a vector space to be decomposed into a subspace and its orthogonal complement, which makes many problems and calculations easier.

4. What are some common formulas involving the orthogonal complement?

Some common formulas involving the orthogonal complement include the projection formula (which projects a vector onto a subspace), the Gram-Schmidt process (which produces an orthonormal basis for a subspace), and the orthogonal decomposition formula (which writes a vector as the sum of its components in a subspace and in its orthogonal complement).
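As a small illustration of the orthogonal decomposition (a minimal NumPy sketch added for concreteness): with [itex]P = A(A^TA)^{-1}A^T[/itex] the orthogonal projector onto the column space of [itex]A[/itex], every vector splits as [itex]v = Pv + (I-P)v[/itex], and the second term lies in the orthogonal complement.

[code]
import numpy as np

n, k = 4, 2
A = np.random.randn(n, k)               # basis of a subspace, stored as columns

# Orthogonal projector onto col(A): P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T

v = np.random.randn(n)
v_par = P @ v                           # component inside col(A)
v_perp = v - v_par                      # component in the orthogonal complement

print(np.allclose(A.T @ v_perp, 0))     # True: v_perp is orthogonal to col(A)
print(np.allclose(v, v_par + v_perp))   # True: the orthogonal decomposition
[/code]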

5. How is the orthogonal complement used in real-world applications?

The orthogonal complement has many practical applications, such as in image and video compression, where it is used to reduce the amount of data needed to represent an image or video. It is also used in signal processing to remove noise from signals and in computer graphics to create 3D models and animations. Additionally, the orthogonal complement is used in physics and engineering to solve problems involving forces and vectors.
