Orthogonal Complement to the Kernel of a Linear Transformation

In summary, S is not the set of all equivalence classes under the kernel of A; for a fixed y it is the single coset y + ker A, and it very much depends on y.
  • #1
Kreizhn
Hey all,

I'm trying to find an orthogonal complement (under the standard inner product) to a space, and I think I've found the result mathematically. Unfortunately, when I apply the result to a toy example it seems to fail.

Assume that [itex] A \in M_{m\times n}(\mathbb R), y \in \mathbb R^n[/itex] and define the space [itex] S = \left\{ x \in \mathbb R^n : A(x-y) = 0 \right\} [/itex]. My goal is to characterize [itex] S^\perp [/itex].

I performed the following calculation

[tex] \begin{align*}
S &= \left\{ x \in \mathbb R^n : A(x-y) = 0 \right\} \\
&= \left\{ x \in \mathbb R^n : x-y \in \ker A \right\} \\
&= \mathbb R^n/ \ker A
\end{align*}
[/tex]

where I've used the fact that [itex] x-y \in \ker A [/itex] defines an equivalence relation to turn [itex] \mathbb R^n/\ker A [/itex] into a quotient space. In particular, since [itex] \ker A [/itex] is a closed linear subspace and [itex] \mathbb R^n [/itex] is a Hilbert space in the standard inner product, we must have that

[tex] \mathbb R^n /\ker A \cong (\ker A )^\perp [/tex]
and since the orthogonal complement is "reflexive" in finite dimensions, we conclude that
[tex] \left( \mathbb R^n/\ker A \right)^\perp \cong \ker A [/tex]

However, this does not seem to produce a correct result. I've checked my work and the only place I can possibly see an error is that even when the orthogonal complement is reflexive, perhaps
[tex] A \cong B^\perp \not\Rightarrow A^\perp \cong B [/tex]?

Alternatively, I've also calculated that
[tex] S^\perp = \text{Row}(A) \cap \text{span}\{y\}^\perp [/tex]
but this is far less tidy than the simple characterization I was hoping for.

Anyway, I tried this on the toy example

[tex] A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & -1 & 0 \end{pmatrix}, y = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} [/tex]

Now [itex] \ker A = \left\{ (-t,-t,t) : t \in \mathbb R \right\} [/itex]. Choosing an arbitrary representative with t=1, we would get a point [itex] x \in S [/itex] given by
[tex] x = \begin{pmatrix} -1 \\ -1 \\ 1 \end{pmatrix} + \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \\ 4 \end{pmatrix} [/tex]
which is certainly in S since [itex] x-y \in \ker A [/itex]. However, there is no nonzero [itex] t \in \mathbb R [/itex] such that [itex] (0, 1, 4) \cdot(-t,-t,t) = 0 [/itex] and so my result cannot be correct.
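For reference, here is a quick numerical version of this check (a sketch using NumPy; the variable names are just mine for illustration):

[code]
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, -1.0, 0.0]])
y = np.array([1.0, 2.0, 3.0])

# the kernel basis vector found above, with t = 1
k = np.array([-1.0, -1.0, 1.0])
print(A @ k)            # [0. 0. 0.], so k is indeed in ker A

# x = y + k lies in S, since A(x - y) = A k = 0
x = y + k
print(x)                # [0. 1. 4.]
print(A @ (x - y))      # [0. 0. 0.]

# but x is not orthogonal to ker A:
print(np.dot(x, k))     # 3.0, not 0, so S^perp cannot be ker A
[/code]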

Can anyone see where I went wrong?
 
  • #2
I think I may know what the problem is. Is it perhaps that [itex] \mathbb R^n/\ker A [/itex] is the set of all equivalence classes under the kernel of A, but we want only the equivalence class of y?
 
  • #3
Yeah, your description of S is bunk. Obviously it needs to depend on y: S is the set y + ker(A).
 
  • #4
You are assuming that y is fixed, aren't you? So perhaps define the map [itex] T: \mathbb R^{n} \rightarrow \mathbb R^{m} [/itex] such that T(x) = A(x - y). Thus, your set S = ker T.
Vignon S. Oussa
 
  • #5


Hi there,

The problem is in the step [itex] S = \mathbb R^n/\ker A [/itex]. The quotient [itex] \mathbb R^n/\ker A [/itex] is the set of all cosets of [itex] \ker A [/itex]; it is a different vector space, not a subset of [itex] \mathbb R^n [/itex]. Your set S is a single coset, [itex] S = y + \ker A [/itex], which is an affine subset of [itex] \mathbb R^n [/itex] and really does depend on y. So while [itex] \mathbb R^n/\ker A \cong (\ker A)^\perp [/itex] is true, that isomorphism tells you nothing about the orthogonal complement of S inside [itex] \mathbb R^n [/itex].

To compute [itex] S^\perp [/itex] directly, note that it is the set of all vectors orthogonal to every element of S, and every element of S has the form [itex] y + k [/itex] with [itex] k \in \ker A [/itex]. Taking k = 0 forces [itex] v \cdot y = 0 [/itex], and then [itex] v \cdot k = 0 [/itex] for every k in the kernel, i.e. [itex] v \in (\ker A)^\perp = \text{Row}(A) [/itex]. Hence [itex] S^\perp = \text{Row}(A) \cap \text{span}\{y\}^\perp [/itex], which is exactly your alternative calculation.

In your toy example, [itex] \ker A = \text{span}\{(-1,-1,1)\} [/itex] and [itex] \text{Row}(A) = \text{span}\{(1,0,1),(0,1,1)\} [/itex], so [itex] S^\perp = \text{span}\{(5,-4,1)\} [/itex]; you can check that (5,-4,1) is orthogonal to (0,1,4), to y, and to the kernel.
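To make that concrete with your numbers, here is a small sketch using NumPy (not needed for the argument; the variable names are just for illustration):

[code]
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, -1.0, 0.0]])
y = np.array([1.0, 2.0, 3.0])
k = np.array([-1.0, -1.0, 1.0])   # basis vector of ker A, found by hand

# v is in S^perp exactly when v.y = 0 and v.k = 0, i.e. v spans the
# null space of the 2x3 matrix with rows y and k.
M = np.vstack([y, k])
_, _, Vt = np.linalg.svd(M)
v = Vt[-1]                        # M has rank 2, so the last right-singular vector spans its null space
print(v / v[2])                   # proportional to (5, -4, 1)

# v is orthogonal to every point of S = { y + t*k : t real }:
for t in (-2.0, 0.0, 1.0, 3.5):
    print(round(float(np.dot(v, y + t * k)), 10))   # 0.0 each time

# and v lies in Row(A): it is a combination of the first two rows of A
coeffs, residual, *_ = np.linalg.lstsq(A[:2].T, v, rcond=None)
print(residual)                   # essentially zero
[/code]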
 

Related to Orthogonal Complement to the Kernel of a Linear Transformation

What is the definition of the orthogonal complement to the kernel of a linear transformation?

The orthogonal complement to the kernel of a linear transformation is the set of all vectors that are perpendicular to every vector in the kernel. In other words, it is the set of vectors that are orthogonal to the null space of the linear transformation.

How is the orthogonal complement to the kernel of a linear transformation related to the range of the transformation?

The orthogonal complement to the kernel of a linear transformation is the same as the range of the adjoint transformation; for a real matrix this is the row space. A vector lies in the kernel exactly when it is orthogonal to every row of the matrix, so the kernel and the row space are orthogonal complements of each other.
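As an illustration, here is a short sketch using NumPy (the matrix is an arbitrary example, and the names are only for demonstration):

[code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))        # an arbitrary 4x6 example

# From the SVD of A: right-singular vectors with nonzero singular value
# span (ker A)^perp = Row(A); the remaining ones span ker A.
_, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
complement = Vt[:r]                    # rows span (ker A)^perp
kernel     = Vt[r:]                    # rows span ker A

print(np.abs(A @ kernel.T).max())            # ~0: A sends the kernel to zero
print(np.abs(complement @ kernel.T).max())   # 0: the two spaces are orthogonal

# each vector in the complement really lies in range(A^T):
w, *_ = np.linalg.lstsq(A.T, complement.T, rcond=None)
print(np.abs(A.T @ w - complement.T).max())  # ~0
[/code]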

What is the significance of the orthogonal complement to the kernel of a linear transformation in applications?

The orthogonal complement to the kernel of a linear transformation is important in applications because the transformation, restricted to this complement, maps it isomorphically onto the range: the images of a basis of the complement form a basis of the range. This is useful in solving systems of linear equations, where the minimum-norm solution lies in the orthogonal complement of the kernel, as well as in other areas such as signal processing and data compression.
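For example, the standard least-squares solver returns the minimum-norm solution, which lies in the orthogonal complement of the kernel (a NumPy sketch with an arbitrary matrix):

[code]
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))        # wide matrix: ker A is 2-dimensional
b = rng.standard_normal(3)

# lstsq (equivalently, the pseudoinverse) returns the minimum-norm solution,
# which lies in Row(A) = (ker A)^perp.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# basis of ker A from the SVD of A
_, s, Vt = np.linalg.svd(A)
kernel = Vt[len(s):]                   # the last two right-singular vectors
print(np.abs(kernel @ x).max())        # ~0: x is orthogonal to ker A
print(np.allclose(A @ x, b))           # True: x really solves A x = b
[/code]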

How can the orthogonal complement to the kernel of a linear transformation be calculated?

One way is to find a basis for the kernel, extend it to a basis of the whole space, and apply the Gram-Schmidt process; the orthonormal vectors produced after the kernel basis span the orthogonal complement. Alternatively, since the orthogonal complement of the kernel is the row space of the matrix, one can simply take the nonzero rows of its row-reduced form.
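As a sketch of the first procedure (using NumPy; complement_of_kernel is a made-up helper name, not a library function):

[code]
import numpy as np

def complement_of_kernel(A, tol=1e-10):
    """Orthonormal basis of (ker A)^perp: take a kernel basis, extend it
    by the standard basis, and orthogonalise with a QR factorisation
    (which plays the role of the Gram-Schmidt process)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[1]

    # basis of ker A: right-singular vectors with ~zero singular value
    _, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol))
    kernel = Vt[r:]                      # shape (n - r, n)

    # extend by the standard basis and orthogonalise column by column
    extended = np.vstack([kernel, np.eye(n)])
    Q, _ = np.linalg.qr(extended.T)
    # the first (n - r) columns of Q span ker A, so the rest span (ker A)^perp
    return Q[:, n - r:].T

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, -1.0, 0.0]])
B = complement_of_kernel(A)
print(B @ np.array([-1.0, -1.0, 1.0]))   # ~[0, 0]: orthogonal to the kernel
[/code]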

What is the relationship between the dimension of the orthogonal complement to the kernel of a linear transformation and the dimension of the kernel?

The dimension of the orthogonal complement to the kernel of a linear transformation is equal to the dimension of the domain minus the dimension of the kernel, which by the rank-nullity theorem is exactly the rank of the transformation. This is an important result in linear algebra.
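A quick numeric sanity check of that dimension count (a NumPy sketch with an arbitrary rank-deficient matrix):

[code]
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 8))
A[4] = A[0] + A[1]                 # force a dependent row, so rank(A) = 4

n = A.shape[1]                     # dimension of the domain, here 8
_, s, _ = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
dim_kernel = n - rank              # rank-nullity: dim ker A = n - rank A
dim_complement = n - dim_kernel    # = rank A

print(rank, dim_kernel, dim_complement)             # 4 4 4
print(dim_complement == np.linalg.matrix_rank(A))   # True
[/code]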
