Dimension of the sum of two orthogonal subspaces

Homework Help Overview

The discussion revolves around the dimension of the sum of two orthogonal subspaces in a vector space, specifically examining the linear independence of their basis vectors and the implications for the dimension of the combined space.

Discussion Character

  • Conceptual clarification, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants explore the linear independence of basis vectors from two orthogonal subspaces and the conditions under which the dimension of their sum equals the sum of their individual dimensions. Questions arise regarding the application of the triangle inequality in this context and the notation used for coefficients in vector expansions.

Discussion Status

Some participants express confidence in the proof's soundness while seeking clarification on specific points, such as the triangle inequality and the notation for coefficients. There is acknowledgment of potential typographical errors in referenced materials, indicating an ongoing exploration of the topic.

Contextual Notes

Participants note the importance of distinguishing coefficients in vector expansions and question the relevance of the triangle inequality to the proof being discussed. There is mention of a specific theorem related to vector space dimensions that may clarify the hint provided.

Virgileo
Homework Statement
Suppose ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## are two subspaces such that any element of ##\mathbb{V}_1## is orthogonal to any element of ##\mathbb{V}_2##. Show that the dimensionality of ##\mathbb{V}_1 \oplus \mathbb{V}_2## is ##n_1+n_2##. (Hint: the triangle inequality).
Relevant Equations
The triangle inequality:
$$|V + W| \leq |V| + |W|$$

Inner product and the norm:
$$\langle V | W \rangle = \sum_{i=1}^n v_i^* w_i$$, where ##v_i## and ##w_i## are the components of the vectors ##V## and ##W## in some orthonormal basis ##\{| u_i \rangle\}##.
$$|V|^2 = \langle V | V \rangle$$
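Orthonormality of a basis, which is used below when taking inner products with the basis bras:
$$\langle u_i | u_j \rangle = \delta_{ij}$$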

This question is from Shankar's quantum mechanics book. He uses Dirac notation from the very beginning to present the linear algebra needed for the course, so I will use that notation here as well.
##| V_1 \rangle \in \mathbb{V}^{n_1}_1## and there is an orthonormal basis in ##\mathbb{V}^{n_1}_1##: ##|u_1\rangle, |u_2\rangle ... |u_{n_1}\rangle##
##| V_2 \rangle \in \mathbb{V}^{n_2}_2## and there is an orthonormal basis in ##\mathbb{V}^{n_2}_2##: ##|w_1\rangle, |w_2\rangle ... |w_{n_2}\rangle##
The existence of such orthonormal bases is ensured by the Gram-Schmidt theorem.

##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
##|V\rangle \in \mathbb{V} = \mathbb{V}_1 \oplus \mathbb{V}_2##
##|V\rangle = |V_1\rangle + |V_2\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle## (1)

Each sum in equation (1) is formed from linearly independent vectors (they are multiples of basis vectors). We need to show that the two sets of vectors together are linearly independent; that ensures the dimension of the sum space is at least ##n_1+n_2##. And it can't be larger than ##n_1+n_2##, because by the definition of the subspace sum every vector of ##\mathbb{V}## can be represented as in equation (1), so there can't be an additional vector, linearly independent of all the others, that is needed in the component representation of some ##|V\rangle##.

To show the linear independence of the two orthonormal bases taken together (where, in addition, each element of one is orthogonal to every element of the other), write their linear combination and show it must be trivial:
##\sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle = 0##
Taking the inner product of both sides with each ##\langle u_i |## and each ##\langle w_j |## we get, correspondingly:
##|v_i\rangle = 0## and ##|v_j\rangle = 0##: the terms from the same subspace drop out because the basis is orthonormal, and the terms from the other subspace drop out because ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## are orthogonal to each other.
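Spelled out for a single bra: using ##\langle u_k | u_i \rangle = \delta_{ki}## and ##\langle u_k | w_j \rangle = 0##, applying ##\langle u_k|## to the combination gives
$$\langle u_k | \left( \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle \right) = v_k = 0,$$
and the same with each ##\langle w_k|## takes care of the remaining coefficients.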

Thus the dimension of the sum of subspaces is exactly ##n_1+n_2##.
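Not part of the proof, but as a quick numerical sanity check I can pick two mutually orthogonal subspaces of ##\mathbb{R}^n## (the particular dimensions below are arbitrary illustrative choices) and verify that the stacked bases are linearly independent:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n1, n2 = 7, 3, 2  # arbitrary illustrative dimensions

# A random orthonormal frame, split into two groups of columns: the first n1
# columns span V1, the next n2 span V2, so every basis vector of V1 is
# orthogonal to every basis vector of V2.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
U, W = Q[:, :n1], Q[:, n1:n1 + n2]

assert np.allclose(U.T @ W, 0)            # mutual orthogonality of the two bases

combined = np.hstack([U, W])              # the n1 + n2 vectors from equation (1)
print(np.linalg.matrix_rank(combined))    # prints 5 = n1 + n2
```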

This proof seems sound to me, and I hope it is. But aside from having it checked, I am also interested in the hint about the triangle inequality: how does one use it in this case to prove this fact?
 
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.
 
Virgileo said:
##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
 
Office_Shredder said:
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.

Yeah, they are indeed just numbers. About the triangle inequality: maybe it has something to do with the fact that if we take a vector whose components in an orthonormal basis are all ones:
$$\begin{bmatrix}
1 \\
1 \\
... \\
1
\end{bmatrix}$$
Then the inner product of this vector with itself yields the dimension of the space. I was thinking of taking ##|V_1\rangle## and ##|V_2\rangle## to be such vectors and then looking at the inner product of their sum with itself. But after playing with it, it still led me nowhere...
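For what it's worth, writing out the norm of the sum explicitly, the cross terms vanish because of the orthogonality of the subspaces,
$$|V_1 + V_2|^2 = \langle V_1 | V_1 \rangle + \langle V_1 | V_2 \rangle + \langle V_2 | V_1 \rangle + \langle V_2 | V_2 \rangle = |V_1|^2 + |V_2|^2,$$
but I don't see how to turn that into a statement about the dimension.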

Anyway, thanks for checking.

vela said:
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
Yeah you are right, I was being lazy with this notation...
 
Virgileo said:
This proof seems sound to me, and I hope it is. But aside from having it checked, I am also interested in the hint about the triangle inequality: how does one use it in this case to prove this fact?
I took a look in the second edition of Shankar I found online. I think it was just a typo. The hint should have referred to theorem 4, which has to do with the dimension of a vector space.
 
