Dimension of orthogonal subspaces sum

SUMMARY

The discussion centers on two mutually orthogonal subspaces, ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2##, and the proof that the dimension of their direct sum is exactly ##n_1+n_2##. The proof uses the Gram-Schmidt theorem to obtain orthonormal bases and demonstrates the linear independence of the combined basis vectors through inner products. Participants also discuss the book's hint about the triangle inequality and conclude it is likely a typo for a reference to a theorem on the dimension of vector spaces in Shankar's second edition.

PREREQUISITES
  • Understanding of linear algebra concepts, specifically orthogonal subspaces
  • Familiarity with the Gram-Schmidt theorem for constructing orthonormal bases
  • Knowledge of inner product spaces and their properties
  • Basic comprehension of vector space dimensions and their implications
NEXT STEPS
  • Study the Gram-Schmidt process in detail to construct orthonormal bases
  • Learn about inner product spaces and their applications in linear algebra
  • Research the triangle inequality and its implications in vector spaces
  • Examine theorems related to the dimension of vector spaces, particularly those referenced in Shankar's texts
USEFUL FOR

Mathematicians, physics students, and anyone studying linear algebra or vector spaces, particularly those interested in the properties of orthogonal subspaces and their dimensions.

Virgileo
Homework Statement
Suppose ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## are two subspaces such that any element of ##\mathbb{V}_1## is orthogonal to any element of ##\mathbb{V}_2##. Show that the dimensionality of ##\mathbb{V}_1 \oplus \mathbb{V}_2## is ##n_1+n_2##. (Hint: the triangle inequality).
Relevant Equations
The triangle inequality:
$$|V + W| \leq |V| + |W|$$

Inner product and the norm:
$$\langle V | W \rangle = \sum_{i=1}^n v_i^* w_i,$$ where ##v_i## and ##w_i## are the components of the vectors ##|V\rangle## and ##|W\rangle## in some orthonormal basis ##\{| u_i \rangle\}##.
$$|V|^2 = \langle V | V \rangle$$
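
Not part of the problem, but if it helps to see these definitions in action, here is a quick numerical check in Python/NumPy; the sample vectors are arbitrary choices, not from the text.

```python
import numpy as np

# Arbitrary sample vectors, chosen only to illustrate the definitions above.
V = np.array([1.0, 2.0, 2.0])
W = np.array([3.0, 0.0, 4.0])

inner = np.vdot(V, W)                 # <V|W> = sum_i v_i* w_i (conjugation is a no-op for real vectors)
norm_V = np.sqrt(np.vdot(V, V).real)  # |V| = sqrt(<V|V>) = 3.0
norm_W = np.sqrt(np.vdot(W, W).real)  # |W| = 5.0

# Triangle inequality: |V + W| <= |V| + |W|
print(np.linalg.norm(V + W), "<=", norm_V + norm_W)
```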

Overall, this question is from Shankar's quantum mechanics book. He uses Dirac notation from the start to present the linear algebra needed for the course, so I will use that notation here as well.
##| V_1 \rangle \in \mathbb{V}^{n_1}_1## and there is an orthonormal basis in ##\mathbb{V}^{n_1}_1##: ##|u_1\rangle, |u_2\rangle, \ldots, |u_{n_1}\rangle##
##| V_2 \rangle \in \mathbb{V}^{n_2}_2## and there is an orthonormal basis in ##\mathbb{V}^{n_2}_2##: ##|w_1\rangle, |w_2\rangle, \ldots, |w_{n_2}\rangle##
The existence of such orthonormal bases is ensured by the Gram-Schmidt theorem.
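
As a side note, here is a minimal Python/NumPy sketch of the Gram-Schmidt procedure; the function name `gram_schmidt` and the sample vectors are illustrative assumptions, not anything from Shankar.

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of `vectors` (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Remove the components along the already-built orthonormal directions.
        w = v - sum((np.vdot(u, v) * u for u in basis), np.zeros_like(v))
        norm = np.linalg.norm(w)
        if norm > 1e-12:              # keep w only if it adds a new direction
            basis.append(w / norm)
    return basis

# Quick check on three linearly independent vectors in R^3.
vecs = [np.array([1.0, 1.0, 0.0]),
        np.array([1.0, 0.0, 1.0]),
        np.array([0.0, 1.0, 1.0])]
ons = gram_schmidt(vecs)
print(len(ons))   # 3 orthonormal vectors
print([round(float(np.vdot(a, b)), 10) for a in ons for b in ons])  # identity pattern
```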

##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
##|V\rangle \in \mathbb{V} = \mathbb{V}_1 \oplus \mathbb{V}_2##
##|V\rangle = |V_1\rangle + |V_2\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle## (1)

Each sum in equation (1) is a linear combination of linearly independent vectors (the basis vectors of the respective subspace). We need to show that the two sets of basis vectors, taken together, are linearly independent of each other; that ensures the dimension of the sum space is at least ##n_1+n_2##. And it cannot be bigger than ##n_1+n_2##, because by the definition of the subspace sum every vector of ##\mathbb{V}## can be represented by equation (1), so there cannot be an additional vector, linearly independent of all the others, that is needed in the component representation of some vector ##|V\rangle##.
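
Here is a small numerical sanity check of this counting argument in Python/NumPy; the particular embedding of ##\mathbb{V}_1## and ##\mathbb{V}_2## inside ##\mathbb{R}^5## is just an illustrative assumption.

```python
import numpy as np

# Toy check of the counting argument. Assume (purely for illustration) that
# V1 and V2 sit inside a common space R^5: V1 spanned by the first two
# standard basis vectors, V2 by the remaining three, so every vector of V1
# is orthogonal to every vector of V2.
n1, n2, n = 2, 3, 5
U = np.eye(n)[:, :n1]          # orthonormal basis of V1 (columns), shape (5, 2)
W = np.eye(n)[:, n1:]          # orthonormal basis of V2 (columns), shape (5, 3)

combined = np.hstack([U, W])   # all n1 + n2 basis vectors side by side
print(np.linalg.matrix_rank(combined))   # 5, i.e. n1 + n2 independent vectors
```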

To show the linear independence of the two orthonormal bases taken together (they have the additional property that every element of one is orthogonal to every element of the other), we write their linear combination and show that it must be trivial:
##\sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle = 0##
Applying the inner product with each ##\langle u_i |## and each ##\langle w_j |## to both sides, we correspondingly get
##|v_i\rangle = 0## and ##|v_j\rangle = 0##: for elements of the same subspace because all the other vectors are orthogonal to the chosen one, being members of an orthonormal basis, and for elements of the foreign subspace because all of its vectors are orthogonal to the chosen one by the assumed property of ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2##.
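
Written out explicitly (using the orthonormality within each basis and the orthogonality between the subspaces), applying ##\langle u_k |## to the combination gives
$$\langle u_k | \Big( \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle \Big) = \sum_{i=1}^{n_1} v_i \,\delta_{ki} + \sum_{j=1}^{n_2} v_j \cdot 0 = v_k = 0,$$
and applying ##\langle w_k |## does the same for the coefficients of the ##|w_j\rangle## terms.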

Thus the dimension of the sum of subspaces is exactly ##n_1+n_2##.

This proof seems sound to me, and I hope it is. But aside from checking it, I am interested in the hint about the triangle inequality: how does one use it in this case to prove this fact?
 
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.
 
Virgileo said:
##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
 
Office_Shredder said:
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.

Yeah, they are indeed just numbers. About the triangle inequality: maybe it has something to do with the fact that if we take a vector whose components in some orthonormal basis are all ones,
$$\begin{bmatrix}
1 \\
1 \\
\vdots \\
1
\end{bmatrix},$$
then the inner product of this vector with itself yields the dimension of the space. I was thinking of taking the vectors ##|V_1\rangle## and ##|V_2\rangle## to be such vectors and then looking at the inner product of their sum with itself, but after playing with it, it still led me nowhere...
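
Written out, for such a vector in an ##n##-dimensional space the definition of the inner product gives
$$\langle V | V \rangle = \sum_{i=1}^{n} 1 \cdot 1 = n,$$
which is indeed the dimension of the space.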

Anyway, thanks for checking.

vela said:
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
Yeah, you are right; I was being lazy with this notation...
 
Virgileo said:
This proof seems sound to me, and I hope it is such. But aside from checking it I am interested in the hint about triangle inequality - how does one use it in this case to prove this fact?
I took a look at the second edition of Shankar that I found online. I think the hint was just a typo; it should have referred to Theorem 4, which has to do with the dimension of a vector space.
 
