Dimension of the sum of orthogonal subspaces

Virgileo
Homework Statement
Suppose ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2## are two subspaces such that any element of ##\mathbb{V}_1## is orthogonal to any element of ##\mathbb{V}_2##. Show that the dimensionality of ##\mathbb{V}_1 \oplus \mathbb{V}_2## is ##n_1+n_2##. (Hint: the triangle inequality).
Relevant Equations
The triangle inequality:
$$|V + W| \leq |V| + |W|$$

Inner product and the norm:
$$\langle V | W \rangle = \sum_{i=1}^n v_i^* w_i,$$ where ##v_i## and ##w_i## are the components of the vectors ##V## and ##W## in some orthonormal basis ##\{| u_i \rangle\}##.
$$|V|^2 = \langle V | V \rangle$$

This question is from Shankar's quantum mechanics book. He uses Dirac notation from the beginning to present the linear algebra needed for the course, so I will use that notation too.
##| V_1 \rangle \in \mathbb{V}^{n_1}_1## and there is an orthonormal basis in ##\mathbb{V}^{n_1}_1##: ##|u_1\rangle, |u_2\rangle ... |u_{n_1}\rangle##
##| V_2 \rangle \in \mathbb{V}^{n_2}_2## and there is an orthonormal basis in ##\mathbb{V}^{n_2}_2##: ##|w_1\rangle, |w_2\rangle ... |w_{n_2}\rangle##
The existence of such orthonormal bases is ensured by the Gram-Schmidt theorem.
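(As a side note, here is a minimal numerical sketch of the Gram-Schmidt procedure, real case only, assuming the input vectors are linearly independent; the function name and the example are my own:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent real vectors."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the vectors already orthonormalized.
        w = v - sum(np.dot(u, v) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

# Example: two independent vectors in R^2.
print(gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])]))
```
)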

##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
##|V\rangle \in \mathbb{V} = \mathbb{V}_1 \oplus \mathbb{V}_2##
##|V\rangle = |V_1\rangle + |V_2\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle## (1)

Each sum in equation (1) is formed from linearly independent vectors (they are multiples of basis vectors). We need to show that the two sets of basis vectors are also linearly independent of each other - that ensures the dimension of the sum space is at least ##n_1+n_2##. And it can't be bigger than ##n_1+n_2##: by the definition of the subspace sum, every vector of ##\mathbb{V}## can be represented as in equation (1), so there can't be an additional vector, linearly independent of all the others, appearing in the component representation of some vector ##|V\rangle##.
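(Not a proof, but a quick numerical sanity check of the claim in a hypothetical concrete case of my own choosing, using numpy:

```python
import numpy as np

# Hypothetical concrete case in R^4: V1 spanned by e1, e2 and V2 by e3,
# so every element of V1 is orthogonal to every element of V2.
u = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])   # orthonormal basis of V1 (n1 = 2)
w = np.array([[0.0, 0.0, 1.0, 0.0]])   # orthonormal basis of V2 (n2 = 1)

combined = np.vstack([u, w])            # all basis vectors as rows
print(np.linalg.matrix_rank(combined))  # prints 3, i.e. n1 + n2
```
)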

To show that the two orthonormal bases - which also have the property that each element of one is orthogonal to every element of the other - are jointly linearly independent, write a vanishing linear combination and show that it must be trivial:
##\sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle = 0##
Applying the inner product with each ##\langle u_i |## and each ##\langle w_j |## to both sides, we get correspondingly:
##|v_i\rangle = 0## and ##|v_j\rangle = 0## - for elements of the same subspace because all the other vectors are orthogonal as members of an orthonormal basis, and for elements of the other subspace because all of its vectors are orthogonal by the stated property of ##\mathbb{V}^{n_1}_1## and ##\mathbb{V}^{n_2}_2##.
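For concreteness, here is one instance of this step written out, applying ##\langle u_k |## to the left-hand side:
$$\langle u_k | \left( \sum_{i=1}^{n_1} v_i |u_i\rangle + \sum_{j=1}^{n_2} v_j |w_j\rangle \right) = \sum_{i=1}^{n_1} v_i \langle u_k | u_i \rangle + \sum_{j=1}^{n_2} v_j \langle u_k | w_j \rangle = v_k,$$
since ##\langle u_k | u_i \rangle = \delta_{ki}## and ##\langle u_k | w_j \rangle = 0##.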

Thus the dimension of the sum of subspaces is exactly ##n_1+n_2##.

This proof seems sound to me, and I hope it is. But aside from checking it, I am interested in the hint about the triangle inequality - how does one use it in this case to prove this fact?
 
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.
 
Virgileo said:
##|V_1\rangle \in \mathbb{V}_1##, ##|V_1\rangle = \sum_{i=1}^{n_1} v_i |u_i\rangle##
##|V_2\rangle \in \mathbb{V}_2##, ##|V_2\rangle = \sum_{j=1}^{n_2} v_j |w_j\rangle##
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
 
Office_Shredder said:
I'm not super familiar with physics notation here, but I would have thought your final equations are just ##v_j=0## since they are numbers, not vectors. Other than that I think what you did is right here. I don't know what they are talking about with the triangle inequality either.

Yeah, they are indeed just numbers. About the triangle inequality: maybe it has something to do with the fact that if we take a vector whose component representation in an orthonormal basis is all ones:
$$\begin{bmatrix}
1 \\
1 \\
\vdots \\
1
\end{bmatrix}$$
Then the inner product of this vector with itself yields the dimension of the space. I was thinking of taking the vectors ##|V_1\rangle## and ##|V_2\rangle## to be such vectors and then looking at the inner product of their sum with itself, but after playing with it, it still led me nowhere...
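For the record, the furthest that line of thought goes is the cross terms cancelling: for mutually orthogonal ##|V_1\rangle## and ##|V_2\rangle##,
$$|V_1 + V_2|^2 = \langle V_1 + V_2 | V_1 + V_2 \rangle = |V_1|^2 + |V_2|^2,$$
which is Pythagoras rather than the triangle inequality, and by itself it doesn't give the dimension count.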

Anyway, thanks for checking.

vela said:
You should use different coefficients for the expansion of ##\lvert V_2 \rangle##. The coefficient ##v_1## in the expansion of ##\lvert V_1 \rangle## is not necessarily equal to the coefficient ##v_1## in the expansion of ##\lvert V_2 \rangle##, so you shouldn't use the same symbol in both places.
Yeah, you are right - I was being lazy with this notation...
 
Virgileo said:
This proof seems sound to me, and I hope it is. But aside from checking it, I am interested in the hint about the triangle inequality - how does one use it in this case to prove this fact?
I took a look at the second edition of Shankar that I found online. I think the hint was just a typo: it should have referred to theorem 4, which has to do with the dimension of a vector space.
 