geoffrey159
Homework Statement
Let ##E## be a finite dimensional vector space, ##A## and ##B## two subspaces with the same dimension.
Show that there is a subspace ##S## of ##E## such that ##E = A \oplus S = B \oplus S##.
Homework Equations
##\text{dim}(E) = n##
##\text{dim}(A) = \text{dim}(B) = m \le n ##
## {\cal B} = (e_1,...,e_n) ## is a basis of ##E##
## A = \text{span}(e_{i_1},...,e_{i_m}) ##
## B = \text{span}(e_{j_1},...,e_{j_m}) ##
## S_A = \text{span}((e_i)_{i \neq i_1,...,i_m } )##
## S_B = \text{span}((e_i)_{i \neq j_1,...,j_m } )##
## E = A \oplus S_A = B \oplus S_B ##
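To sanity-check this setup numerically, here is a small script (my own illustration: the concrete choices of ##A## and ##B## below are arbitrary, and the rank test is just one way to verify a direct sum):
[CODE]
import numpy as np

# Concrete instance of the setup above: E = R^4 with the standard basis,
# A = span(e_1, e_2), B = span(e_2, e_3), so that
# S_A = span(e_3, e_4) and S_B = span(e_1, e_4).
n = 4
e = np.eye(n)                      # row e[i] is the basis vector e_{i+1}
A,   B   = e[[0, 1]], e[[1, 2]]
S_A, S_B = e[[2, 3]], e[[0, 3]]

def is_direct_sum(U, V):
    # E = U ⊕ V iff the spanning rows of U and V together number
    # exactly n and have rank n (then U + V = E and U ∩ V = {0}).
    M = np.vstack([U, V])
    return M.shape[0] == n and np.linalg.matrix_rank(M) == n

print(is_direct_sum(A, S_A))       # True: E = A ⊕ S_A
print(is_direct_sum(B, S_B))       # True: E = B ⊕ S_B
[/CODE]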
The Attempt at a Solution
I find this exercise hard and I can't finish it. Could you help, please?

The case ##A = B## is easy: ##S = S_A = S_B## works.
Now assume ##A \neq B##. I want to prove the result by downward induction on the common dimension ##m## of ##A## and ##B##.
1 - If ## m = n ## then ## A = B = E ## and ## S = \{0\} ## works
2 - If ##A## and ##B## are hyperplanes (##m = n-1##), then ##S_A = \text{span}(e_k)## and ##S_B = \text{span}(e_\ell)##, with ##k \neq \ell## since ##A \neq B##.
Put ##S = \text{span}(e_k + e_\ell)##.
- For any ##x \in A \cap S##, there are scalars ##(\lambda_i)_{i \neq k}## and ##\mu## such that ##x = \sum_{i \neq k} \lambda_i e_i = \mu (e_k + e_\ell)##, hence ##0 = \mu (e_k + e_\ell) - \sum_{i \neq k} \lambda_i e_i = \mu e_k + (\mu - \lambda_\ell) e_\ell - \sum_{i \neq k,\ell} \lambda_i e_i##. Since ##{\cal B}## is a basis of ##E##, every coefficient vanishes; in particular ##\mu = 0##, so ##x = \mu (e_k + e_\ell) = 0## and ##A \cap S = \{0\}##. The same argument with ##B## shows that ##B \cap S = \{0\}##.
- For any ##x \in E##, there are scalars ##(\lambda_i)_{i = 1,...,n}## such that ##x = \sum_{i=1}^n \lambda_i e_i##. Reordering the terms,
## x = \lambda_k (e_k + e_\ell) + \big( (\lambda_\ell - \lambda_k) e_\ell + \sum_{i \neq k,\ell} \lambda_i e_i \big) \Rightarrow x \in S + A \Rightarrow S + A = E ##
## x = \lambda_\ell (e_k + e_\ell) + \big( (\lambda_k - \lambda_\ell) e_k + \sum_{i \neq k,\ell} \lambda_i e_i \big) \Rightarrow x \in S + B \Rightarrow S + B = E ##
The two points above show that ##E = A \oplus S = B \oplus S##.
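To convince myself, I checked this case numerically in ##\mathbb{R}^3## (my own example; the specific ##A## and ##B## are arbitrary choices):
[CODE]
import numpy as np

# Hyperplane case in R^3: A = span(e_1, e_2) (so k = 3),
# B = span(e_1, e_3) (so l = 2), candidate S = span(e_2 + e_3).
e = np.eye(3)
A, B = e[[0, 1]], e[[0, 2]]
S = (e[1] + e[2]).reshape(1, 3)   # single spanning vector of S

for name, U in [("A", A), ("B", B)]:
    M = np.vstack([U, S])
    # 3 spanning vectors of rank 3  =>  E = U ⊕ S
    print(name, np.linalg.matrix_rank(M) == 3)   # True both times
[/CODE]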
3 - Assume the result holds for ##m = n, n-1, ..., r+1##. I want to show it holds for ##m = r##.
- If ##S_A \cap S_B \neq \{0\}## (as subspaces, ##S_A## and ##S_B## always contain ##0##, so the condition should be "##\neq \{0\}##" rather than "##\neq \emptyset##"), then since ##S_A \cap S_B = \text{span}((e_i)_{i \neq i_1,...,i_m,j_1,...,j_m})##, there is a basis vector ##e_k \in S_A \cap S_B## such that ##S_A = \text{span}(e_k) \oplus S_A'## and ##S_B = \text{span}(e_k) \oplus S_B'##, where ##S_A' = \text{span}((e_i)_{i \neq k,i_1,...,i_m})##
and ##S_B' = \text{span}((e_i)_{i \neq k,j_1,...,j_m})##.
So ##E = (A \oplus \text{span}(e_k)) \oplus S_A' = (B \oplus \text{span}(e_k)) \oplus S_B'##. Both ##A \oplus \text{span}(e_k)## and ##B \oplus \text{span}(e_k)## have dimension ##r+1##, so by the induction hypothesis there is a subspace ##S'## such that:
##E = (A \oplus \text{span}(e_k)) \oplus S' = (B \oplus \text{span}(e_k)) \oplus S'##. So ##S = \text{span}(e_k) \oplus S'## works (a numerical check of this reduction follows after this case).
- If ##S_A \cap S_B = \{0\}## (no basis vector lies in both), I don't know how to finish that part! (One numerical experiment on this case is below.)
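Here is the promised check of the reduction in the first bullet, again in ##\mathbb{R}^4## (my own example):
[CODE]
import numpy as np

# Concrete instance of the reduction in R^4: A = span(e_1, e_2),
# B = span(e_2, e_3), so e_4 lies in both S_A = span(e_3, e_4) and
# S_B = span(e_1, e_4), with S_A' = span(e_3) and S_B' = span(e_1).
e = np.eye(4)
A, B = e[[0, 1]], e[[1, 2]]
ek = e[[3]]                       # the shared basis vector e_4
SA_p, SB_p = e[[2]], e[[0]]       # S_A' and S_B'

# E = (A ⊕ span(e_k)) ⊕ S_A'  and  E = (B ⊕ span(e_k)) ⊕ S_B':
print(np.linalg.matrix_rank(np.vstack([A, ek, SA_p])) == 4)   # True
print(np.linalg.matrix_rank(np.vstack([B, ek, SB_p])) == 4)   # True
# A ⊕ span(e_4) and B ⊕ span(e_4) both have dimension r + 1 = 3,
# so the induction hypothesis applies to this pair.
[/CODE]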
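For what it's worth, I tried one concrete instance of the stuck case numerically (my own experiment; the candidate ##S## just mimics the pairing trick from the hyperplane case, and I don't know whether it generalizes):
[CODE]
import numpy as np

# One instance of the remaining case in R^4: A = span(e_1, e_2),
# B = span(e_3, e_4), so S_A = B, S_B = A and S_A ∩ S_B = {0}.
# Candidate inspired by the hyperplane trick: S = span(e_1 + e_3, e_2 + e_4).
e = np.eye(4)
A, B = e[[0, 1]], e[[2, 3]]
S = np.vstack([e[0] + e[2], e[1] + e[3]])

print(np.linalg.matrix_rank(np.vstack([A, S])) == 4)   # True: E = A ⊕ S
print(np.linalg.matrix_rank(np.vstack([B, S])) == 4)   # True: E = B ⊕ S
[/CODE]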