Intersection of two subspaces

  • Thread starter Kernul
  • #1
Kernul

Homework Statement


I have this exercise that asks me to determine a basis and the dimension of the subspaces ##U \cap Ker(f)## and ##U + Ker(f)## of ##\mathbb {R}^4##, knowing that:
##U = <\begin{pmatrix}
-10 \\
11 \\
2 \\
9
\end{pmatrix}
\begin{pmatrix}
1 \\
1 \\
1 \\
3
\end{pmatrix}
\begin{pmatrix}
14 \\
-7 \\
2 \\
21
\end{pmatrix}
\begin{pmatrix}
11 \\
-10 \\
-1 \\
12
\end{pmatrix}>##
##B_{Ker(f)} = \left[\begin{pmatrix}
1/2 \\
0 \\
0 \\
1
\end{pmatrix}
\begin{pmatrix}
0 \\
0 \\
1 \\
0
\end{pmatrix}
\begin{pmatrix}
-3/2 \\
1 \\
0 \\
0
\end{pmatrix} \right]## and ##dim(Ker(f)) = 3##

Homework Equations




The Attempt at a Solution


Now, I see that the dimension of ##U## is ##4##.
I now proceed by writing a generic vector ##\vec v \in Ker(f)##, so I have
##\vec v = \alpha \begin{pmatrix}
1/2 \\
0 \\
0 \\
1
\end{pmatrix} + \beta \begin{pmatrix}
0 \\
0 \\
1 \\
0
\end{pmatrix} + \gamma \begin{pmatrix}
-3/2 \\
1 \\
0 \\
0
\end{pmatrix} = \begin{pmatrix}
1/2 \alpha - 3/2 \gamma \\
\gamma \\
\beta \\
\alpha
\end{pmatrix}##

How should I proceed now? Do I have to take one of the vectors of ##U## and match it with the one just found for ##Ker(f)##?
What I mean is:
I take the first vector of ##U## and compute ##-10(1/2 \alpha - 3/2 \gamma) + 11(\gamma) + 2(\beta) + 9(\alpha) = 0##, from which I find ##\beta = -2\alpha - 13\gamma##. Substituting this back into the vector found before for ##Ker(f)##, I find that the dimension of the intersection is ##2##.
It's just that afterwards, when I use Sarrus' rule to find the dimension of the sum, I end up with ##5## as the dimension, and so ##U + Ker(f) \subseteq \mathbb {R}^5##. Is it normal that the sum is this big, so that we get ##\mathbb {R}^5## even though we started in ##\mathbb {R}^4##? It seems weird to me.
 

Answers and Replies

  • #2
andrewkirk
Science Advisor
Homework Helper
Insights Author
Gold Member
Now, I see that the dimension of U is 4.
How do you know? The dimension is only 4 if the four vectors given are linearly independent (LI). So the first step is to find out if they are. One way to do that is to find the determinant of the 4 x 4 matrix they make. They are LI iff that determinant is nonzero.

If they are LI then ##U=\mathbb R^4## so the problem becomes trivial. Hence I doubt they will be LI.

Note that a subspace of a vector space with finite dimension n cannot have dimension greater than n, and if the dimensions are equal then the subspace is the whole space.
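The determinant/rank test described above can be sketched numerically. This is a hypothetical numpy check, not part of the original thread:

```python
import numpy as np

# The four given generators of U, as the columns of a 4 x 4 matrix.
U = np.array([[-10,  1, 14,  11],
              [ 11,  1, -7, -10],
              [  2,  1,  2,  -1],
              [  9,  3, 21,  12]], dtype=float)

det = np.linalg.det(U)           # zero iff the columns are linearly dependent
rank = np.linalg.matrix_rank(U)  # number of linearly independent generators
print(det, rank)
```

Here the determinant comes out (numerically) zero, so the four generators are dependent and ##dim(U) < 4##, consistent with the rank of 3 found later in the thread.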
 
  • #3
Kernul
The dimension is only 4 if the four vectors given are linearly independent (LI).
Oh! So the dimension is only the number of linearly independent vectors? Then the dimension is ##3##! And so I reach the conclusion that the sum is ##\mathbb {R}^4##!
But is the part where I find the vector ##\vec v \in U## correct?
 
  • #4
Oh! So the dimension is only the number of linearly independent vectors?
Please be more specific. The dimension of what? U?
Kernul said:
Then the dimension is ##3##! And so I reach the conclusion that the sum is ##\mathbb {R}^4##!
But is the part where I find the vector ##\vec v \in U## correct?
Are you supposed to find the sum U + B, or the direct sum U ⊕ B? Here's a page that discusses the differences.
 
  • #5
andrewkirk
so I reach the conclusion that the sum is ##\mathbb {R}^4##!
Not necessarily. You need to rule out the possibility that ##Ker(f)\subseteq U##. If that is the case then the sum will simply be ##U##, and will have dimension 3.
But is the part where I find the vector ##\vec v \in U## correct?
I don't think the approach you are taking will work, but I don't fully understand what you are trying to do, so I can't be sure. Note that it is possible for none of the three generators of ##Ker(f)## to lie in ##U## but nevertheless for the intersection to be nontrivial.

I expect there will be a slick, standard way of solving this problem. If you haven't been given that, you will need to work from first principles.

If you transpose the U matrix and then row reduce it to get matrix M whose bottom row will be zero (based on the fact that you've already discovered it has rank 3), what can we say about the rowspace of M? If you then replace the bottom (zero) row of that matrix by each of the transposed column vectors of ##B_{Ker(f)}## in turn you will get 4 x 4 matrices M1, M2, M3.

What do the ranks of those three matrices tell us about the ranks of ##U\cap B_{Ker(f)}## and ##U+ B_{Ker(f)}##?

If the rank of the intersection is 3, what can we say about it? What about if the rank of the sum is 3? What about if the rank of the sum is 4?
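An equivalent numerical check of the row-replacement step (a numpy sketch, an assumption rather than code from the thread): stacking the generators of ##U## with each kernel basis vector and comparing ranks gives the same information as the matrices M1, M2, M3.

```python
import numpy as np

# Generators of U as rows.
U_rows = np.array([[-10,  11,  2,  9],
                   [  1,   1,  1,  3],
                   [ 14,  -7,  2, 21],
                   [ 11, -10, -1, 12]], dtype=float)

# Basis of Ker(f) as rows.
K_rows = np.array([[ 0.5, 0, 0, 1],
                   [ 0.0, 0, 1, 0],
                   [-1.5, 1, 0, 0]], dtype=float)

# For each kernel vector: the stacked rank exceeds rank(U_rows)
# exactly when that vector does NOT lie in U.
ranks = [np.linalg.matrix_rank(np.vstack([U_rows, k])) for k in K_rows]
print(ranks)
```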
 
  • #6
Kernul
Please be more specific. The dimension of what? U?
Yes, of ##U##.
Are you supposed to find the sum U + B, or the direct sum U ⊕ B?
No, the normal sum ##U + B##.
 
  • #7
Kernul
If you transpose the U matrix and then row reduce it to get matrix M whose bottom row will be zero (based on the fact that you've already discovered it has rank 3), what can we say about the rowspace of M? If you then replace the bottom (zero) row of that matrix by each of the transposed column vectors of ##B_{Ker(f)}## in turn you will get 4 x 4 matrices M1, M2, M3.
I correct myself. The rank is ##2##, not ##3##. I didn't see it was possible to reduce it more. This is the matrix I end up with:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{pmatrix}##
Should I still just replace the last row with a transposed column of ##B_{Ker(f)}##? Or take two at a time until I have three combinations?

What do the ranks of those three matrices tell us about the ranks of ##U\cap B_{Ker(f)}## and ##U + B_{Ker(f)}##?
It's a way to see if the vectors of ##U## and the vectors of ##Ker(f)## are linearly independent, right? If they are, then the rank would be ##3## (in this case, since I found out that the rank of ##U## is ##2##). If they are linearly dependent, it means that the vector is not part of it, and so you have rank ##2##. Then you count the matrices that have rank ##3## and that gives you the total rank of the intersection, and so the dimension, which would be the same as the rank. For the sum, I don't quite get it. I've always used the Grassmann formula for calculating it: ##dim(U + Ker(f)) = dim(U) + dim(Ker(f)) - dim(U \cap Ker(f))##

If the rank of the intersection is 3, what can we say about it? What about if the rank of the sum is 3? What about if the rank of the sum is 4?
If the rank of one of the two is ##3##, it means that the other one has to be ##1##. If the rank of the sum is ##4##, it means that it's a direct sum, right?
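The Grassmann formula quoted above can be checked numerically with the vectors given in the problem; a numpy sketch (not from the thread):

```python
import numpy as np

U_rows = np.array([[-10,  11,  2,  9],
                   [  1,   1,  1,  3],
                   [ 14,  -7,  2, 21],
                   [ 11, -10, -1, 12]], dtype=float)
K_rows = np.array([[ 0.5, 0, 0, 1],
                   [ 0.0, 0, 1, 0],
                   [-1.5, 1, 0, 0]], dtype=float)

dim_U   = np.linalg.matrix_rank(U_rows)                       # dim(U)
dim_K   = np.linalg.matrix_rank(K_rows)                       # dim(Ker(f))
dim_sum = np.linalg.matrix_rank(np.vstack([U_rows, K_rows]))  # dim(U + Ker(f))

# Grassmann: dim(U ∩ Ker(f)) = dim(U) + dim(Ker(f)) - dim(U + Ker(f))
dim_int = dim_U + dim_K - dim_sum
print(dim_U, dim_K, dim_sum, dim_int)
```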
 
  • #8
vela
Staff Emeritus
Science Advisor
Homework Helper
Education Advisor
I correct myself. The rank is ##2##, not ##3##. I didn't see it was possible to reduce it more. This is the matrix I end up with:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{pmatrix}##
You had it right the first time, I believe. I used Mathematica to row-reduce the matrix for U and found it was rank 3.
 
  • #9
andrewkirk
I agree with @vela. I had a calc in R that gave the same answer:
Code:
A=matrix(c(-10,11,2,9,1,1,1,3,14,-7,2,21,11,-10,-1,12),nrow=4,ncol=4)
qr(A)$rank
Given that the rank is 3, that makes the second part of the problem - the sum of the subspaces - quite easy, using the approach discussed.

However:
It's a way to see if the vectors of ##U## and the vectors of ##Ker(f)## are linearly independent, right? If they are, then the rank would be ##3##(in this case, since I found out that the rank of ##U## is ##2##). If they are linearly dependent, it means that the vector is not part of it, and so you have rank ##2##. Then you count the matrices that have rank ##3## and it gives you the total rank of the intersection, and so the dimension, which would be the same as the rank.
This doesn't look quite right. You've got the general idea, but a few things seem to be the wrong way around. Try rewriting it carefully using a corrected row reduction of the transpose - which should only have one zero row, and I think you'll get there.
 
  • #10
Kernul
You had it right the first time, I believe. I used Mathematica to row-reduce the matrix for U and found it was rank 3.
Yeah, the second time I did it I put ##-9## instead of ##9##. Now I get ##3## too.

This doesn't look quite right. You've got the general idea, but a few things seem to be the wrong way around. Try rewriting it carefully using a corrected row reduction of the transpose - which should only have one zero row, and I think you'll get there.
Okay, since the row-reduced matrix of ##U## is the following:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
0 & 0 & 0 & 0 \\
\end{pmatrix}##
I have to replace the last row with a transpose of one of the column of ##Ker(f)##, so:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
\frac{1}{2} & 0 & 0 & 1 \\
\end{pmatrix}##
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
0 & 0 & 1 & 0 \\
\end{pmatrix}##
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
-\frac{3}{2} & 1 & 0 & 0 \\
\end{pmatrix}##
All three matrices have rank ##4##, so the dimension of the intersection is ##3## because all the rows are linearly independent for each matrix, right?
After this, I can simply use Grassmann for the sum and have:
##dim(U + Ker(f)) = 3 + 3 - 3 = 3##, and so the basis is in ##\mathbb {R}^3##, right?
 
  • #11
andrewkirk
All three matrices have rank 4, so the dimension of the intersection is 3 because all the rows are linearly independent for each matrix, right?
The justification for that argument is not apparent to me. Can you elaborate?

I approach it by considering the sum of the spaces first. The dimension of the sum must be at least that of the rank of each of the three matrices. If that's 4, what else can we say about the sum?
 
  • #12
Kernul
Hmm... I think I got it wrong then.
The three matrices have rank ##4##. Finding the rank is basically finding out whether the vectors of ##Ker(f)##, together with the vectors of ##U##, are linearly independent. Now that I've found that they are, what does that tell me? I think this is what I don't get.

I approach it by considering the sum of the spaces first. The dimension of the sum must be at least that of the rank of each of the three matrices. If that's 4, what else can we say about the sum?
You're saying that you don't use Grassmann, right? Which means that I have to take ##<U \cup Ker(f)>## and see if the vectors are linearly independent again, right? I don't know what we can say if it's ##4##.
 
  • #13
andrewkirk
I don't know what we can say if it's 4.
If ##W## is a subspace of ##V## and they have the same finite dimension, then ##W=V##. This is sometimes given as a theorem, but it's pretty easy to prove.

Now let ##W## be the rowspace of any of those three matrices. Can you show that it is contained in ##U+Ker(f)##? If so then, given the above, what space is ##U+Ker(f)##?
You're saying that you don't use Grassmann, right?
I would use that formula to get the dimension of ##U\cap Ker(f)##, once I had determined the dimension of ##U+Ker(f)##.
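For completeness, a first-principles way to extract an actual basis of the intersection (a numpy sketch using an SVD-based null space; none of this code is from the thread): a vector lies in ##U \cap Ker(f)## exactly when it can be written both as a combination of the generators of ##U## and of the kernel basis, i.e. when the coefficient pair ##(x, y)## is in the null space of the block matrix ##[U_{cols} \mid -K_{cols}]##.

```python
import numpy as np

def nullspace(A, tol=1e-10):
    # Orthonormal basis of the null space of A, read off from the SVD.
    _, s, vt = np.linalg.svd(A)
    r = int((s > tol).sum())
    return vt[r:].T

# Generators of U and the basis of Ker(f), as columns.
U_cols = np.array([[-10,  1, 14,  11],
                   [ 11,  1, -7, -10],
                   [  2,  1,  2,  -1],
                   [  9,  3, 21,  12]], dtype=float)
K_cols = np.array([[ 0.5, 0, -1.5],
                   [ 0.0, 0,  1.0],
                   [ 0.0, 1,  0.0],
                   [ 1.0, 0,  0.0]], dtype=float)

# (x, y) in the null space of [U_cols | -K_cols]  <=>  U_cols @ x == K_cols @ y,
# and that common vector lies in both subspaces.
N = nullspace(np.hstack([U_cols, -K_cols]))
vectors = U_cols @ N[:4]                  # these images span U ∩ Ker(f)
dim_int = np.linalg.matrix_rank(vectors)
print(dim_int)
```

Any maximal independent subset of the columns of `vectors` is then a basis of the intersection, and its dimension agrees with the Grassmann formula once ##dim(U + Ker(f))## is known.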
 
  • #14
Kernul
Now let ##W## be the rowspace of any of those three matrices.
Do you mean ##Ker(f)## by ##W##? And by "let ##W## be the rowspace" you mean to put each vector as a row, replacing the ##0## row, right?
 
