Intersection of Subspaces in R^4: Finding Bases and Dimensions

In summary: the exercise is to determine a basis and the dimension of the subspaces ##U \cap Ker(f)## and ##U + Ker(f)## of ##\mathbb {R}^4##.
  • #1
Kernul

Homework Statement


I have this exercise that tells me to determine a basis and the dimension of the subspaces of ##\mathbb {R}^4##, ##U \cap Ker(f)## and ##U + Ker(f)##, knowing that:
##U = <\begin{pmatrix}
-10 \\
11 \\
2 \\
9
\end{pmatrix}
\begin{pmatrix}
1 \\
1 \\
1 \\
3
\end{pmatrix}
\begin{pmatrix}
14 \\
-7 \\
2 \\
21
\end{pmatrix}
\begin{pmatrix}
11 \\
-10 \\
-1 \\
12
\end{pmatrix}>##
##B_{Ker(f)} = \left[\begin{pmatrix}
1/2 \\
0 \\
0 \\
1
\end{pmatrix}
\begin{pmatrix}
0 \\
0 \\
1 \\
0
\end{pmatrix}
\begin{pmatrix}
-3/2 \\
1 \\
0 \\
0
\end{pmatrix} \right]## and ##dim(Ker(f)) = 3##

Homework Equations

The Attempt at a Solution


Now, I see that the dimension of ##U## is ##4##.
I now proceed by writing a generic vector ##\vec v \in Ker(f)##, so I will have
##\vec v = \alpha \begin{pmatrix}
1/2 \\
0 \\
0 \\
1
\end{pmatrix} + \beta \begin{pmatrix}
0 \\
0 \\
1 \\
0
\end{pmatrix} + \gamma \begin{pmatrix}
-3/2 \\
1 \\
0 \\
0
\end{pmatrix} = \begin{pmatrix}
1/2 \alpha - 3/2 \gamma \\
\gamma \\
\beta \\
\alpha
\end{pmatrix}##

How should I proceed now? Do I have to take one of the vectors of ##U## and match it with the one just found for ##Ker(f)##?
What I mean is:
I take the first vector of ##U## and set ##-10(1/2 \alpha - 3/2 \gamma) + 11(\gamma) + 2(\beta) + 9(\alpha) = 0##; continuing, I find that ##\beta = -2\alpha - 13\gamma##, and substituting this into the vector found before for ##Ker(f)## I get that the dimension of the intersection is ##2##.
The problem is that afterwards, when I use Sarrus' rule to find the dimension of the sum, I end up with ##5## as the dimension, so that ##U + Ker(f)## would sit in ##\mathbb {R}^5##. Is it normal that the sum is this big, so that we reach ##\mathbb {R}^5## even though we started in ##\mathbb {R}^4##? It seems weird to me.
 
  • #2
Kernul said:
Now, I see that the dimension of U is 4.
How do you know? The dimension is only 4 if the four vectors given are linearly independent (LI). So the first step is to find out if they are. One way to do that is to find the determinant of the 4 x 4 matrix they make. They are LI iff that determinant is nonzero.

If they are LI then ##U=\mathbb R^4## so the problem becomes trivial. Hence I doubt they will be LI.

Note that a subspace of a vector space with finite dimension n cannot have dimension greater than n, and if the dimensions are equal then the subspace is the whole space.
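As a side note, this determinant check is easy to run numerically. Here is a minimal sketch in R (the generators of ##U## from the problem statement typed in as the columns of a matrix; the variable name is just a placeholder):
Code:
# the four generators of U as the columns of a 4 x 4 matrix
U <- matrix(c(-10, 11, 2, 9,  1, 1, 1, 3,  14, -7, 2, 21,  11, -10, -1, 12), nrow = 4, ncol = 4)
det(U)  # a value that is nonzero (beyond rounding error) would mean the four columns are LI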
 
  • #3
andrewkirk said:
The dimension is only 4 if the four vectors given are linearly independent (LI).
Oh! So the dimension is only the number of linearly independent vectors? Then the dimension is ##3##! And so I reach the conclusion that the sum is ##\mathbb {R}^4##!
But is the part where I find the vector ##\vec v \in U## correct?
 
  • #4
Kernul said:
Oh! So the dimension is only the number of linearly independent vectors?
Please be more specific. The dimension of what? U?
Kernul said:
Then the dimension is ##3##! And so I reach the conclusion that the sum is ##\mathbb {R}^4##!
But is the part where I find the vector ##\vec v \in U## correct?
Are you supposed to find the sum U + B, or the direct sum U ⊕ B? Here's a page that discusses the differences.
 
  • #5
Kernul said:
so I reach the conclusion that the sum is ##\mathbb {R}^4##!
Not necessarily. You need to rule out the possibility that ##Ker(f)\subseteq U##. If that is the case then the sum will simply be ##U##, and will have dimension 3.
Kernul said:
But is the part where I find the vector ##\vec v \in U## correct?
I don't think the approach you are taking will work, but I don't fully understand what you are trying to do, so I can't be sure. Note that it is possible for none of the three generators of ##Ker(f)## to lie in ##U## but nevertheless for the intersection to be nontrivial.

I expect there will be a slick, standard way of solving this problem. If you haven't been given that, you will need to work from first principles.

If you transpose the U matrix and then row reduce it to get matrix M whose bottom row will be zero (based on the fact that you've already discovered it has rank 3), what can we say about the rowspace of M? If you then replace the bottom (zero) row of that matrix by each of the transposed column vectors of ##B_{Ker(f)}## in turn you will get 4 x 4 matrices M1, M2, M3.

What do the ranks of those three matrices tell us about the ranks of ##U\cap B_{Ker(f)}## and ##U+ B_{Ker(f)}##?

If the rank of the intersection is 3, what can we say about it? What about if the rank of the sum is 3? What about if the rank of the sum is 4?
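If it helps, this rank test is easy to run numerically. Here is a rough R sketch of an equivalent column-wise version (append one kernel basis vector to the generators of ##U## and see whether the rank increases; `U` and `K` are placeholder names for the data given in the problem):
Code:
U <- matrix(c(-10, 11, 2, 9,  1, 1, 1, 3,  14, -7, 2, 21,  11, -10, -1, 12), nrow = 4)
K <- matrix(c(1/2, 0, 0, 1,  0, 0, 1, 0,  -3/2, 1, 0, 0), nrow = 4)  # basis of Ker(f) as columns
qr(U)$rank                                          # rank of the generators of U alone
sapply(1:3, function(i) qr(cbind(U, K[, i]))$rank)  # rank with each kernel vector appended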
 
  • #6
Mark44 said:
Please be more specific. The dimension of what? U?
Yes, of ##U##.
Mark44 said:
Are you supposed to find the sum U + B, or the direct sum U ⊕ B?
The normal sum ##U + B##, not the direct sum.
 
  • #7
andrewkirk said:
If you transpose the U matrix and then row reduce it to get matrix M whose bottom row will be zero (based on the fact that you've already discovered it has rank 3), what can we say about the rowspace of M? If you then replace the bottom (zero) row of that matrix by each of the transposed column vectors of ##B_{Ker(f)}## in turn you will get 4 x 4 matrices M1, M2, M3.
I correct myself: the rank is ##2##, not ##3##. I didn't see that it could be reduced further. This is the matrix I end up with:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{pmatrix}##
Should I still just replace the last row with a transposed column of ##B_{Ker(f)}##? Or take two at a time until I have three combinations?

andrewkirk said:
What do the ranks of those three matrices tell us about the ranks of ##U\cap B_{Ker(f)}## and ##U + B_{Ker(f)}##?
It's a way to see if the vectors of ##U## and the vectors of ##Ker(f)## are linearly independent, right? If they are, then the rank would be ##3## (in this case, since I found out that the rank of ##U## is ##2##). If they are linearly dependent, it means that the vector is not part of it, and so you have rank ##2##. Then you count the matrices that have rank ##3## and that gives you the total rank of the intersection, and so the dimension, which would be the same as the rank. For the sum, I don't quite get it. I've always used the Grassmann formula to calculate it: ##dim(U + Ker(f)) = dim(U) + dim(Ker(f)) - dim(U \cap Ker(f))##
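As an aside, the formula is easy to check numerically once both sets of generators are stacked into a single matrix. A small R sketch, with the same placeholder names as before (`U` for the generators of ##U## as columns, `K` for the given basis of ##Ker(f)##):
Code:
U <- matrix(c(-10, 11, 2, 9,  1, 1, 1, 3,  14, -7, 2, 21,  11, -10, -1, 12), nrow = 4)
K <- matrix(c(1/2, 0, 0, 1,  0, 0, 1, 0,  -3/2, 1, 0, 0), nrow = 4)
dim_sum <- qr(cbind(U, K))$rank               # dim(U + Ker(f)): rank of all the generators together
dim_int <- qr(U)$rank + qr(K)$rank - dim_sum  # Grassmann formula gives dim of the intersection
c(dim_sum, dim_int)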

andrewkirk said:
If the rank of the intersection is 3, what can we say about it? What about if the rank of the sum is 3? What about if the rank of the sum is 4?
If the rank of one of the two is ##3##, it means that the other one has to be ##1##. If the rank of the sum is ##4##, it means that it's a direct sum, right?
 
  • #8
Kernul said:
I correct myself: the rank is ##2##, not ##3##. I didn't see that it could be reduced further. This is the matrix I end up with:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{pmatrix}##
You had it right the first time, I believe. I used Mathematica to row-reduce the matrix for U and found it was rank 3.
 
  • #9
I agree with @vela. I had a calc in R that gave the same answer:
Code:
# the four generators of U entered as the columns of a 4 x 4 matrix
A <- matrix(c(-10,11,2,9, 1,1,1,3, 14,-7,2,21, 11,-10,-1,12), nrow = 4, ncol = 4)
qr(A)$rank
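(With its default settings, `qr()$rank` reports the rank estimated from the QR decomposition, which is a more robust numerical check here than testing whether `det(A)` comes out exactly zero in floating point.)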
Given that the rank is 3, that makes the second part of the problem - the sum of the subspaces - quite easy, using the approach discussed.

However:
Kernul said:
It's a way to see if the vectors of ##U## and the vectors of ##Ker(f)## are linearly independent, right? If they are, then the rank would be ##3## (in this case, since I found out that the rank of ##U## is ##2##). If they are linearly dependent, it means that the vector is not part of it, and so you have rank ##2##. Then you count the matrices that have rank ##3## and that gives you the total rank of the intersection, and so the dimension, which would be the same as the rank.
This doesn't look quite right. You've got the general idea, but a few things seem to be the wrong way around. Try rewriting it carefully using a corrected row reduction of the transpose - which should only have one zero row, and I think you'll get there.
 
  • #10
vela said:
You had it right the first time, I believe. I used Mathematica to row-reduce the matrix for U and found it was rank 3.
Yeah, the second time I did it I put ##-9## instead of ##9##. Now I get ##3## too.

andrewkirk said:
This doesn't look quite right. You've got the general idea, but a few things seem to be the wrong way around. Try rewriting it carefully using a corrected row reduction of the transpose - which should only have one zero row, and I think you'll get there.
Okay, since the row-reduced matrix of ##U## is the following:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
0 & 0 & 0 & 0 \\
\end{pmatrix}##
I have to replace the last row with the transpose of one of the columns of ##B_{Ker(f)}##, so:
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
\frac{1}{2} & 0 & 0 & 1 \\
\end{pmatrix}##
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
0 & 0 & 1 & 0 \\
\end{pmatrix}##
##\begin{pmatrix}
-10 & 1 & 14 & 11 \\
0 & \frac{21}{10} & \frac{42}{5} & \frac{21}{10} \\
0 & 0 & 18 & 18 \\
-\frac{3}{2} & 1 & 0 & 0 \\
\end{pmatrix}##
All three matrices have rank ##4##, so the dimension of the intersection is ##3##, because all the rows are linearly independent in each matrix, right?
After this, I can simply use the Grassmann formula for the sum and get:
##dim(U + Ker(f)) = 3 + 3 - 3 = 3## and so the basis is in ##\mathbb {R}^3##, right?
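Just as a numerical sanity check of those three ranks, a quick R sketch (the rows typed in from the matrices above; `rref_rows` and `kernel_rows` are placeholder names):
Code:
rref_rows <- rbind(c(-10, 1, 14, 11),
                   c(0, 21/10, 42/5, 21/10),
                   c(0, 0, 18, 18))
kernel_rows <- rbind(c(1/2, 0, 0, 1), c(0, 0, 1, 0), c(-3/2, 1, 0, 0))
# rank of each M_i: the three nonzero row-reduced rows plus one kernel vector as the fourth row
sapply(1:3, function(i) qr(rbind(rref_rows, kernel_rows[i, ]))$rank)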
 
  • #11
Kernul said:
All three matrices have rank 4, so the dimension of the intersection is 3, because all the rows are linearly independent in each matrix, right?
The justification for that argument is not apparent to me. Can you elaborate?

I approach it by considering the sum of the spaces first. The dimension of the sum must be at least that of the rank of each of the three matrices. If that's 4, what else can we say about the sum?
 
  • #12
Hmm... I think I got it wrong then.
The three matrices have rank ##4##. Finding the rank is basically finding out whether the vectors of ##Ker(f)##, together with the vectors of ##U##, are linearly independent. Now that I've found out that they are, what does that tell me? I think this is what I don't get.

andrewkirk said:
I approach it by considering the sum of the spaces first. The dimension of the sum must be at least that of the rank of each of the three matrices. If that's 4, what else can we say about the sum?
You're saying that you don't use the Grassmann formula, right? Which means that I have to take ##<U \cup Ker(f)>## and see if the vectors are linearly independent again, right? I don't know what we can say if it's ##4##.
 
  • #13
Kernul said:
I don't know what we can say if it's 4.
If ##W## is a subspace of ##V## and they have the same finite dimension, then ##W=V##. This is sometimes given as a theorem, but it's pretty easy to prove.
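(One standard way to see it, sketched briefly: a basis ##\{w_1, \dots, w_n\}## of ##W## with ##n = \dim V## is a linearly independent set of ##n## vectors in ##V##, and a linearly independent set whose size equals ##\dim V## automatically spans ##V##; hence ##V## is spanned by ##w_1, \dots, w_n## and ##V = W##.)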

Now let ##W## be the rowspace of any of those three matrices. Can you show that it is contained in ##U+Ker(f)##? If so then, given the above, what space is ##U+Ker(f)##?
Kernul said:
You're saying that you don't use the Grassmann formula, right?
I would use that formula to get the dimension of ##U\cap Ker(f)##, once I had determined the dimension of ##U+Ker(f)##.
 
  • #14
andrewkirk said:
Now let ##W## be the rowspace of any of those three matrices.
Do you mean ##Ker(f)## by ##W##? And by "let ##W## be the rowspace" do you mean to put each vector as a row, replacing the ##0## row, right?
 

1. What is the definition of the intersection of two subspaces?

The intersection of two subspaces is the set of all elements that are common to both subspaces. In other words, it is the set of all vectors that belong to both subspaces.

2. How is the intersection of two subspaces represented mathematically?

The intersection of subspaces ##U## and ##W## is written ##U \cap W = \{\vec v : \vec v \in U \text{ and } \vec v \in W\}##. A basis for it can be found by determining which vectors can be written both as a linear combination of a basis of ##U## and as a linear combination of a basis of ##W##.

3. What is the dimension of the intersection of two subspaces?

The dimension of the intersection of two subspaces is at most the minimum of the two dimensions. More precisely, by the Grassmann formula, ##dim(U \cap W) = dim(U) + dim(W) - dim(U + W)##.

4. Can the intersection of two subspaces be empty?

No. Every subspace contains the zero vector, so the intersection of two subspaces always contains at least ##\vec 0## and is never empty. It can, however, be the trivial subspace ##\{\vec 0\}##, which happens when the two subspaces have no nonzero vector in common.

5. How is the intersection of two subspaces useful in linear algebra?

The intersection of two subspaces is useful in determining the relationship between them: for example, whether one subspace is contained in the other, or whether the two subspaces share only the zero vector (so that their sum is a direct sum). It is also useful in solving systems of linear equations.
