Proving that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##

Summary
The discussion centers on proving that the vector space V can be expressed as the direct sum of subspaces U_1, U_2, ..., U_k. The initial approach involves demonstrating that any vector v in V can be uniquely represented as a sum of vectors from each subspace. A contradiction is used to show that if two representations exist, they must be identical, confirming uniqueness. The necessity of showing that the intersection of any two distinct subspaces is trivial (only containing the zero vector) is also emphasized, with a focus on using linear independence to support this claim. Overall, the conversation highlights the importance of rigorous proof techniques in linear algebra.
JD_PM
TL;DR
I want to prove the following: Let ##V## be a vector space and ##\beta## a basis for ##V##. Partition ##\beta## into disjoint subsets ##\beta_1, \ldots, \beta_k## and let ##U_i = \text{span}(\beta_i)## for every ##i = 1, \ldots, k##. Then prove that ##V = U_1 \oplus U_2 \oplus \ldots \oplus U_k##.
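For a concrete picture of the setup (a toy example of my own): take ##V = \mathbb{R}^3## with the standard basis ##\beta = \{e_1, e_2, e_3\}##, partitioned as ##\beta_1 = \{e_1\}##, ##\beta_2 = \{e_2, e_3\}##. Then ##U_1## is the ##x##-axis, ##U_2## is the ##yz##-plane, and the claim is that $$\mathbb{R}^3 = U_1 \oplus U_2,$$ i.e. every vector splits uniquely into a vector along the axis plus a vector in the plane.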
Attempt:

Take an arbitrary vector ##v \in V##. Then we have to show that there are unique vectors ##u_1 \in U_1, u_2 \in U_2, \ldots, u_k \in U_k## such that \begin{align*} v = u_1 + u_2 + \ldots + u_k. \end{align*}

We prove this by contradiction: Suppose there are two distinct such representations, i.e. that \begin{align*} v = u_1' + u_2' + \ldots + u_k' \end{align*} also holds, with ##u_i' \in U_i##. Then we have \begin{align*} \sum_{i=1}^k u_i = \sum_{i=1}^k u_i', \end{align*} or \begin{align*} (u_1 - u_1') + (u_2 - u_2') + \ldots + (u_k - u_k') = 0. \end{align*} The latter equation requires that ##u_i = u_i'## for every ##i##, a contradiction.

Do you agree? Or is there a neater way to show it? :)

Thanks!
 
fresh_42
No, not really. We have to show that ##U_1 + \ldots + U_k \supseteq V##, which is immediately clear by writing
JD_PM said:
##v = u_1 + u_2 + \ldots + u_k.##
You should have been more detailed here. Set ##\beta_{i}:=\{u_{i1},\ldots,u_{in_i}\}.## Then we have ##v=\sum_{i=1}^k\sum_{j=1}^{n_i}c_{ij}u_{ij}##. Now define ##u_i:=\sum_{j=1}^{n_i}c_{ij}u_{ij}## for all ## i ##. Then
JD_PM said:
##v = u_1 + u_2 + \ldots + u_k.##
##\in U_1+\ldots+U_k## and thus ##V\subseteq U_1+\ldots+U_k.##
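For instance, with ##\beta = \{e_1, e_2, e_3\}## in ##\mathbb{R}^3## and the partition ##\beta_1 = \{e_1\}##, ##\beta_2 = \{e_2, e_3\}##, the vector ##v = 3e_1 + e_2 + 2e_3## has coefficients ##c_{11} = 3##, ##c_{21} = 1##, ##c_{22} = 2##, hence ##u_1 = 3e_1 \in U_1## and ##u_2 = e_2 + 2e_3 \in U_2##.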

A derivation like that would have been better than merely writing
JD_PM said:
##v = u_1 + u_2 + \ldots + u_k.##
which should have been the conclusion, not the first line.

Finally, you now have to show that ##U_i \cap U_j = \{0\}## for all ##1 \leq i \neq j \leq k.##
 
JD_PM
I appreciate the detailed explanation!

Given that we want to prove an equality, shouldn't we also discuss why the inclusion ##U_1 +\ldots+ U_k \subseteq V## holds?

My reasoning: the sum of subspaces of ##V## yields a subspace of ##V## (proof).
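Spelling that out: if ##u = u_1 + \ldots + u_k## and ##u' = u_1' + \ldots + u_k'## lie in ##U_1 + \ldots + U_k##, then $$u + u' = (u_1 + u_1') + \ldots + (u_k + u_k') \quad \text{and} \quad cu = cu_1 + \ldots + cu_k$$ lie there as well, since each ##U_i## is closed under addition and scalar multiplication; and every such sum lies in ##V## because each ##u_i \in U_i \subseteq V##.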

fresh_42 said:
Finally, you now have to show that ##U_i \cap U_j = \{0\}## for all ##1 \leq i \neq j \leq k.##

Here's my attempt:

Let ##x \in U_i\cap U_j##. Then ##x = \underbrace{x}_{\in U_i} + \underbrace{0}_{\in U_j}## and ##x = \underbrace{0}_{\in U_i} + \underbrace{x}_{\in U_j}##.

Given that the subsets ##\beta_{i} = \{u_{i1}, \ldots, u_{in_i}\}## are pairwise disjoint, the only element the subspaces ##U_i## and ##U_j## have in common is the zero element. Hence it follows that ##x = 0## and ##U_i \cap U_j = \{0\}##.
 
PeroK
JD_PM said:
Here's my attempt:

Let ##x \in U_i\cap U_j##. Then ##x = \underbrace{x}_{\in U_i} + \underbrace{0}_{\in U_j}## and ##x = \underbrace{0}_{\in U_i} + \underbrace{x}_{\in U_j}##.

Given that the subsets ##\beta_{i} = \{u_{i1}, \ldots, u_{in_i}\}## are pairwise disjoint, the only element the subspaces ##U_i## and ##U_j## have in common is the zero element. Hence it follows that ##x = 0## and ##U_i \cap U_j = \{0\}##.
This is not right. You need to use the linear independence of the basis vectors.
 
PeroK
One of your problems is poor technique. For example, whenever you have ##x \in A \cap B##, the next line should automatically be ##x \in A## and ##x \in B##.
 
JD_PM
PeroK said:
This is not right. You need to use the linear independence of the basis vectors.

OK. Could you please provide more details?

PeroK said:
One of your problems is poor technique.

I take this comment as constructive criticism. I'll do my best to get better.

PeroK said:
For example, whenever you have ##x \in A \cap B##, the next line should automatically be ##x \in A## and ##x \in B##.

Alright.
 
PeroK
JD_PM said:
Here's my attempt:

Let ##x \in U_i\cap U_j##. Then ##x = \underbrace{x}_{\in U_i} + \underbrace{0}_{\in U_j}## and ##x = \underbrace{0}_{\in U_i} + \underbrace{x}_{\in U_j}##.
It's difficult to comment on that because it doesn't look like mathematics. It's like you are trying to formulate your ideas without the proper technique.

To follow on from the above, consider first the simplest case, where each block of the partition is a single basis vector, say ##\beta_i = \{b_i\}## and ##\beta_j = \{b_j\}##. Then
$$x \in U_i \ \Rightarrow \ x = a b_i$$ for some scalar ##a##. The same argument for ##j## gives ##a b_i = b b_j##, which contradicts the linear independence of the basis vectors unless ##a = b = 0##, i.e. ##x = 0##.
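The same computation works when the blocks contain several vectors. In the notation ##\beta_i = \{u_{i1}, \ldots, u_{in_i}\}## from above, ##x \in U_i \cap U_j## means
$$\sum_{l=1}^{n_i} a_l u_{il} = x = \sum_{m=1}^{n_j} b_m u_{jm}, \quad \text{so} \quad \sum_{l=1}^{n_i} a_l u_{il} - \sum_{m=1}^{n_j} b_m u_{jm} = 0.$$
Since ##\beta_i## and ##\beta_j## are disjoint, the ##u_{il}## and ##u_{jm}## are distinct vectors of the basis ##\beta##, hence linearly independent, so all the coefficients vanish and ##x = 0##. Running the same argument with ##\beta_i## against ##\bigcup_{j \neq i} \beta_j## gives the stronger statement ##U_i \cap \sum_{j \neq i} U_j = \{0\}##, which is what the direct sum requires when ##k > 2##.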
 
