Proving U_1 + U_2 + ... + U_k = span(U_1 ∪ U_2 ∪ ... ∪ U_k)

  • #1
JD_PM
TL;DR Summary
I want to prove that ##U_1 + U_2 + ... + U_k = span (U_1 \cup U_2 \cup ... \cup U_k)## by induction
1) Base case ##k=2##

##U_1 + U_2 = span(U_1 \cup U_2)##, which I already know how to prove, so this part is fine.
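
(For reference, a minimal sketch of the base case, using nothing beyond the standard definitions of subspace sum and span: any ##x = u_1 + u_2## with ##u_i \in U_i## is itself a linear combination of elements of ##U_1 \cup U_2##, so ##U_1 + U_2 \subseteq span(U_1 \cup U_2)##; conversely, any ##x = \sum_{j=1}^n \beta_j v_j## with each ##v_j \in U_1 \cup U_2## splits as
$$x = \underbrace{\sum_{j\,:\, v_j \in U_1} \beta_j v_j}_{\in U_1} + \underbrace{\sum_{j\,:\, v_j \notin U_1} \beta_j v_j}_{\in U_2} \ \in \ U_1 + U_2,$$
since each ##U_i## is a subspace and hence closed under linear combinations.)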

2) Induction hypothesis

We assume that the following statement holds

$$U_1 + U_2 + \dots + U_{k-1} = span(U_1 \cup U_2 \cup \dots \cup U_{k-1})$$

3) Induction step

$$U_1 + U_2 + \dots + U_k = \underbrace{U_1 + U_2 + \dots + U_{k-1}}_{=span(U_1 \cup U_2 \cup \dots \cup U_{k-1})} + U_k = \ ? $$

But my issue is that I do not see how to get from ##span(U_1 \cup U_2 \cup \dots \cup U_{k-1}) + U_k## to ##span(U_1 \cup U_2 \cup \dots \cup U_k)##.

Might you guide me on how to proceed further?

Thanks! :biggrin:
 
  • #2
To prove two sets, ##A## and ##B##, are equal you often show that ##x \in A## implies ##x \in B## and vice versa.
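
(Spelled out for the sets in this thread, that strategy amounts to establishing the two implications
$$x \in U_1 + U_2 + \dots + U_k \ \Rightarrow \ x \in span(U_1 \cup U_2 \cup \dots \cup U_k)$$
and
$$x \in span(U_1 \cup U_2 \cup \dots \cup U_k) \ \Rightarrow \ x \in U_1 + U_2 + \dots + U_k,$$
either directly or inside an inductive step.)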
 
  • #3
Hi @PeroK !

PeroK said:
To prove two sets, ##A## and ##B##, are equal you often show that ##x \in A## implies ##x \in B## and vice versa.

Oh, I see! So let's forget about induction.

We take ##x=u_1 + u_2 + \dots + u_k \in U_1 + U_2 + \dots + U_k## where ##u_i \in U_i##.

Given that each element ##u_i## is found exclusively in ##U_i##, there is no difference between ##U_1 + U_2 + \dots + U_k## and ##U_1 \cup U_2 \cup \dots \cup U_k##. (let me know if you disagree with this statement).

Taking a linear combination of the elements ##u_1, u_2, \dots, u_k## leads to ##x = \sum\limits_{j = 1}^k \beta_j u_j \in span(U_1 \cup U_2 \cup \dots \cup U_k)##

Let's now go the other way and take ## x = \sum\limits_{j = 1}^r \beta_j u_j \in span (U_1 \cup U_2 \cup \dots \cup U_k)##. Next we construct sets: let ##K_1## be the set of indices ##j## such that ##u_j \in U_1##, ##K_2## be the set of indices ##j## not in ##K_1## such that ##u_j \in U_2## and so on up to ##u_j \in U_k##

Given that the sets ##K_i## are disjoint, ##x = \sum\limits_{i = 1}^k \sum\limits_{j \in K_i} \beta_j u_j## and ##\sum\limits_{j \in K_i} \beta_j u_j \in U_i##, we get ##x \in U_1 + U_2 + \dots + U_k##

How does that look to you? :)

PS: For what kind of proofs is it best to use induction, though?
 
  • #4
JD_PM said:
Hi @PeroK !
Oh, I see! So let's forget about induction.
That approach could apply to the inductive step!
 
  • #5
OK, I misunderstood you then. Do you have any comments on #3, though? Or do you want me to focus only on induction?
 
  • #6
JD_PM said:
OK, I misunderstood you then. Do you have any comments on #3, though? Or do you want me to focus only on induction?
JD_PM said:
Hi @PeroK !
Oh, I see! So let's forget about induction.

We take ##x=u_1 + u_2 + \dots + u_k \in U_1 + U_2 + \dots + U_k## where ##u_i \in U_i##.

Given that each element ##u_i## is found exclusively in ##U_i##, there is no difference between ##U_1 + U_2 + \dots + U_k## and ##U_1 \cup U_2 \cup \dots \cup U_k##. (let me know if you disagree with this statement).

Taking a linear combination of the elements ##u_1, u_2, \dots, u_k## leads to ##x = \sum\limits_{j = 1}^k \beta_j u_j \in span(U_1 \cup U_2 \cup \dots \cup U_k)##

Let's now go the other way and take ## x = \sum\limits_{j = 1}^r \beta_j u_j \in span (U_1 \cup U_2 \cup \dots \cup U_k)##. Next we construct sets: let ##K_1## be the set of indices ##j## such that ##u_j \in U_1##, ##K_2## be the set of indices ##j## not in ##K_1## such that ##u_j \in U_2## and so on up to ##u_j \in U_k##

Given that the sets ##K_i## are disjoint, ##x = \sum\limits_{i = 1}^k \sum\limits_{j \in K_i} \beta_j u_j## and ##\sum\limits_{j \in K_i} \beta_j u_j \in U_i##, we get ##x \in U_1 + U_2 + \dots + U_k##

How does that look to you? :)

PS: For what kind of proofs is it best to use induction, though?
Are you sure that the ##U_i## are disjoint? Are they not just any subspaces?

What I might have tried to do is to take ##x \in span(U_1 \cup \dots \cup U_{k+1})## and show that ##x = u + u_{k+1}##, where ##u \in span(U_1 \cup \dots \cup U_{k})## and ##u_{k+1} \in U_{k+1}##.

Then the inductive step should follow. Induction should simplify things, although it's the same basic idea.
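
(To spell out how the inductive step then closes, assuming the induction hypothesis ##U_1 + \dots + U_k = span(U_1 \cup \dots \cup U_k)##: once every ##x \in span(U_1 \cup \dots \cup U_{k+1})## is written as ##u + u_{k+1}## as above, and the easy reverse inclusion is noted, one gets
$$span(U_1 \cup \dots \cup U_{k+1}) = span(U_1 \cup \dots \cup U_k) + U_{k+1} = (U_1 + \dots + U_k) + U_{k+1} = U_1 + U_2 + \dots + U_{k+1}.$$)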
 
  • #7
PeroK said:
Are you sure that the ##U_i## are disjoint? Are they not just any subspaces?

A more detailed proof without using induction can be found here.
 
  • #8
JD_PM said:
A more detailed proof without using induction can be found here.
I still think that using induction simplifies things. In particular, all we really need to show is that:
$$span(U_1 \cup \dots \cup U_{k+1}) = span(U_1 \cup \dots \cup U_k) + U_{k+1} $$
The RHS is clearly a subset of the LHS, so we need to show that:
$$v \in span(U_1 \cup \dots \cup U_{k+1}) \ \Rightarrow \ v \in span(U_1 \cup \dots \cup U_k) + U_{k+1}$$
An outline of the proof is that for such a vector ##v##, we take all the terms that involve vectors in ##U_{k+1}## and (using the fact that ##U_{k+1}## is a subspace) recognise that this gives a single element of ##U_{k+1}##. We also recognise that what is left is an element of ##span(U_1 \cup \dots \cup U_k)##.

You can formalise this easily. And this should be simpler than having all those different sets of indices.
 
  • #9
edit: better to just echo what PeroK said.
 
  • #10
Thanks for the clarification @PeroK !

PeroK said:
The RHS is clearly a subset of the LHS, so we need to show that:
$$v \in span(U_1 \cup \dots \cup U_{k+1}) \ \Rightarrow \ v \in span(U_1 \cup \dots \cup U_k) + U_{k+1}$$
An outline of the proof is that for such a vector ##v##, we take all the terms that involve vectors in ##U_{k+1}## and (using the fact that ##U_{k+1}## is a subspace) recognise that this gives a single element of ##U_{k+1}##. We also recognise that what is left is an element of ##span(U_1 \cup \dots \cup U_k)##.

Let me check my understanding :)

OK so let us start by taking ##v \in span(U_1 \cup \dots \cup U_{k+1})##. Based on the definition of span, ##v = \sum\limits_{j = 1}^{k+1} \beta_j u_j = \sum\limits_{j = 1}^{k} \beta_j u_j + \beta_{k+1} u_{k+1}##. The first and second terms belong (by definition) to ##span(U_1 \cup \dots \cup U_k)## and ##span(U_{k+1})## respectively. Since ##U_{k+1}## is a subspace, it is closed under linear combinations, so ##span(U_{k+1}) \subseteq U_{k+1}##. But by the definition of span, we also have ##U_{k+1} \subseteq span(U_{k+1})##. Hence ##U_{k+1} = span(U_{k+1})##, and indeed we see that ##v \in span(U_1 \cup \dots \cup U_k) + U_{k+1}##
 
  • #11
JD_PM said:
Thanks for the clarification @PeroK !
Let me check my understanding :)

OK so let us start by taking ##v \in span(U_1 \cup \dots \cup U_{k+1})##. Based on the definition of span, ##v = \sum\limits_{j = 1}^{k+1} \beta_j u_j = \sum\limits_{j = 1}^{k} \beta_j u_j + \beta_{k+1} u_{k+1}##.
It's more general than that: $$v = \sum_{j = 1}^n \beta_j v_j$$ where ##v_j \in U_1 \cup \dots \cup U_{k+1}##. Now, we let $$u_{k+1} = \sum_j \beta_j v_j$$ where we sum over all ##j## such that ##v_j \in U_{k+1}##, and let $$u = \sum_j \beta_j v_j$$ where we sum over all ##j## such that ##v_j \notin U_{k+1}##. This gives us $$v = u + u_{k+1},$$ where ##u \in span(U_1 \cup \dots \cup U_k)## and ##u_{k+1} \in U_{k+1}##, hence:
$$v \in span(U_1 \cup \dots \cup U_k) + U_{k+1}$$
That can be used in your inductive step.

Note also that the base case for ##k = 1## is that we have ##span(U_1) = U_1##.
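
(Putting the pieces of this thread together, a compact write-up of the whole induction, under the same definitions: the base case is ##span(U_1) = U_1##, which holds because ##U_1## is a subspace; assuming ##U_1 + \dots + U_k = span(U_1 \cup \dots \cup U_k)##, the decomposition ##v = u + u_{k+1}## above gives ##span(U_1 \cup \dots \cup U_{k+1}) \subseteq span(U_1 \cup \dots \cup U_k) + U_{k+1}##, while the reverse inclusion holds because both ##span(U_1 \cup \dots \cup U_k)## and ##U_{k+1}## are contained in ##span(U_1 \cup \dots \cup U_{k+1})##, which is closed under addition. Hence
$$span(U_1 \cup \dots \cup U_{k+1}) = span(U_1 \cup \dots \cup U_k) + U_{k+1} = (U_1 + \dots + U_k) + U_{k+1} = U_1 + \dots + U_{k+1}.$$)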
 

FAQ: Proving U_1 + U_2 + ... + U_k = span(U_1 ∪ U_2 ∪ ... ∪ U_k)

1. What does it mean to prove that U_1 + U_2 + ... + U_k = span(U_1 ∪ U_2 ∪ ... ∪ U_k)?

Proving that U_1 + U_2 + ... + U_k = span(U_1 ∪ U_2 ∪ ... ∪ U_k) means showing that every vector in the sum of the subspaces U_1, U_2, ..., U_k can be written as a linear combination of vectors from the union of those subspaces, and conversely that every such linear combination lies in the sum. In other words, the sum of the subspaces is equal to the span of their union.

2. Why is it important to prove that U_1 + U_2 + ... + U_k = span(U_1 ∪ U_2 ∪ ... ∪ U_k)?

Proving this equality is important because it allows us to understand the relationship between the sum of subspaces and the span of their union. This can be useful in solving problems related to linear algebra, such as finding bases for subspaces or determining if a set of vectors spans a given subspace.

3. What are the steps involved in proving U_1 + U_2 + ... + U_k = span(U_1 ∪ U_2 ∪ ... ∪ U_k)?

The steps involved may vary depending on how the argument is organised (for example, directly or by induction on the number of subspaces), but they generally amount to showing that any vector in the sum of the subspaces can be written as a linear combination of vectors in the union of those subspaces, and vice versa. This can be done by using properties of vector spaces, such as closure under addition and scalar multiplication, to show that each side is a subset of the other; one direction is sketched below.
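
(As a worked instance of one of the two inclusions, a sketch assuming each U_i is a subspace of the same vector space: given ##x = \sum_{j=1}^n \beta_j v_j## with every ##v_j## in the union, group the terms according to the first subspace that contains each ##v_j##,
$$x = \sum_{i=1}^{k} \underbrace{\Bigg(\sum_{\substack{j\,:\, v_j \in U_i \\ v_j \notin U_1 \cup \dots \cup U_{i-1}}} \beta_j v_j\Bigg)}_{\in U_i} \ \in \ U_1 + U_2 + \dots + U_k,$$
where each inner sum lies in ##U_i## because ##U_i## is closed under linear combinations, and an empty sum contributes the zero vector.)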

4. Can this equality be proven for any number of subspaces?

Yes, this equality can be proven for any number of subspaces. The proof may become more complex as the number of subspaces increases, but the basic concept remains the same.

5. How is this equality related to the concept of linear independence?

This equality is related to the concept of linear independence because both revolve around spans: the equality says that every vector in the sum of the subspaces can be written as a linear combination of vectors from the union of those subspaces, while a set of vectors is linearly independent if no vector in the set can be written as a linear combination of the others. In both cases, we are looking at the relationship between a set of vectors and their span.
