# [Linear Algebra] Show that H ∩ K is a subspace of V

## Homework Statement

From Linear Algebra and Its Applications, 5th Edition, David Lay
Chapter 4, Section 1, Question 32
Let H and K be subspaces of a vector space V. The intersection of H and K is the set of v in V that belong to both H and K. Show that H ∩ K is a subspace of V. (See figure.) Give an example in ℝ2 to show that the union of two subspaces is not, in general, a subspace.

## Homework Equations

Theorem 1: If v1, ..., vp are in a vector space V, then Span{v1, ..., vp} is a subspace of V.

## The Attempt at a Solution

This is what I started off with:
Let u,v ∈ H; s, t ∈ K
0v ∈ H, K
u + v ∈ H
s + t ∈ K

In the middle of writing that down, I was thinking that v is a set of vectors in V, and that H ∩ K = Span{v}, therefore, per Theorem 1 in the book, H ∩ K is a subspace of V.

Written as:
v ∈ V
H ∩ K = Span{v}
∴ H ∩ K is a subspace of V per Theorem 1.

For the second part I used the following:
m, n ⊆ ℝ2
m = {[x, y]: x ≥ 0, y ≥ 0}
n = {[x, y]: x ≤ 0, y ≥ 0}
m ∪ n = {[x, y]: x ∈ ℝ, y ≥ 0}
let c = -1, u = [-1, 3]
cu = [1, -3]
cu ∉ m ∪ n
∴ m ∪ n is not a subspace

Is my train of thinking correct concerning this problem?

tnich
Homework Helper

bornofflame said:
Is my train of thinking correct concerning this problem?
For the first part, you would need to show that ##H \cap K## is the span of v. That is easy to do using the definition of subspace. Since I don't have your textbook at hand, I don't know what Theorem 1 says. Perhaps it covers the missing pieces.
For the second part, your example does not work because m and n are not subspaces of ℝ2 as you have defined them. (Check the definition of subspace again.)

Mark44
Mentor

bornofflame said:
Theorem 1: If v1,...vp are in vector space V, then Span{v1,...vp} is a subspace of V.
I don't see that this theorem is relevant.
bornofflame said:

## The Attempt at a Solution

This is what I started off with:
Let u,v ∈ H; s, t ∈ K
0v ∈ H, K
u + v ∈ H
s + t ∈ K
This isn't a good start, since you are already given that H and K are subspaces of V. So we know that both are closed under vector addition and scalar multiplication.
A better start would be to assume that u and v are arbitrary vectors in H ∩ K. Show that their sum is also in H ∩ K, and show that any scalar multiple of either one of them (say u) is also in H ∩ K.
bornofflame said:
In the middle of writing that down, I was thinking that v is a set of vectors in V, and that H ∩ K = Span{v}, therefore, per Theorem 1 in the book, H ∩ K is a subspace of V.
But some of these vectors v don't belong to H ∩ K, so I don't think this gets you anywhere.

tnich said:
For the second part, your example does not work because m and n are not subspaces of ℝ2 as you have defined them. (Check the definition of subspace again.)
Oops. I either misread that or wasn't thinking. Either way, ugh. I'll go check that now and fix my mistake.

Mark44 said:
I don't see that this theorem is relevant.
This isn't a good start, since you are already given that H and K are subspaces of V. So we know that both are closed under vector addition and scalar multiplication.
I see what you mean. That didn't occur to me.

A better start would be to assume that u and v are arbitrary vectors in H ∩ K. Show that their sum is also in H ∩ K, and show that any scalar multiple of either one of them (say u) is also in H ∩ K.
But some of these vectors v don't belong to H ∩ K, so I don't think this gets you anywhere.
Second stab:
H, K are subspaces of V
H ∩ K = {v ∈ V : v ∈ H and v ∈ K}
To be a subspace of V, H ∩ K must satisfy the following:
1. Contains the zero vector:
0 ∈ H ∩ K b/c H and K are subspaces and therefore each contains 0

2. Closed under vector addition:
Let u, v ∈ H ∩ K. Then u, v ∈ H and u, v ∈ K, so u + v ∈ H and u + v ∈ K, which means that u + v ∈ H ∩ K

3. Closed under scalar multiplication:
Let c ∈ ℝ and u ∈ H ∩ K. Then u ∈ H and u ∈ K, so cu ∈ H and cu ∈ K, which means that cu ∈ H ∩ K
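For reference, the three checks above can be typeset as one compact proof (same argument, standard notation):

```latex
Let $H, K$ be subspaces of a vector space $V$, and let $W = H \cap K$.
\begin{enumerate}
  \item $\mathbf{0} \in H$ and $\mathbf{0} \in K$ (both are subspaces), so $\mathbf{0} \in W$.
  \item If $u, v \in W$, then $u, v \in H$ and $u, v \in K$; closure of $H$ and $K$
        under addition gives $u + v \in H$ and $u + v \in K$, so $u + v \in W$.
  \item If $c$ is a scalar and $u \in W$, then $cu \in H$ and $cu \in K$, so $cu \in W$.
\end{enumerate}
Hence $W = H \cap K$ satisfies all three subspace conditions and is a subspace of $V$.
```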

It took me a second b/c I don't think I really understood what was being asked. Once that clicked, the work became much easier.

I'm going to work on part 2 now.

For part deux:
m, n are subspaces of ℝ2
To show that m ∪ n is NOT a subspace of ℝ2, it must fail one of the following:
1. Contains the 0v: yes, move on
2. Closed under vector addition:
Let m = {[x, y]: x = 0, y = ℝ}, n = {[x, y]: x = ℝ, y = 0}
Let u = [0, 1], v = [1, 0]
u + v = [1, 1]
u + v ∉ m
u + v ∉ n
∴ u + v ∉ m ∪ n
∴ m ∪ n is not a subspace

Mark44
Mentor
bornofflame said:
Let m = {[x, y]: x = 0, y = ℝ}, n = {[x, y]: x = ℝ, y = 0}
Let u = [0, 1], v = [1, 0]
u + v ∉ m ∪ n
This is a good counterexample, but your notation could use a couple of tweaks.
Let M = {(x, y) ∈ ℝ2 : x = 0, y ∈ ℝ}, and let N = {(x, y) ∈ ℝ2 : x ∈ ℝ, y = 0}
Let u = <0, 1>, v = <1, 0>
u + v = <1, 1> ∉ M ∪ N
So M ∪ N is not closed under vector addition, and isn't a subspace of ℝ2.
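As a sanity check, the counterexample can be verified mechanically; here is a small Python sketch (the `in_M`/`in_N` helper names are mine, purely illustrative):

```python
# M = y-axis, N = x-axis: each is a subspace of R^2,
# but their union is not closed under addition.

def in_M(v):
    # M = {(x, y) : x = 0}
    return v[0] == 0

def in_N(v):
    # N = {(x, y) : y = 0}
    return v[1] == 0

def in_union(v):
    # membership in M ∪ N
    return in_M(v) or in_N(v)

u = (0, 1)                        # u is in M
v = (1, 0)                        # v is in N
s = (u[0] + v[0], u[1] + v[1])    # u + v = (1, 1)

print(in_union(u), in_union(v), in_union(s))  # True True False
```

The last line shows u and v each lie in the union, but their sum does not, which is exactly the failed closure condition.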
