Evalesco
Hi, I've been going through some course material in preparation for my exams, and I've come across an exercise I can't seem to work out.
The exercise:
Let V be a vector space over a field F and let U and W be two subspaces of V.
Suppose V = U + W. Prove that V = U ⊕ W iff {(u,w)∈U×W : u + w = 0} = {(0,0)}
What I've done so far:
At first I was unsure of the operator ⊕ but from Wikipedia I discovered the following:
V = U ⊕ W ⇔ (V = U + W) ∧ (U ∩ W = ∅)
Which I guess is a consequence of the fact that (at least in the finite-dimensional case) the dimension of U ⊕ W is equal to the sum of the dimensions of U and W.
So I thought that identity would make the exercise quite easy to solve: I just need to show that U and W share no common elements.
We are given {(u,w)∈U×W : u + w = 0} = {(0,0)}
OK, I thought: that condition uses the fact that U and W are subspaces (closed under addition and negation) to ensure the two subspaces have no non-zero elements in common.
But this is where I'm stuck: I thought that in order to solve it I would need U and W to share no common elements at all, yet 0 is always an element of both U and W, so their intersection can never be empty.
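Trying to write out the one step of the idea I do have (I'm not certain this is the right direction): if some x lies in both U and W, then the pair (x, −x) lands in the given set, which would force x = 0. In LaTeX:

```latex
% Suppose x \in U \cap W. Since W is a subspace, -x \in W as well,
% so (x, -x) lies in U \times W and satisfies x + (-x) = 0.
% The hypothesis then gives (x, -x) = (0, 0), i.e. x = 0.
\[
  x \in U \cap W
  \;\Longrightarrow\;
  (x, -x) \in \{(u, w) \in U \times W : u + w = 0\} = \{(0, 0)\}
  \;\Longrightarrow\;
  x = 0.
\]
```

If that's right, the given condition would pin down U ∩ W = {0} rather than U ∩ W = ∅, but I'm not sure how to reconcile that with the statement I found.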
I know this isn't a proof; I was just trying to formulate an idea of how I would go about solving it. Any insight would be welcome. Thanks.