Linear Algebra - adding subspaces

moonbounce7
1. Suppose that U is a subspace of V. What is U+U?


2. Homework Equations:
There's a theorem that states: Suppose that A and B are subspaces of V. Then V is the direct sum of A and B (written V = A ⊕ B) if and only if: 1) V = A + B (every vector of V is the sum of a vector in A and a vector in B), and 2) A ∩ B = {0}.
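A concrete illustration of the theorem (my own example, not part of the problem): take V = ℝ³, A = the xy-plane, and B = the z-axis. Then

$$A + B = \mathbb{R}^3 \quad \text{and} \quad A \cap B = \{0\},$$

so ℝ³ = A ⊕ B. If B were instead the xz-plane, A + B would still be all of ℝ³, but A ∩ B would be the x-axis, so the sum would not be direct.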



3. I don't understand what the addition symbol really means in this case. I know that addition of two subspaces is NOT the same as their union: the union of two subspaces of V is a subspace of V if and only if one of them is contained in the other. Help!
 
The sum of two subspaces A and B is the set of all sums a + b such that a is in A and b is in B. V is the direct sum of A and B if every vector in V can be written uniquely as a + b with a in A and b in B; there is no need for scalars k, j in front, since subspaces are already closed under scalar multiplication. The direct sum is the one that is usually denoted with a circle around the plus sign (⊕).
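To see why uniqueness is the decisive condition, here is a small check (again a made-up example): with A = the xy-plane and B = the xz-plane in ℝ³, the vector (1, 1, 1) decomposes in more than one way,

$$(1,1,1) = (1,1,0) + (0,0,1) = (0,1,0) + (1,0,1),$$

with the first summand in A and the second in B each time. Since A ∩ B is the x-axis rather than {0}, decompositions are not unique, so A + B = ℝ³ is not a direct sum.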
 
Thanks for trying to help; I think I've figured it out, though:

Addition of subspaces follows this definition:

U + U = {a + b : a ∈ U and b ∈ U}.

Since U is a subspace, it is closed under addition, so the sum of two elements of U is again an element of U.

Therefore, U+U = U.

To justify this formally, one proves both inclusions: U ⊆ U + U and U + U ⊆ U (sketched below).
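Spelled out, that two-inclusion argument is short; a sketch using only the subspace axioms:

$$U \subseteq U + U: \text{ for any } u \in U,\ u = u + 0 \text{ with } u \in U \text{ and } 0 \in U.$$

$$U + U \subseteq U: \text{ for any } a, b \in U,\ a + b \in U \text{ by closure under addition.}$$

Together these give U + U = U.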
 