MHB Direct Sum Property: Proving Uniqueness

Summary:
The discussion centers on proving a characterization of direct sums in vector spaces: the sum of subspaces \(V_1, \ldots, V_k\) of \(V\) is direct if and only if there is at least one vector \(v \in V\) that can be expressed uniquely as \(v = v_1 + \cdots + v_k\) with \(v_i \in V_i\). The author proves the forward implication directly from the definition of a direct sum, and the reverse implication by contradiction, using the vector with a unique representation to rule out a vector with two different representations. A minor typo in the proof is pointed out and corrected, and the solution is confirmed to be correct. The author seeks validation of their understanding in preparation for an upcoming exam.
Sudharaka
Hi everyone, :)

I encountered this question and thought about it for several hours. I am writing down my answer below. I would greatly appreciate it if somebody could either find a fault in my answer or confirm that it is correct. :)

Problem:

Let \(V_1,\ldots,V_k\) be subspaces of a vector space \(V\) with \(V=V_1+\cdots+V_k\). Show that the sum is direct iff there is at least one \(v\in V\) that can be written as \(v=v_1+\cdots+v_k\), where \(v_i\in V_i\), in a unique way.
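Just to fix ideas, here is a small example of my own (not part of the problem): in \(V=\mathbb{R}^2\), the sum of \(V_1=\{(a,0):a\in\mathbb{R}\}\) and \(V_2=\{(0,b):b\in\mathbb{R}\}\) is direct, since \((a,b)=(a,0)+(0,b)\) is the only possible decomposition. On the other hand, with \(V_1=\mathbb{R}^2\) and \(V_2=\{(0,b):b\in\mathbb{R}\}\) the sum is not direct, because for instance \((1,1)=(1,1)+(0,0)=(1,0)+(0,1)\).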

My Solution:

If \(V=V_1\oplus \cdots \oplus V_k\), then by the definition of the direct sum every \(v\in V\) can be written uniquely as \(v=v_1+\cdots+v_k\) where \(v_i\in V_i\). So the forward implication is clearly true.

Now let us prove the reverse direction. Suppose there exists an element \(v\in V\) that can be written as \(v=v_1+\cdots+v_k\), where \(v_i\in V_i\), in a unique way. Suppose, for the sake of contradiction, that the sum is not direct. Then there exists at least one element \(x\in V\) that has two different representations,

\[x=x_1+\cdots+x_k\mbox{ and }x=x'_1+\cdots+x'_k\]

where \(x_i,\,x'_i\in V_i\). Now there exists some \(v_0\in V\) such that \(v=x+v_0\); simply take \(v_0=v-x\). Hence,

\[v=x_1+\cdots+x_k+v_0\mbox{ and }v=x'_1+\cdots+x'_k+v_0\]

Now, since \(V=V_1+\cdots+V_k\), we can write \(v_0=v_1^0+\cdots+v_k^0\) with \(v_i^0\in V_i\), so we get

\[v=(x_1+v^0_1)+\cdots+(x_k+v^0_k)\mbox{ and }v=(x'_1+v^0_1)+\cdots+(x'_k+v^0_k)\]

Note that \(x_i+v_i^0\in V_i\) and \(x'_i+v_i^0\in V_i\), since each \(V_i\) is a subspace.

But since \(v\) has the unique representation \(v=v_1+\cdots+v_k\), we have

\[v_1=x_1+v^0_1=x'_1+v^0_1\]

\[v_2=x_2+v^0_2=x'_2+v^0_2\]

and generally,

\[v_i=x_i+v^0_i=x'_i+v^0_i\]

Therefore,

\[x_i=x'_i\mbox{ for all }i\]

This contradicts the assumption that \(x\) has two different representations, so we arrive at a contradiction. The sum must be direct.
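As a quick numerical sanity check of the statement (a sketch of my own, not part of the proof, assuming each subspace is given by a matrix whose columns span it): the sum \(V_1+\cdots+V_k\) of finite-dimensional subspaces is direct exactly when \(\dim(V_1+\cdots+V_k)=\dim V_1+\cdots+\dim V_k\), which can be tested by comparing matrix ranks.

```python
# Numerical illustration only: each subspace V_i of R^n is represented by a
# matrix whose columns span it, and directness is checked via dimensions.
import numpy as np

def sum_is_direct(blocks):
    """blocks: list of (n x m_i) arrays whose columns span the subspaces V_i.

    The sum V_1 + ... + V_k is direct iff the dimension of the sum
    (rank of the stacked matrix) equals the sum of the individual dimensions.
    """
    stacked = np.hstack(blocks)
    return np.linalg.matrix_rank(stacked) == sum(
        np.linalg.matrix_rank(B) for B in blocks
    )

# Direct: the two coordinate axes of R^2.
print(sum_is_direct([np.array([[1.0], [0.0]]), np.array([[0.0], [1.0]])]))  # True

# Not direct: V_1 = R^2 already contains V_2, so decompositions are not unique.
print(sum_is_direct([np.eye(2), np.array([[0.0], [1.0]])]))  # False
```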
 
Sudharaka said:
But since \(v\) has a unique representation \(v=v_1+\cdots+v_k\) we have,

\[v_1=x_1+v^0_1=x'_1+v^0_1\]

\[v_2=x_2+v^0_1=x'_2+v^0_1\]

and generally,

\[v_i=x_i+v^0_1=x'_i+v^0_1\]
A typo here. It should be $v_i=x_i+v^0_i=x'_i+v^0_i$
The rest is correct.
 
caffeinemachine said:
A typo here. It should be $v_i=x_i+v^0_i=x'_i+v^0_i$
The rest is correct.

Yeah, I was typing it too fast and didn't check much for typos. I have edited it in the original post. :) Thank you very much for the confirmation and for pointing out the typo; I really appreciate it. I am confident that this and the several other questions I posted recently are correct, but I just want to get everybody's opinion. I have an exam coming up and these are from a sample test. :)
 