# Decomposition of a complex vector space into 2 T-invariant subspaces

1. Nov 22, 2008

### winter85

1. The problem statement, all variables and given/known data
Suppose V is a complex vector space and $$T \in L(V)$$. Prove that
there does not exist a direct sum decomposition of V into two
proper subspaces invariant under T if and only if the minimal
polynomial of T is of the form $$(z - \lambda)^{dim V}$$ for some $$\lambda \in C$$.

2. Relevant equations

3. The attempt at a solution

First suppose that the minimal polynomial of T is $$p(z) = (z - \lambda)^{n}$$ where n = dim V. Suppose further that $$V = U \oplus W$$ where U and W are T-invariant proper subspaces of V. So $$dim U \geq 1$$ and $$dim W \geq 1$$. Since n = dim U + dim W, we have that $$dim U \leq n-1$$ and $$dim W \leq n-1$$.
Now the minimal polynomial of T is $$p(z) = (z - \lambda)^{n}$$, therefore $$(T - \lambda I)^{n}v = 0$$ for all v in V but there is at least one v in V such that $$(T - \lambda I)^{n-1}v \neq 0$$. Because of the decomposition of V, we can write: v = u + w for some u in U and some w in W. Applying $$(T - \lambda I)^{n-1}$$ to both sides we get:

$$(T - \lambda I)^{n-1}v = (T - \lambda I)^{n-1}u + (T - \lambda I)^{n-1}w \neq 0$$

Since both U and W are invariant under T, we have $$Tu \in U$$ and $$-\lambda u \in U$$, so $$(T - \lambda I)u \in U$$. Therefore U (and similarly W) is invariant under $$(T-\lambda I)$$, hence under $$(T-\lambda I)^{n-1}$$. Since $$(T - \lambda I)^{n-1}v \neq 0$$, at least one of $$(T - \lambda I)^{n-1}u$$ or $$(T - \lambda I)^{n-1}w$$ is nonzero. Assume without loss of generality that it is the first one. So there exists $$u \in U$$ such that

$$(T - \lambda I)^{n-1}u \neq 0$$, while $$(T - \lambda I)^{n}u = 0$$ (indeed for every u in U). Restricting T to U, the minimal polynomial of $$T|_U$$ divides $$(z - \lambda)^{n}$$ but does not divide $$(z - \lambda)^{n-1}$$, so it must equal $$(z - \lambda)^{n}$$. But the degree of the minimal polynomial of $$T|_U$$ is at most $$dim U \leq n-1 < n$$, a contradiction.

I hope the reasoning above is correct; if it is, it proves that if the minimal polynomial of T is $$(z-\lambda)^{n}$$, then V cannot be decomposed into the direct sum of two proper T-invariant subspaces.
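As a quick numerical sanity check (my own, not part of the proof), a single Jordan block is a hypothetical test case for this direction: for it, $$(T - \lambda I)^{n-1} \neq 0$$ while $$(T - \lambda I)^{n} = 0$$, so the minimal polynomial really is $$(z - \lambda)^{n}$$ with n = dim V.

```python
import numpy as np

# Hypothetical example: a single 3x3 Jordan block with eigenvalue lam.
# Its minimal polynomial should be (z - lam)^3, where 3 = dim V.
lam = 2.0
n = 3
T = lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)  # Jordan block J_3(lam)

N = T - lam * np.eye(n)  # the nilpotent part T - lam*I

# (T - lam*I)^(n-1) is nonzero, so no smaller power of (z - lam) annihilates T...
assert np.any(np.linalg.matrix_power(N, n - 1) != 0)
# ...but (T - lam*I)^n = 0, consistent with minimal polynomial (z - lam)^n.
assert np.all(np.linalg.matrix_power(N, n) == 0)
```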

However I'm stumped about how to prove the other direction.
Also, I would like to know what's the mistake in the following reasoning:

Since V is a complex vector space, $$T \in L(V)$$ has an eigenvalue $$\lambda$$, so V has a T-invariant subspace U of dimension 1. Then there exists a subspace W of V such that $$V = U \oplus W$$. It follows that W is T-invariant and of dimension $$dim V - 1$$. So it's always possible to decompose V into two proper T-invariant subspaces :S
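To probe this reasoning numerically, here is a small sketch (my own, using a hypothetical 2×2 Jordan block as a test case): the eigenline spanned by $$e_1$$ is T-invariant, but the particular complement spanned by $$e_2$$ fails to be T-invariant, since $$Te_2$$ has a component along $$e_1$$. So at least for this one choice of complement, the invariance step does not go through.

```python
import numpy as np

# Hypothetical test case: 2x2 Jordan block with eigenvalue 0.
T = np.array([[0.0, 1.0],
              [0.0, 0.0]])

e1 = np.array([1.0, 0.0])  # eigenvector: U = span(e1) is T-invariant
e2 = np.array([0.0, 1.0])  # one candidate complement W = span(e2)

# U is T-invariant: T e1 = 0, which lies in span(e1).
assert np.allclose(T @ e1, 0.0)

# But T e2 = e1, which is NOT in span(e2): this complement is not T-invariant.
assert np.allclose(T @ e2, e1)
```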

Thank you :)

Last edited: Nov 22, 2008