
Homework Help: Decomposition of a complex vector space into 2 T-invariant subspaces

  1. Nov 22, 2008 #1
    1. The problem statement, all variables and given/known data
    Suppose V is a complex vector space and [tex]T \in L(V)[/tex]. Prove that there does not exist a direct sum decomposition of V into two proper subspaces invariant under T if and only if the minimal polynomial of T is of the form [tex](z - \lambda)^{\dim V}[/tex] for some [tex]\lambda \in \mathbb{C}[/tex].
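
    To illustrate the statement in the smallest nontrivial case (my own example, not part of the problem): on [tex]V = \mathbb{C}^2[/tex], the diagonalizable operator

    [tex]T = \begin{pmatrix} \lambda & 0 \\ 0 & \mu \end{pmatrix}, \quad \lambda \neq \mu,[/tex]

    has minimal polynomial [tex](z - \lambda)(z - \mu)[/tex], which is not of the form [tex](z - \nu)^{2}[/tex], and correspondingly [tex]V = \operatorname{span}(e_1) \oplus \operatorname{span}(e_2)[/tex] is a decomposition into two proper T-invariant subspaces.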

    2. Relevant equations



    3. The attempt at a solution

    First suppose that the minimal polynomial of T is [tex]p(z) = (z - \lambda)^{n}[/tex] where n = dim V. Suppose further, for contradiction, that [tex]V = U \oplus W[/tex] where U and W are T-invariant proper subspaces of V. If either summand were the zero subspace, the other would be all of V and hence not proper, so [tex]\dim U \geq 1[/tex] and [tex]\dim W \geq 1[/tex]. Since n = dim U + dim W, we have [tex]\dim U \leq n-1[/tex] and [tex]\dim W \leq n-1[/tex].
    Now the minimal polynomial of T is [tex]p(z) = (z - \lambda)^{n}[/tex], so [tex](T - \lambda I)^{n}v = 0[/tex] for all v in V, but there is at least one v in V such that [tex](T - \lambda I)^{n-1}v \neq 0[/tex]. Because of the decomposition of V, we can write v = u + w for some u in U and some w in W. Applying [tex](T - \lambda I)^{n-1}[/tex] to both sides we get:

    [tex](T - \lambda I)^{n-1}v = (T - \lambda I)^{n-1}u + (T - \lambda I)^{n-1}w \neq 0[/tex]

    Since both U and W are invariant under T, we have [tex]Tu \in U[/tex] and [tex]-\lambda u \in U[/tex], so [tex](T - \lambda I)u \in U[/tex]. Therefore U (and similarly W) is invariant under [tex](T-\lambda I)[/tex], and hence under [tex](T-\lambda I)^{n-1}[/tex]. Since [tex](T - \lambda I)^{n-1}v \neq 0[/tex], at least one of [tex](T - \lambda I)^{n-1}u[/tex] and [tex](T - \lambda I)^{n-1}w[/tex] must be nonzero (if both were zero, their sum would be zero). Assume it is the first one. So there exists u in U such that

    [tex](T - \lambda I)^{n-1}u \neq 0[/tex]. But [tex](T - \lambda I)^{n}u = 0[/tex] for all u in U. Restricting T to U, the minimal polynomial of [tex]T|_U[/tex] divides [tex](z - \lambda)^{n}[/tex] but does not divide [tex](z - \lambda)^{n-1}[/tex], so it equals [tex](z - \lambda)^{n}[/tex]. But the minimal polynomial of [tex]T|_U[/tex] has degree at most [tex]\dim U \leq n-1 < n[/tex], a contradiction.

    I hope the reasoning above is correct; if it is, it proves that if the minimal polynomial of T is [tex](z-\lambda)^{n}[/tex], then V cannot be decomposed into the direct sum of two proper T-invariant subspaces.
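
    As a sanity check on this direction (again my own example), take n = 2 and let T be the Jordan block

    [tex]T = \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}, \qquad T - \lambda I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}[/tex]

    on [tex]\mathbb{C}^2[/tex]. Then [tex](T - \lambda I)e_2 = e_1 \neq 0[/tex] while [tex](T - \lambda I)^2 = 0[/tex], so the minimal polynomial is [tex](z - \lambda)^{2} = (z - \lambda)^{\dim V}[/tex]; and since the only 1-dimensional T-invariant subspace is [tex]\operatorname{span}(e_1)[/tex], there is indeed no decomposition into two proper T-invariant subspaces.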

    However I'm stumped about how to prove the other direction.
    Also, I would like to know what the mistake is in the following reasoning:

    Since V is a complex vector space, T in L(V) has an eigenvalue [tex]\lambda[/tex], so V has a T-invariant subspace U of dimension 1 (the span of an eigenvector). Then there exists a subspace W of V such that [tex]V = U \oplus W[/tex]. It follows that W is T-invariant and of dimension [tex]\dim V - 1[/tex]. So it's always possible to decompose V into two proper T-invariant subspaces :S
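
    To probe this with the smallest case I can write down (the Jordan block above, my own check): the eigenspace is [tex]U = \operatorname{span}(e_1)[/tex], and one complement is [tex]W = \operatorname{span}(e_2)[/tex], but

    [tex]Te_2 = e_1 + \lambda e_2 \notin \operatorname{span}(e_2),[/tex]

    so at least this particular complement fails to be T-invariant.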

    Thank you :)
     
    Last edited: Nov 22, 2008