1. The problem statement, all variables and given/known data

Let V be a vector space over a field F and let L(V) be the vector space of linear transformations from V to V. Suppose that T is in L(V). Do not assume that V is finite-dimensional.

a) Prove that T^2 = -T iff T(x) = -x for all x in R(T).

b) Suppose that T^2 = -T. Prove that N(T) ∩ R(T) = {0}.

2. Relevant equations

3. The attempt at a solution

The T^2 notation is throwing me off slightly. Does it just mean composing T with itself, so that T^2(x) = T(T(x))?

So if T^2 = -T, then

T^2(x) = -T(x) for all x in V.

Also rank(T^2) = rank(-T).

That's all I have so far. How can I approach this if I can't assume that V is finite-dimensional?
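For part (a), one direction follows almost immediately from the definitions, assuming T^2 means the composition T∘T. Here is a sketch of the forward direction: suppose T^2 = -T, and take any x in R(T), so x = T(y) for some y in V (no dimension assumption is needed anywhere):

```latex
% Forward direction of (a): assume T^2 = -T and let x = T(y) \in R(T).
\begin{align*}
T(x) &= T(T(y)) \\
     &= T^{2}(y) \\
     &= -T(y)   \\
     &= -x.
\end{align*}
```

The reverse direction works the same way in reverse: if T(x) = -x for every x in R(T), then applying T to T(y) gives T^2(y) = -T(y) for every y in V. For part (b), note that a vector x in both N(T) and R(T) satisfies T(x) = 0 (from N(T)) and T(x) = -x (from part (a)), forcing x = 0.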

If I could get a few tips on how to start this question, it would be very helpful :)
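As a sanity check (not a proof, since it is finite-dimensional), here is a small Python example of my own choosing: the map on R^2 with matrix diag(0, -1) satisfies T^2 = -T, negates everything in its range, and has N(T) ∩ R(T) = {0}:

```python
def T(v):
    # Linear map on R^2 with matrix diag(0, -1); a hypothetical
    # finite-dimensional example satisfying T^2 = -T.
    x, y = v
    return (0.0, -y)

def neg(v):
    # Scalar multiplication by -1 on R^2.
    return (-v[0], -v[1])

# Check T^2 = -T on a few sample vectors.
for v in [(1.0, 0.0), (0.0, 1.0), (3.0, 5.0)]:
    assert T(T(v)) == neg(T(v))

# Every vector in R(T) (here, the multiples of (0, 1)) is negated by T.
w = T((3.0, 5.0))          # w = (0.0, -5.0) lies in R(T)
assert T(w) == neg(w)

# N(T) is the multiples of (1, 0), R(T) the multiples of (0, 1),
# so the only vector in both is (0, 0).
```

This only illustrates the statements; the actual proof has to work directly from the definitions, since V may be infinite-dimensional.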


**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Linear transformation from V to V proof
