Proving T is a Scalar Multiple of the Identity Operator in Linear Algebra

evilpostingmong

Homework Statement


Suppose S, T ∈ the set of transformations from V to V are such
that every subspace of V with dimension dim V - 1 is invariant under T.
Prove that T is a scalar multiple of the identity operator.

Homework Equations



T = λI

The Attempt at a Solution


u ∈ U, U ⊂ V
dim V = k, dim U = k-1
Iλu = Tu
Tu - Iλu = 0
Since Iu = u
and Tu = λu
Tu = TIu = Iλu
So TIu = Iλu
with T = λ
 
You are assuming what you want to prove and then going in circles. Be concrete. Let dim(V)=n and pick a basis {v1,v2,...,vn}. Write e.g. T(v1)=a1*v1+a2*v2+...+an*vn. Can you show a2=0, a3=0, ... an=0? Consider the subspaces of V spanned by removing one of the v's from the basis. If you can then you can show T of any basis element is a multiple of itself. Now can you show the multiplication factor for any basis element is the same?
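For reference, Dick's setup can be written out in symbols. The label U_j below is not from the thread; it is just a convenient name for the subspace you get by dropping v_j from the basis:

```latex
% Fix a basis {v_1, ..., v_n} of V and expand T(v_1) in it:
T(v_1) = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n ,
\qquad
U_j := \operatorname{span}\{\, v_i : i \neq j \,\}, \quad \dim U_j = n - 1 .
% Each U_j is invariant under T by the hypothesis of the problem.
```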
 
Dick said:
You are assuming what you want to prove and then going in circles. Be concrete. Let dim(V)=n and pick a basis {v1,v2,...,vn}. Write e.g. T(v1)=a1*v1+a2*v2+...+an*vn. Can you show a2=0, a3=0, ... an=0? Consider the subspaces of V spanned by removing one of the v's from the basis. If you can then you can show T of any basis element is a multiple of itself. Now can you show the multiplication factor for any basis element is the same?

Okay I'll use your basis {v1,v2,...,vn}
Choose v in V.
v=c1v1+...+cnvn
I'll use T(v1)=T(c1v1)+...+T(cnvn)
Consider a subspace U1 of V and v1 in U1
For T to be a multiple of v1 we need
T(v1)=λc1v1+0*c2v2+...+0*cnvn
since we want to map from V to U1 and not U2 etc
we need a multiple of v1 since Tv2 etc is not in U1
So v2+...+vn is in nullT of v1
T(v1)=λc1v1+0+...+0
T(v1)=λc1v1
Since Iv1=v1
T(Iv1)=λIv1
 
That makes no sense whatsoever. Take the case n=3. T(v1)=c1*v1+c2*v2+c3*v3. v1 is in the span of {v1,v2} and that 2 dimensional subspace is invariant. So T(v1) is in the span of {v1,v2}. What does that tell you about c3? Why? Try not to just write a bunch of gibberish.
 
c3 should be 0 because v3 is not in {v1, v2} but 0 is. So 0*v3=0
and c1v1+c2v2+0=c1v1+c2v2 in the span {v1, v2}
So to go from {v1, v2} back to itself
with T(v1)=c1v1+c2v2+c3v3 with c3=0 since
c3v3 is not in the span of {v1, v2} if c3=/=0
 
Two specific examples of things that make no sense are these:
with T = λ
T is a transformation, and λ is an eigenvalue, a number. The two are incomparable.
For T to be a multiple of v1 we need
v1 is a vector. T operates on vectors, but isn't a vector, so can't be a scalar multiple of a vector.
 
evilpostingmong said:
c3 should be 0 because v3 is not in {v1, v2} but 0 is. So 0*v3=0
and c1v1+c2v2+0=c1v1+c2v2 in the span {v1, v2}
So to go from {v1, v2} back to itself
with T(v1)=c1v1+c2v2+c3v3 with c3=0 since
c3v3 is not in the span of {v1, v2} if c3=/=0

That's not super clear, but you have the right idea. T(v1) is in the span. The span of {v1,v2} is the set of all vectors a*v1+b*v2 and {v1,v2,v3} is a basis. So c3=0. Now think about the invariant subspace spanned by {v1,v3}.
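In symbols, the step being confirmed here (just the n = 3 case written out; a and b are names for the coordinates of T(v1) in the invariant plane, not notation from the thread):

```latex
% span{v_1, v_2} has dimension 2, so it is invariant and T(v_1) lies in it:
T(v_1) = c_1 v_1 + c_2 v_2 + c_3 v_3 = a v_1 + b v_2 \ \text{for some scalars } a, b .
% Subtracting and using the linear independence of the basis {v_1, v_2, v_3}:
(c_1 - a) v_1 + (c_2 - b) v_2 + c_3 v_3 = 0 \;\Longrightarrow\; c_3 = 0 .
```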
 
Dick said:
That's not super clear, but you have the right idea. T(v1) is in the span. The span of {v1,v2} is the set of all vectors a*v1+b*v2 and {v1,v2,v3} is a basis. So c3=0. Now think about the invariant subspace spanned by {v1,v3}.
For {v1,v3}
T(v3)=a1v1+a2v2+a3v3+a4v4
with a2 and a4=0
T(v3)=a1v1+0+a3v3+0=a1v1+a3v3 in {v1, v3}
Let's give the proof another shot.
We use the basis {v1...vn-1}
T(v)=a1v1+...+an-1vn-1+anvn
Setting this equation to 0
0=a1v1+...+an-1vn-1+anvn
-anvn=a1v1+...+an-1vn-1
knowing that anvn ∉ {v1...vn-1}
an=0
so 0=a1v1+...+an-1vn-1
since the elements of any basis are linearly independent
we can only have every "a" be equal to zero to get
a1v1+...+an-1vn-1 to be 0
So each T(vi)=aivi with ai=0
Thus a1...an-1=0
So T(v1)+...+T(vn-1)
=0*(v1)+...+0*(vn-1)
=0*(v1+...+vn-1)
 
evilpostingmong said:
Let's give it another shot.
We use the basis {v1...vn-1}
T(v)=a1v1+...+an-1vn-1+anvn
Setting this equation to 0
0=a1v1+...+an-1vn-1+anvn
-anvn=a1v1+...+an-1vn-1
knowing that anvn ∉ {v1...vn-1}
an=0
so 0=a1v1+...+an-1vn-1
since the elements of any basis are linearly independent
we can only have every "a" be equal to zero to get
a1v1+...+an-1vn-1 to be 0
So each T(vi)=aivi with ai=0
Thus a1...an-1=0
So T(v1)+...+T(vn-1)
=0*(v1)+...+0*(vn-1)
=0*(v1+...+vn-1)

What is 'v'?? Why do you think you can set T(v)=0?? You are way off track.
 
  • #10
evilpostingmong said:
For {v1,v3}
T(v3)=a1v1+a2v2+a3v3+a4v4
with a2 and a4=0
T(v3)=a1v1+0+a3v3+0=a1v1+a3v3 in {v1, v3}

Stick with n=3. And stick with T(v1)=c1*v1+c2*v2+c3*v3. You've already shown c3=0 by considering the invariant subspace spanned by {v1,v2}. I hope. If not think about it again. Now consider the invariant subspace spanned by {v1,v3}.
 
  • #11
Dick said:
Stick with n=3. And stick with T(v1)=c1*v1+c2*v2+c3*v3. You've already shown c3=0 by considering the invariant subspace spanned by {v1,v2}. I hope. If not think about it again. Now consider the invariant subspace spanned by {v1,v3}.

Oops, lol my mistake. The main space is only of dim 3, v4 doesn't even exist here.
That's what you mean by n=3, right? One question. T(v1)=a1v1+a2v2+a3v3
and we want to go from {1,3} to {1,3}. Though it is obvious that a2=0 since v2
is not in {1,3}, we are only dealing with v1. What I mean is that
T(v1) should be a1*v1, so shouldn't a3 also be 0? This will help me
when I attempt the proof again (yeah, I don't quit lol).
 
  • #12
Yes, that's what I mean by n=3. If you can really understand how the proof works for n=3 then you should be able to do it for any n. Sure the invariance of span{v1,v2} implies c3=0 and the invariance of span{v1,v3} implies c2=0. So T(v1)=c1*v1. There is nothing special about v1 vs v2 or v3. So clearly, T(v2)=c2*v2 and T(v3)=c3*v3. Don't attempt the proof again until you really get this. Otherwise you'll just post a sequence of random symbols, if past experience is any judge. Now you have to show c1=c2=c3. Big hint: span{v1+v2,v3} is an invariant subspace, since it has dimension 2.
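A quick numerical illustration of why that particular subspace is the useful one (this is only a sanity check, not part of the thread; the example matrices and the is_invariant helper are made up for illustration): a diagonal but non-scalar map leaves the coordinate planes invariant, yet fails on span{e1+e2, e3}.

```python
import numpy as np

def is_invariant(A, B, tol=1e-10):
    """Return True if the column space of B is invariant under A,
    i.e. A maps every vector in span(B) back into span(B)."""
    AB = A @ B
    # Least-squares projection of the columns of A @ B onto span(B);
    # the subspace is invariant exactly when the residual vanishes.
    coeffs, *_ = np.linalg.lstsq(B, AB, rcond=None)
    return np.linalg.norm(AB - B @ coeffs) < tol

# A scalar multiple of the identity leaves every subspace invariant.
scalar = 2.5 * np.eye(3)

# A diagonal but non-scalar map fixes the coordinate planes,
# but not the "mixed" plane span{e1+e2, e3}.
diag = np.diag([1.0, 2.0, 3.0])

planes = {
    "span{e1, e2}":    np.array([[1., 0.], [0., 1.], [0., 0.]]),
    "span{e1, e3}":    np.array([[1., 0.], [0., 0.], [0., 1.]]),
    "span{e1+e2, e3}": np.array([[1., 0.], [1., 0.], [0., 1.]]),
}

for name, B in planes.items():
    print(f"{name:16s} scalar: {is_invariant(scalar, B)}  diag: {is_invariant(diag, B)}")
# diag fails only on span{e1+e2, e3} -- the subspace the hint points at.
```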
 
  • #13
Dick said:
Yes, that's what I mean by n=3. If you can really understand how the proof works for n=3 then you should be able to do it for any n. Sure the invariance of span{v1,v2} implies c3=0 and the invariance of span{v1,v3} implies c2=0. So T(v1)=c1*v1. There is nothing special about v1 vs v2 or v3. So clearly, T(v2)=c2*v2 and T(v3)=c3*v3. Don't attempt the proof again until you really get this. Otherwise you'll just post a sequence of random symbols, if past experience is any judge. Now you have to show c1=c2=c3. Big hint: span{v1+v2,v3} is an invariant subspace, since it has dimension 2.

Alright, just to be sure that I post something that makes sense
:-p am I seeing this clearly: we have {v1, v3}, a
basis for some subspace of the space with basis {v1,v2,v3}.
We take T(v1)=c1v1+c2v2+c3v3. c2=0 which we know.
But I would like to know if this is right: c3=0 because
T(v1)=c1v1. T is only acting on v1. So we get T(v1)=T(v1)+T(v2)+T(v3)
=c1v1+c2v2+c3v3=c1v1+0*v2+0*v3
=c1v1+0+0=c1v1
 
  • #14
Ack! No, c3=0 BECAUSE T(v1) is in the span of {v1,v2} and span{v1,v2} is invariant since its dimension=2. Not c3=0 BECAUSE T(v1)=c1*v1. We don't know that yet. You keep interchanging premise with consequence. Proofs mean you go from what you know to what you don't know. Not the reverse. No offence, but you seem to be kind of tone-deaf about this.
 
  • #15
Ok for the span {v1, v3}
T(v1)=c1v1+c2v2+c3v3
c2=0 since v2 is not in {v1, v3}
which leaves T(v1)=c1v1+0+c3v3
T(v1)=T(v1)+T(v3)
T(v1)-T(v1)=T(v3)
0=T(v3)
0=c3v3
since v3 is a member of the basis, it=/=0
so c3=0
I'm not sure about this one, so I won't continue it
without reassurance.
 
  • #16
evilpostingmong said:
Ok for the span {v1, v3}
T(v1)=c1v1+c2v2+c3v3
c2=0 since v2 is not in {v1, v3}
which leaves T(v1)=c1v1+0+c3v3

You are sort of ok this far. The language is pretty informal and you aren't giving all the justifications, but you gave a reason WHY c2=0. Good.

T(v1)=T(v1)+T(v3)
T(v1)-T(v1)=T(v3)
0=T(v3)
0=c3v3
since v3 is a member of the basis, it=/=0
so c3=0
I'm not sure about this one, so I won't continue it
without reassurance.

Wrong. You changed c1*v1 into T(v1) and c3*v3 into T(v3) without giving any reason why. Because there is no reason why. You just changed it. Go back to before this misstep and tell me why c3=0. And don't write down ANYTHING without saying 'because' and giving a valid reason.
 
  • #17
Dick said:
You are sort of ok this far. The language is pretty informal and you aren't giving all the justifications, but you gave a reason WHY c2=0. Good.



Wrong. You changed c1*v1 into T(v1) and c3*v3 into T(v3) without giving any reason why. Because there is no reason why. You just changed it. Go back to before this misstep and tell me why c3=0. And don't write down ANYTHING without saying 'because' and giving a valid reason.

Okay. Let's start from there.
T(v1)=c1v1+0+c3v3
Now mapping from {v1, v2} to {v1,v2}
T(v1)=c1v1+c2v2+c3v3
we found that c2v2=0 since c2=0
and because c3v3 is not in the span {v1,v2}
(it is not a linear combination of v1 and v2)
unless c3=0, c3 must be zero.
Now, we found that c2 and c3 are both zero.
So T(v1)=c1v1+0v2+0v3=c1v1
Not done yet, just need to know if this is right.
 
  • #18
evilpostingmong said:
Okay. Let's start from there.
T(v1)=c1v1+0+c3v3
Now mapping from {v1, v2} to {v1,v2}
T(v1)=c1v1+c2v2+c3v3
we found that c2v2=0 since c2=0
and because c3v3 is not in the span {v1,v2}
(it is not a linear combination of v1 and v2)
unless c3=0, c3 must be zero.
Now, we found that c2 and c3 are both zero.
So T(v1)=c1v1+0v2+0v3=c1v1
Not done yet, just need to know if this is right.

Ok, I'll buy that.
 
  • #19
Dick said:
Ok, I'll buy that.
Okay, we're moving somewhere.
:smile:
For convenience I will copy paste the second part of the proof and
continue from there.

T(v1)=c1v1+0+c3v3
Now mapping from {v1, v2} to {v1,v2}
T(v1)=c1v1+c2v2+c3v3
we found that c2v2=0 since c2=0
and because c3v3 is not in the span {v1,v2}
(it is not a linear combination of v1 and v2)
unless c3=0, c3 must be zero.
Now, we found that c2 and c3 are both zero.
So T(v1)=c1v1+0v2+0v3=c1v1
Now we take T(v2)=c1v1+c2v2+c3v3
map from {v2, v3} to {v2, v3}
c2v2+c3v3 is a linear combination in {v2,v3}
Since we have found that T(v1)=c1v1
and c1v1 is not within {v2,v3}, c1=0 for the
same reasons as c2 and c3 are.
Thus T(v2)=0*v1+c2v2+0*v3
T(v2)=c2v2
Since c2=0, T(v2)=0*v2=0
T(v2)=0
Now that we found that c1=0,
T(v1)=c1*v1
T(v1)=0*v1
T(v1)=0
Now mapping from {1,3} to {1,3}
T(v3)=c1v1+c2v2+c3v3
Since we have shown that c1 and c2 are 0,
T(v3)=0*v1+0*v2+c3v3
T(v3)=c3v3
Since we have shown that c3=0
T(v3)=0*v3
T(v3)=0
So T(v1)=c1v1+c2v2+c3v3
T(v1)=T(v1)+T(v2)+T(v3)
Since T(v1),T(v2),T(v3) all =0
T(v1)=T(v1+v2+v3)
T(v1)=0*(v1+v2+v3)
can be shown for the others
 
  • #20
No, no, no. When you write T(v1)=c1*v1+c2*v2+c3*v3 and T(v2)=c1*v1+c2*v2+c3*v3 there is no implication that the c1 in T(v1) is the same as the c1 in T(v2). Otherwise, we'd be assuming T(v1)=T(v2). And we can't assume that. If it's causing too much confusion write T(v2)=d1*v1+d2*v2+d3*v3 and tell me what you can conclude about the d's.
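One way to keep the bookkeeping straight is to give every expansion its own doubly indexed coefficients (this notation is mine, not the thread's):

```latex
T(v_1) = a_{11} v_1 + a_{12} v_2 + a_{13} v_3 ,
\qquad
T(v_2) = a_{21} v_1 + a_{22} v_2 + a_{23} v_3 .
% Nothing relates a_{11} to a_{21}; the two expansions are independent,
% which is exactly the point about reusing the name c_1.
```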
 
  • #21
T(v2)=d1v1+d2v2+d3v3
mapping from {v1, v2} to {v1,v2}
T(v2)=d1v1+d2v2+0*v3 since v3 is not in {v1,v2} but 0 is.
T(v2)=d1v1+d2v2
mapping from {v2,v3} to {v2,v3}
T(v2)=d1v1+d2v2+d3v3
Since d1v1 is not in {v2, v3} d1=0
T(v2)=0+d2v2+d3v3
So T(v2)=0*v1+d2v2+0*v3
T(v2)=d2v2
T(v3)=f3v3 using the same method as before.
Now choose T(v)=T(v1)+T(v2)+T(v3)
since we don't know whether or not T(v1)=c1v1+d2v2+f3v3
mapping from {v1, v2} to {v1, v2} f3=0
T(v)=T(v1)+T(v2)+0*v3 now T(v3)=0
how is it so far?
 
  • #22
evilpostingmong said:
T(v2)=d1v1+d2v2+d3v3
mapping from {v1, v2} to {v1,v2}
T(v2)=d1v1+d2v2+0*v3 since v3 is not in {v1,v2} but 0 is.
T(v2)=d1v1+d2v2
mapping from {v2,v3} to {v2,v3}
T(v2)=d1v1+d2v2+d3v3
Since d1v1 is not in {v2, v3} d1=0
T(v2)=0+d2v2+d3v3
So T(v2)=0*v1+d2v2+0*v3
T(v2)=d2v2
T(v3)=f3v3 using the same method as before.
Now choose T(v)=T(v1)+T(v2)+T(v3)
since we don't know whether or not T(v1)=c1v1+d2v2+f3v3
mapping from {v1, v2} to {v1, v2} f3=0
T(v)=T(v1)+T(v2)+0*v3 now T(v3)=0
how is it so far?

Getting there. So far you have T(v1)=c*v1, T(v2)=d*v2, T(v3)=f*v3. None of c,d or f need to be zero. Stop trying to prove they are. Go back and look at what you are trying to prove. Would you agree that what you need to prove now is that c=d=f? Don't agree if you don't know why.
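For context, here is why a common scalar would finish the problem; this is a standard linearity step written out for reference, not something taken verbatim from the thread. If c = d = f = λ, then T agrees with λI on the basis, hence on all of V:

```latex
T(x_1 v_1 + x_2 v_2 + x_3 v_3)
  = x_1 T(v_1) + x_2 T(v_2) + x_3 T(v_3)
  = \lambda (x_1 v_1 + x_2 v_2 + x_3 v_3)
\quad \text{for all scalars } x_1, x_2, x_3,
\qquad \text{i.e. } T = \lambda I .
```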
 
  • #23
Dick said:
Getting there. So far you have T(v1)=c*v1, T(v2)=d*v2, T(v3)=f*v3. None of c,d or f need to be zero. Stop trying to prove they are. Go back and look at what you are trying to prove. Would you agree that what you need to prove now is that c=d=f? Don't agree if you don't know why.


Oh okay, that makes life easier. We want c=d=f so that we have a single scalar (an eigenvalue)
to factor out of the linear combination T(v1)+T(v2)+T(v3)=c1v1+d2v2+f3v3,
giving c1(v1+v2+v3) (and the same with d2 or f3).
I'm going to work now, I'll be back later tonight, I'll check back with you.
Don't want you to have to keep poking back here and finding I'm offline, lol.
 
  • #24
When you come back, there's a big hint to finishing in post 12. In case you've forgotten.
 
  • #25
Dick said:
When you come back, there's a big hint to finishing in post 12. In case you've forgotten.

You mean {v1+v2, v3} right? Ugh, I'm slow this morning, got to make use of this set.
Here's our arsenal: {v1+v2, v3}
T(v1)=c1v1 T(v2)=d2v2 T(v3)=f3v3
Wait! {v1, v2} is without v3, {v1, v3} is without v2, and {v2, v3} is without v1.
But within {v1+v2, v3} we have T(v1+v2)=T(v1)+T(v2) and T(v3).
mapping from {v1+v2, v3} to itself
T(v1+v2)+T(v3)=k1(v1+v2)+k2v3
note that v1+v2 gets a single common scalar k1; otherwise we could have
something like k1v1+0*v2, which is not in the span.
But since T(v1+v2)=T(v1)+T(v2), the constants must match up.
Since T(v1)=c1v1 and T(v2)=d2v2, c1=k1 and d2=k1, so c1=d2.
mapping from {v1, v2+v3} to itself
T(v1)+T(v2+v3)=j1v1+j2(v2+v3)
now T(v2+v3)=T(v2)+T(v3)
but T(v2+v3)=j2(v2+v3)=j2v2+j2v3=T(v2)+T(v3)
Since T(v2)=j2v2=k1v2=d2v2=c1v2 and T(v3)=j2v3,
with T(v3) sharing its constant j2 with T(v2),
that constant must be j2=k1=d2=c1.
So c1=d2=f3
 
  • #26
That's basically it. Since {v1+v2,v3} is invariant, T(v1+v2)=k1*(v1+v2)+k2*v3 and T(v1+v2)=c1*v1+d2*v2. There's no need to involve T(v3) as you did with "T(v1+v2)+T(v3)=k1(v1+v2)+k2v3". Well done.
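Written out, the step being summarized (n = 3; λ is just a name for the common value):

```latex
% span{v_1 + v_2, v_3} is 2-dimensional, hence invariant, so
T(v_1 + v_2) = k_1 (v_1 + v_2) + k_2 v_3 .
% By linearity and the earlier steps,
T(v_1 + v_2) = T(v_1) + T(v_2) = c_1 v_1 + d_2 v_2 .
% Comparing coefficients in the basis {v_1, v_2, v_3}:
c_1 = k_1, \qquad d_2 = k_1, \qquad k_2 = 0 \;\Longrightarrow\; c_1 = d_2 .
% Repeating with the invariant subspace span{v_2 + v_3, v_1} gives d_2 = f_3,
% so with \lambda := c_1 = d_2 = f_3 we get T(v_i) = \lambda v_i for each i,
% and therefore T = \lambda I.
```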
 
  • #27
Dick said:
That's basically it. Since {v1+v2,v3} is invariant, T(v1+v2)=k1*(v1+v2)+k2*v3 and T(v1+v2)=c1*v1+d2*v2. There's no need to involve T(v3) as you did with "T(v1+v2)+T(v3)=k1(v1+v2)+k2v3". Well done.
WOHOO!:cool::smile:
 