Eigenvalue Proof of U if T(A)=UA

In summary, it was shown that c is an eigenvalue of the operator T defined by T(A)=UA, where U is a fixed nxn matrix, if and only if c is an eigenvalue of U. The forward direction was proven by showing that if T(X)=UX=cX for a nonzero matrix X, then any nonzero column of X is an eigenvector of U; conversely, a matrix X whose columns are all an eigenvector v of U satisfies UX=cX.
  • #1
evilpostingmong

Homework Statement


Let U be a fixed nxn matrix and consider the operator T: M_{n,n} → M_{n,n}
given by T(A)=UA.
Show that c is an eigenvalue of T if and only if it is an eigenvalue of U.

Homework Equations





The Attempt at a Solution


If T(A)=UA then T(A)-UA=0, so (T-U)A=0.
Let v be an eigenvector of T so Tv=cv.
If v is an eigenvector ∈ A then A
is not a zero matrix, so for (T-U)A=0 we
have Tv-Uv=0, so cv-Uv=0 and
Uv=cv, so c must be an eigenvalue of U for
Uv=Tv=cv and for Uv-Tv=0 for v ∈ A.
 
  • #2
You are going to have to rethink that whole thing. You have a concept wrong. T is an operator, not an nxn matrix. So writing something like (T-U) makes no sense at all. Furthermore, an eigenvalue of T is a number c such that T(X)=cX for some nonzero nxn matrix X. Get it? Eigenvectors of T are actually MATRICES.

Start it this way. If c is an eigenvalue of T, then there is a matrix X such that T(X)=cX=UX. Now you want to show c is an eigenvalue of U.
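This forward direction can be sanity-checked on a concrete example. A minimal sketch in plain Python, with U, c, and X hand-picked for illustration (not from the thread): if UX=cX for a nonzero X, then any nonzero column v=Xu of X satisfies Uv=cv.

```python
# Forward direction on a toy example: if T(X) = UX = cX for a nonzero
# matrix X, then any nonzero column v = Xu of X is an eigenvector of U
# with the same eigenvalue c.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, u):
    """Apply matrix A to column vector u."""
    return [sum(A[i][k] * u[k] for k in range(len(u))) for i in range(len(A))]

U = [[2, 0],
     [0, 3]]
c = 2
X = [[1, 1],   # both columns are multiples of the eigenvector (1, 0),
     [0, 0]]   # so X is an "eigenmatrix" of T(A) = UA

assert matmul(U, X) == [[c * x for x in row] for row in X]  # T(X) = UX = cX

u = [1, 0]          # any u with Xu nonzero works (X is nonzero, so one exists)
v = matvec(X, u)    # v = Xu = (1, 0)
assert matvec(U, v) == [c * vi for vi in v]  # Uv = cv, so c is an eigenvalue of U
```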
 
  • #3
Dick said:
You are going to have to rethink that whole thing. You have a concept wrong. T is an operator, not an nxn matrix. So writing something like (T-U) makes no sense at all. Furthermore, an eigenvalue of T is a number c such that T(X)=cX for some nonzero nxn matrix X. Get it? Eigenvectors of T are actually MATRICES.

Start it this way. If c is an eigenvalue of T, then there is a matrix X such that T(X)=cX=UX. Now you want to show c is an eigenvalue of U.
Knowing that T(X)-cX=0, where X is an eigenvector of T and c is an eigenvalue of T,
T(X)=cX, but since T(X)=UX, UX=cX, therefore
UX-cX=0=T(X)-cX (this equation shows that for T(X) to equal UX, they must have the
same eigenvalues).

Yeah for that first attempt, I got T mixed up with a matrix representation of T.
 
  • #4
UX=cX doesn't really tell you much directly. To say c is an eigenvalue of U, you want to show there is a VECTOR v such that Uv=cv. X is a matrix. Hint: X is a nonzero matrix (why nonzero?).
 
  • #5
Dick said:
UX=cX doesn't really tell you much directly. To say c is an eigenvalue of U, you want to show there is a VECTOR v such that Uv=cv. X is a matrix. Hint: X is a nonzero matrix (why nonzero?).

X is nonzero because if T(X)=cX, and c is an eigenvalue, then X must
not be a zero matrix since eigenvectors are nonzero.

Hang on, this will get a bit weird:
Consider V of dimension n.
The basis of V is <v1...vn> with each vi a 1xn column with a single nonzero
element (vi ) at row i. If vi is the eigenvector for c, then we let all other vectors
be multiplied by 0 and add them up (direct sum) to get the nxn matrix with vi as the only nonzero,
which equates to vi. So X=vi and so T(X)=cX=T(vi)=cvi since UX=cX and cX=cvi,
UX=cvi, and since X=vi, Uvi=cvi. Therefore, Uvi-cvi=0=T(vi)-cvi
and for this to be true, Uvi must share its eigenvalue with T(vi), which it does.

Why does X=vi? Because vi is single column matrix and X is an nxn matrix
with one nonzero element.
 
  • #6
I think you've got the right idea but you did express it oddly. If X is a nonzero linear transformation, then there must be a vector u such that Xu is nonzero. That's about all there is to it. Then v=Xu must be an eigenvector of U, right? Don't forget there is still a converse to prove.
 
  • #7
Dick said:
I think you've got the right idea but you did express it oddly. If X is a nonzero linear transformation, then there must be a vector u such that Xu is nonzero. That's about all there is to it. Then v=Xu must be an eigenvector of U, right? Don't forget there is still a converse to prove.

I know, it was weird :biggrin:. I'll be back later on to tackle that part.
 
  • #8
Ok, I'm back. The converse is: if c is an eigenvalue of U, then it is also an eigenvalue of
T, where T(X)=UX.

Knowing that UX=cX, c is an eigenvalue so X is a nonzero nxn matrix. Proving that
vi exists in a similar manner as before, we get Uvi=cvi and since vi=X,
and T(X)=UX=cX, T(vi)=Uvi=cvi. Thus T(vi)=cvi so c is an eigenvalue of T
so T(vi)-cvi=0=Uvi-cvi.
 
  • #9
Stop with the first statement! You DON'T know UX=cX. You don't even know what X is. You are given that U has an eigenvalue c, not T. What you do know is that there is a vector v such that Uv=cv. Pay attention to the premise, ok? Now try and find an X such that UX=cX.
 
  • #10
Dick said:
Stop with the first statement! You DON'T know UX=cX. You don't even know what X is. You are given that U has an eigenvalue c, not T. What you do know is that there is a vector v such that Uv=cv. Pay attention to the premise, ok? Now try and find an X such that UX=cX.

Since c is an eigenvalue, v must be a nonzero vector. Since Uv is
an nxn matrix of one nonzero element (cv). Now, factoring out the scalar
c gives the scalar *a matrix containing v which we call X so we have cX.
Since U*v produced the nxn matrix, and cX is the result of "pullng the c out",
multiplying c and X should give the same product as Uv so Uv=cX.
Since Uv-cX=0 and Uv-cv=0 Uv-cX=0=Uv-cv and
-cX-cv=Uv-Uv so cX-cv=0 so X must=v therefore since
Uv=cv, UX=cX since v=X.

To visualize what I meant by "pulling the c out" take an nxn matrix
and put cv in an entry. Then factor out the c and we are left with
c*the matrix with v instead of cv.

That is one ugly proof :bugeye:
 
  • #11
It's not even a proof, it's gibberish. Let's focus on one part. You said v=X. You know that's nonsense, right? v is a vector and X is a matrix. Pay attention to what things are and if you say a thing of one type equals a thing of another type, recoil in horror. Don't even write it down. In what sense can a linear transformation "equal" a vector? Equal in super quotes. There is a good answer. Think really hard.
 
  • #12
Dick said:
It's not even a proof, it's gibberish. Let's focus on one part. You said v=X. You know that's nonsense, right? v is a vector and X is a matrix. Pay attention to what things are and if you say a thing of one type equals a thing of another type, recoil in horror. Don't even write it down. In what sense can a linear transformation "equal" a vector. Equal in super quotes. There is a good answer. Think really hard.

well in this case when Tv=jv with jv being the nxn matrix with only jv as
the nonzero which is "equivalent" to the vector jv (chose j because
we don't know if c is an eigenvalue of Tv).
 
  • #13
I'm not sure I understand what that means. But here's another hint. Suppose you could find a linear transformation X such that X acting on ANY vector u would give you a multiple of v (where Uv=cv). Is that what you are looking for? That's sort of like "X=v". If that's what you want, can you define one? Remember you can define a linear transformation by specifying its action on a basis.
 
  • #14
Dick said:
I'm not sure I understand what that means. But here's another hint. Suppose you could find a linear transformation X such that X acting on ANY vector u would give you a multiple of v (where Uv=cv). Is that what you are looking for? That's sort of like "X=v". If that's what you want, can you define one? Remember you can define a linear transformation by specifying its action on a basis.

Actually that was kind of what I meant by X=v. And remember when I said
at the way beginning (first post) that T-U deal? I imagined the T to be
your X or a representation as a matrix, but I didn't really specify that.
And in the v in the proof where I said v=X
v was supposed to be a single column matrix representation of v,
my fault for not pointing that out.
 
  • #15
I THOUGHT that was what you meant. But I can't read through the "Uv-cv=0 Uv-cX=0=Uv-cv" stuff you are posting as a proof. Here's one clear way to do it. "Pick a basis {v1,v2,...,vn}. Define X as a linear transformation such that X(vi)=v for all i." Do you see how that makes X an eigenvector of T with eigenvalue c? You could also say "X(v1)=v and X(vi)=0 for i>1", which I think is your single-column way. I think you are thinking the right things, but your proof statements just don't convey that.
 
  • #16
Dick said:
I THOUGHT that was what you meant. But I can't read through the "Uv-cv=0 Uv-cX=0=Uv-cv" stuff you are posting as a proof. Here's one clear way to do it. "Pick a basis {v1,v2,...,vn}. Define X as a linear transformation such that X(vi)=v for all i." Do you see how that makes X an eigenvector of T with eigenvalue c? You could also say "X(v1)=v and X(vi)=0 for i>1", which I think is your single-column way. I think you are thinking the right things, but your proof statements just don't convey that.

I did a similar method to that in this proof:

Hang on, this will get a bit weird:
Consider V of dimension n.
The basis of V is <v1...vn> with each vi a 1xn column with a single nonzero
element (vi ) at row i. If vi is the eigenvector for c, then we let all other vectors
be multiplied by 0 and add them up (direct sum) to get the nxn matrix with vi as the only nonzero,
which equates to vi. So X=vi and so T(X)=cX=T(vi)=cvi since UX=cX and cX=cvi,
UX=cvi, and since X=vi, Uvi=cvi. Therefore, Uvi-cvi=0=T(vi)-cvi
and for this to be true, Uvi must share its eigenvalue with T(vi), which it does.

Why does X=vi? Because vi is single column matrix and X is an nxn matrix
with one nonzero element. But for the second part, I didn't explicitly define a basis for that vector
represented by a column matrix. But, I will modify it a bit.
Okay using Dick's basis <v1...vn> we represent a vector with a column matrix
(lets use v1 as the lonely nonzero). This will be called I(v1). Now we multiply
to get UI(v1). Factoring out c1,1 gives c1,1*a nxn matrix with v1 as the
lonely nonzero which we will call X. Now UI(v1)=cX. Since I(v1) is
a matrix with one nonzero element (v1) as is the case with X,
and multiplying X by U will produce the same product as multiplying I(v1)
by U, X=I(v1) so UX=cX.
 
  • #17
Why are you making this so complicated? If you want to talk in terms of matrices, to get an 'eigenmatrix' X with the same eigenvalue c as in Uv=cv, just define each column of X to be proportional to v. So the column space of X is just span{v}. I.e. X(u)=k(u)*v where k(u) is a constant depending on u. You can just say that; I think it's the same thing you are trying to say in a more complicated, not necessarily grammatical way. But that STILL doesn't mean you get to say X=v, or X=v1. They aren't equal.
 
  • #18
Dick said:
Why are you making this so complicated? If you want to talk in terms of matrices, to get an 'eigenmatrix' X with the same eigenvalue c as in Uv=cv, just define each column of X to be proportional to v. So the column space of X is just span{v}. I.e. X(u)=k(u)*v where k(u) is a constant depending on u. You can just say that; I think it's the same thing you are trying to say in a more complicated, not necessarily grammatical way. But that STILL doesn't mean you get to say X=v, or X=v1. They aren't equal.

Whoops, I edited it. And yes it makes no sense to compare a line with a representation of a line. They are different things, its like saying an apple is a tree.

But for the second part, I didn't explicitly define a basis for that vector
represented by a column matrix. But, I will modify it a bit.
Okay using Dick's basis <v1...vn> we represent a vector with a column matrix
(lets use v1 as the lonely nonzero). This will be called I(v1). Now we multiply
to get UI(v1). Factoring out c1,1 gives c1,1*a nxn matrix with v1 as the
lonely nonzero which we will call X. Now UI(v1)=cX. Since I(v1) is
a matrix with one nonzero element (v1) as is the case with X,
and multiplying X by U will produce the same product as multiplying I(v1)
by U, X=I(v1) so UX=cX.
 
  • #19
Where v1 is the eigenvector of U, right? Say so. You mean I(v1) to be the matrix such that c1,1=1 and all of the rest of the elements are zero in the basis {v1,...,vn}, right? See? It's still confusing me. If so that's fine. I've been trying to get you to state the same thing using fewer words and symbols, but that doesn't seem to be happening. But the proof is essentially correct and that's the important part.
 
  • #20
Dick said:
Where v1 is the eigenvector of U, right? Say so. You mean I(v1) to be the matrix such that c1,1=1 and all of the rest of the elements are zero in the basis {v1,...,vn}, right? See? It's still confusing me. If so that's fine. I've been trying to get you to state the same thing using fewer words and symbols, but that doesn't seem to be happening. But the proof is essentially correct and that's the important part.

You are absolutely right. Now I realize how delicate proofs can be in that
I can make ONE mistake (ie saying that a matrix equals a vector) and that
puts an arrow through the proof's heart, even if the proof is otherwise correct.
Had I mentioned the fact that I was representing the vector in matrix form,
this thread would've been done a long time ago. And I need to use
fewer words and get to the point. This will only get better with practice,
as is any skill.

Now, what I meant is that I(v1) is a column matrix with I a column
with a 1 at 1,1 and 0 everywhere else.
This column matrix gets multiplied by U to get UI(v1).
Now, at position 1,1 of U is the eigenvalue c1,1 and after
multiplying we have c1,1*v1 at position 1,1 and 0 everywhere else
on the matrix. Now we have c1,1*I(v1) with I(v1) a nxn matrix
(it turned into one) with v1 at 1,1 and 0 everywhere else. So UI(v1)=c1,1*I(v1)
 
  • #21
Yes, yes, yes. I've been agreeing that you have the right idea all along. Just simplify it. And supply the necessary background. Like a description of the basis you are using for the matrix, which you still aren't doing. Can't you think of a simpler way to describe a linear transformation whose image vectors are all multiples of v than that? You agreed you should use fewer words, so do it. I can do it using very few words. Do you want to see it? Though I already did it in post 15.
 
  • #22
Dick said:
Yes, yes, yes. I've been agreeing that you have the right idea all along. Just simplify it. And supply the necessary background. Like a description of the basis you are using for the matrix, which you still aren't doing. Can't you think of a simpler way to describe a linear transformation whose image vectors are all multiples of v than that? You agreed you should use fewer words, so do it. I can do it using very few words. Do you want to see it? Though I already did it in post 15.

We have a basis <v1...vn> for 1<=i<=n (CS notation lol)
we have a column matrix I(vi) I is the identity matrix with 1 at row i.
UI(vi)=ci,i*I(vi)

How's this?
 
  • #23
You are still stuck in your particular way of seeing the proof and you are still not expressing it in a way anyone reading it could understand. Except me. Because I have a secret understanding of what you are trying to say.

Try this. Let <v1...vn> be ANY basis. Let v be an eigenvector of U, so Uv=cv. Define X(vi)=v for EVERY element of the basis. Doesn't that define a transformation X which is an eigenvector of T, with UX=cX? Few words and no unnecessary details, right? Do you follow that? Do you think another person might?
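That converse construction can be checked numerically too. A sketch in plain Python (U, c, and v below are hand-picked for illustration): build X so that every column is the eigenvector v, i.e. X(vi)=v on each basis vector, and verify UX=cX.

```python
# Converse direction on a toy example: given Uv = cv, define X whose
# columns are all v (equivalently, X sends every basis vector to v).
# Then X is nonzero, its column space is span{v}, and UX = cX,
# so c is an eigenvalue of the operator T(A) = UA.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

U = [[2, 1],
     [0, 3]]
c = 3
v = [1, 1]  # (U - 3I)v = 0, so Uv = cv

n = len(v)
X = [[v[i] for _ in range(n)] for i in range(n)]  # every column of X is v

assert matmul(U, X) == [[c * x for x in row] for row in X]  # UX = cX = T(X)
```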
 
  • #24
Dick said:
You are still stuck in your particular way of seeing the proof and you are still not expressing it in a way anyone reading it could understand. Except me. Because I have a secret understanding of what you are trying to say.

Try this. Let <v1...vn> be ANY basis. Let v be an eigenvector of U, so Uv=cv. Define X(vi)=v for EVERY element of the basis. Doesn't that define a transformation X which is an eigenvector of T, with UX=cX? Few words and no unnecessary details, right? Do you follow that? Do you think another person might?

That makes sense, thank you!
 

1. What is an eigenvalue?

An eigenvalue is a scalar value that represents the scaling factor by which a matrix transforms an eigenvector. In other words, when a matrix is multiplied by its corresponding eigenvector, the result is a scaled version of the original eigenvector.
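For instance (matrix and vector chosen purely for illustration):

```python
# A scales its eigenvector v = (1, 1) by the eigenvalue c = 3:
# Av = (3, 3) = 3v.
A = [[2, 1],
     [0, 3]]
v = [1, 1]
Av = [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]
assert Av == [3 * x for x in v]  # Av = 3v
```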

2. How is the eigenvalue proof of U if T(A)=UA used in science?

The eigenvalue proof of U if T(A)=UA is used in science to understand and analyze linear transformations in various fields such as physics, engineering, and economics. It helps to identify important properties of a matrix and its corresponding transformation, making it a valuable tool for solving complex problems and understanding real-world phenomena.

3. What is the importance of the eigenvalue proof of U if T(A)=UA in linear algebra?

The eigenvalue proof of U if T(A)=UA is an essential concept in linear algebra as it provides a way to decompose a matrix into its eigenvalues and eigenvectors. This decomposition allows for the simplification and analysis of complex linear transformations, making it a fundamental tool in solving systems of linear equations and understanding the behavior of matrices.

4. How is the eigenvalue proof of U if T(A)=UA related to the concept of diagonalization?

The eigenvalue proof of U if T(A)=UA is closely related to the concept of diagonalization. Diagonalization is the process of transforming a matrix into a diagonal matrix, where the entries outside of the main diagonal are all zero. This process involves finding the eigenvalues and eigenvectors of a matrix and using them to construct a diagonal matrix. Therefore, the eigenvalue proof of U if T(A)=UA is an essential step in the diagonalization process.
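A minimal sketch of that diagonalization step (the matrices below are assumed for illustration): with the columns of P taken to be eigenvectors of U, the product P⁻¹UP is a diagonal matrix carrying the eigenvalues.

```python
# Diagonalization sketch: U has eigenvalues 2 and 3 with eigenvectors
# (1, 0) and (1, 1); stacking the eigenvectors as columns of P gives
# D = P^{-1} U P, a diagonal matrix of eigenvalues.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

U = [[2, 1],
     [0, 3]]
P = [[1, 1],
     [0, 1]]       # columns: eigenvectors (1, 0) and (1, 1)
P_inv = [[1, -1],
         [0, 1]]   # inverse of P, verified below

assert matmul(P_inv, P) == [[1, 0], [0, 1]]

D = matmul(P_inv, matmul(U, P))
assert D == [[2, 0], [0, 3]]  # diagonal matrix of eigenvalues
```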

5. Can the eigenvalue proof of U if T(A)=UA be applied to non-square matrices?

No, the eigenvalue proof of U if T(A)=UA can only be applied to square matrices. This is because the eigenvalues and eigenvectors of a matrix are only defined for square matrices. Non-square matrices do not have the necessary properties for the eigenvalue proof to hold.
