Eigenvalues and one on diagonal matrices

Kayne
Hi there, I have some questions to ask about eigenvalues and one about diagonal matrices.

1- Can a square matrix exist without eigenvalues? Does there exist a square matrix without an eigenvector corresponding to each of its eigenvalues?

2- What is diagonalisation of a matrix, and where would it be useful?

3- When solving for an eigenvalue, what has to be found and what is given?

Thanks for any information

Kayne
 


Kayne said:
Hi there, I have some questions to ask about eigenvalues and one about diagonal matrices.

1- Can a square matrix exist without eigenvalues? Does there exist a square matrix without an eigenvector corresponding to each of its eigenvalues?

2- What is diagonalisation of a matrix, and where would it be useful?

3- When solving for an eigenvalue, what has to be found and what is given?

Thanks for any information

Kayne
These look to me like the kind of very elementary homework questions that would be asked to see if you are reading the textbook at all! Perhaps I am wrong, but if so, why do you want to know these things? Are you taking a course in linear algebra?
 


HallsofIvy,

I am studying linear algebra at uni, and since I have not done this for a long time, that is my reason for asking these questions, as I don't think the answers I gave were correct. My answers are below.


1- No. If there were eigenvectors v1 = (t, t, 0) and v2 = (s, -s, r), then using their dot product it can be seen that they are orthogonal: v1.v2 = (t, t, 0).(s, -s, r) = ts - ts + 0r = 0.
The eigenvectors corresponding to different eigenvalues are always orthogonal.
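
(As a quick numerical sketch of that dot product, assuming NumPy and picking arbitrary values t = 2, s = 3, r = 5 just for illustration:)

Code:
import numpy as np

t, s, r = 2.0, 3.0, 5.0           # arbitrary illustrative values
v1 = np.array([t, t, 0.0])
v2 = np.array([s, -s, r])
print(np.dot(v1, v2))             # prints 0.0, so v1 and v2 are orthogonal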


2- Diagonalisation of a matrix is when an nxn matrix possesses a set of n linearly independent eigenvectors. The matrix P whose n columns are linearly independent eigenvectors of an nxn matrix A will diagonalise A. All symmetric matrices are diagonalisable; however, not all matrices are. This can help in solving/simplifying difficult DE problems like x'' + 3x' + 2x = 0 for x(t), or a set of DEs from a numerical model of heat or chemical dispersal like

y1' = -y1 + y2
y2' = y1 - 2y2 + y3
y3' = y2 - 2y3 + y4
etc.

To diagonalise the above, you would solve the matrix below (a short numerical sketch follows it):
[-1,  1,  0, 0, 0]
[ 1, -2,  1, 0, 0]
[ 0,  1, -2, 1, 0]
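
(A rough numerical sketch of that diagonalisation, assuming NumPy and truncating the band above to a square 3x3 block just for illustration:)

Code:
import numpy as np

# 3x3 truncation of the nearest-neighbour coupling matrix above
A = np.array([[-1.0,  1.0,  0.0],
              [ 1.0, -2.0,  1.0],
              [ 0.0,  1.0, -2.0]])

eigvals, P = np.linalg.eig(A)     # columns of P are eigenvectors of A
D = np.linalg.inv(P) @ A @ P      # P^{-1} A P is (numerically) diagonal
print(np.round(D, 10))
print(eigvals)                    # the diagonal of D matches these eigenvalues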


3- An eigenvalue is going to be the stretching factor of matrix A; this is because Av = λv, where λ is the eigenvalue. The sought-after part of this would be when, or by how much, matrix A is going to stretch before failure occurs.
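
(As a sanity check of Av = λv, here is a sketch assuming NumPy; the 2x2 matrix A is an arbitrary example, not from any particular problem:)

Code:
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])        # arbitrary example matrix
lam, vecs = np.linalg.eig(A)
v = vecs[:, 0]                    # eigenvector for the first eigenvalue
print(A @ v)                      # equals lam[0] * v: A stretches v by lam[0]
print(lam[0] * v)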


If you can let me know whether what I have written is correct, then I haven't forgotten too much and I might still be on the right track.

Thanks for your time

Kayne
 


Kayne said:
HallsofIvy,

I am studying linear algebra at uni, and since I have not done this for a long time, that is my reason for asking these questions, as I don't think the answers I gave were correct. My answers are below.


1- No. If there were eigenvectors v1 = (t, t, 0) and v2 = (s, -s, r), then using their dot product it can be seen that they are orthogonal: v1.v2 = (t, t, 0).(s, -s, r) = ts - ts + 0r = 0.
The eigenvectors corresponding to different eigenvalues are always orthogonal.
The eigenvectors corresponding to different eigenvalues of a symmetric matrix are always orthogonal, but your original question was "1- Can a square matrix exist without eigenvalues? Does there exist a square matrix without an eigenvector corresponding to each of its eigenvalues?"
The characteristic equation of an n by n matrix is an nth-degree polynomial equation. Such an equation always has roots over the complex numbers but may not over the real numbers. For example, if we are thinking of

\begin{bmatrix}0 & 1 \\ -1 & 0\end{bmatrix}

as "over the real numbers", then the roots of the characteristic equation \lambda^2 + 1 = 0, namely i and -i, would not count as eigenvalues. If we tried to find eigenvectors corresponding to them, they would have complex components.

Kayne said:
2- Diagonalisation of a matrix is when an nxn matrix possesses a set of n linearly independent eigenvectors.
The wording seems awkward. A matrix is diagonalizable if and only if that is true.

Kayne said:
The matrix P whose n columns are linearly independent eigenvectors of an nxn matrix A will diagonalise A. All symmetric matrices are diagonalisable; however, not all matrices are. This can help in solving/simplifying difficult DE problems like x'' + 3x' + 2x = 0 for x(t), or a set of DEs from a numerical model of heat or chemical dispersal like

y1' = -y1 + y2
y2' = y1 - 2y2 + y3
y3' = y2 - 2y3 + y4
etc.

To diagonalise the above, you would solve the matrix
[-1,  1,  0, 0, 0]
[ 1, -2,  1, 0, 0]
[ 0,  1, -2, 1, 0]
Another nice thing about diagonal matrices is that it is easy to find a power: if D is diagonal, then D^n is just the diagonal matrix whose entries are the nth powers of the diagonal entries of D, and if A is diagonalizable as A = PDP^{-1}, then A^n = PD^nP^{-1}. That makes it easy to apply any "analytic" function to a diagonalizable matrix. An analytic function is any function whose Taylor series exists and converges to that function. For example, e^x = \sum_{n=0}^\infty x^n/n!, so we can use that to define e^A for a matrix A. In the case of a diagonal matrix, that is just the matrix having e^x along the diagonal, where x runs over the diagonal entries.

Kayne said:
3- An eigenvalue is going to be the stretching factor of matrix A; this is because Av = λv, where λ is the eigenvalue. The sought-after part of this would be when, or by how much, matrix A is going to stretch before failure occurs.
Pretty good. And the corresponding eigenvector is the direction of stretch.

Kayne said:
If you can let me know whether what I have written is correct, then I haven't forgotten too much and I might still be on the right track.

Thanks for your time

Kayne
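
P.S. A small numerical sketch of the two points above (assuming NumPy and SciPy are available; the first matrix is the one from the characteristic-equation example, while B is an arbitrary symmetric matrix chosen only for illustration):

Code:
import numpy as np
from scipy.linalg import expm        # SciPy assumed available, used only as a cross-check

# The matrix above: its characteristic equation is lambda^2 + 1 = 0,
# so over the reals it has no eigenvalues; over the complexes they are i and -i.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
print(np.linalg.eigvals(A))          # [0.+1.j  0.-1.j]

# For a diagonalizable B = P D P^{-1}, e^B = P e^D P^{-1},
# where e^D just has e^d along the diagonal for each diagonal entry d of D.
B = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # symmetric, hence diagonalizable
w, P = np.linalg.eig(B)
eB = P @ np.diag(np.exp(w)) @ np.linalg.inv(P)
print(np.allclose(eB, expm(B)))      # True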
 


HallsofIvy,

Thanks for your information. I will read through it more thoroughly this arvo. If I have any more questions I will post them in the next few days.

Thanks for your time,

Kayne
 


HallsofIvy,

Another one on Matrices.

The question: let W be the set of all symmetric 3x3 matrices. Determine whether or not W is a subspace of the vector space of all 3x3 matrices.


The definition I read states that a nonempty subset W of vectors of a vector space V is called a subspace of V when it is closed under addition and scalar multiplication, that is:

1. If U and V are in W, then so is U + V.
2. If U is in W and c is any number (scalar), then so is cU.
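
(For instance, here is a quick numerical illustration of those two closure properties for symmetric matrices, a sketch assuming NumPy; it only checks particular matrices, not the general statement:)

Code:
import numpy as np

def is_symmetric(M):
    return np.allclose(M, M.T)

# two arbitrary symmetric 3x3 matrices and an arbitrary scalar
U = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]])
V = np.array([[0.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])
c = -2.5

print(is_symmetric(U + V))   # closure under addition
print(is_symmetric(c * U))   # closure under scalar multiplication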

So does this mean that the question is true only for symmetric matrices?

Thanks, Kayne
 


The "question" was "Is the set of all 3 by 3 symmetric matices a subspace of the space of all 3 by 3 matrices". I don't know what you mean by "true only for symmetric matrices". You are correct about the requirements for a subspace. If you can show they are both true for 3 by 3 symmetric matrices then the and answer to the question is "yes". If they are not both true, the answer to the question is "no". Have you done that?
 


Hi there! It is about a quantum wire suspended between two leads. One-dimensional electron systems (e.g. quantum wires or carbon nanotubes) exhibit different behaviour: they form a Tomonaga-Luttinger liquid with bosonic excitations. Electron transport in one-dimensional systems takes place through tunneling events treated in second quantisation. Nonequilibrium transport is exhibited in quantum ratchet systems, which generate a tunneling current in the presence of dissipation. An irradiated quantum wire attached to leads is a concrete experimental device of a quantum ratchet when electrons interact with external gates through Rashba spin-orbit interaction. It then generates a spin current where the time-averaged current density depends on the amplitude of the irradiation.

The eigenstates of the ratchet Hamiltonian are delocalised Bloch states or Bloch spinors. For periodic boundary conditions, the position operator in the Bloch basis yields, with periodicity, a discrete set of Bloch states. It is then possible to introduce a tight-binding model with nearest-neighbour couplings using the discretised Bloch basis or discrete variable representation (DVR). However, for a quantum wire of finite length, open boundary conditions are considered, which do not yield eigenvalues like the ones for periodic boundary conditions.

My problem with this topic is:
If they are completely different, then the eigenstates do not form a basis valid for a tight-binding description. On the other hand, one can assume that the behaviour of the eigenvalues is similar for a large length near the middle, and that it deviates only at the ends. I would like to verify this numerically, and I am looking for people to help me with the problem.
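
As a possible starting point for that numerical check, here is a minimal sketch (assuming NumPy; the single-band nearest-neighbour chain, the hopping t and the site number N are illustrative choices only, not the actual ratchet Hamiltonian described above):

Code:
import numpy as np

N = 200                      # number of sites, illustrative
t = 1.0                      # nearest-neighbour hopping, illustrative

# open boundary conditions: plain tridiagonal hopping matrix
H_open = -t * (np.eye(N, k=1) + np.eye(N, k=-1))

# periodic boundary conditions: add the corner terms closing the ring
H_per = H_open.copy()
H_per[0, -1] = H_per[-1, 0] = -t

e_open = np.linalg.eigvalsh(H_open)   # ascending eigenvalues
e_per = np.linalg.eigvalsh(H_per)

# compare the sorted spectra; for this simple chain they approach each other as N grows
print(np.max(np.abs(e_open - e_per)))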
 


I cannot imagine what you think this has to do with the questions previously asked in this thread. Please do not "hijack" other people's threads to ask a separate question. Post your question in its own thread.
 