Understanding Eigenvalues in Matrix Construction from Differential Equations

In summary, constructing a 2 X 2 matrix from a second-order differential equation involves turning it into a system of first-order linear equations and finding its eigenvalues. Swapping the columns of the matrix yields two different characteristic polynomials and thus two different sets of eigenvalues. To determine which variable should correspond to which column, the rule of thumb is to let the first column correspond to the variable in the original equation and the second column correspond to its derivative. To place the column-swap matrix C into the equation Ax = y without losing validity, one can right-multiply A by C and then right-multiply the result by C's inverse, regrouping to get (AC)(Cx) = y. This works because C*C = I, the 2 X 2 identity matrix, and AI = A.
  • #1
hotcommodity
Let's say that I have to construct a 2 X 2 matrix from a second-order differential equation by turning it into a system of first-order linear equations, and then find its eigenvalues. I'll have two variables that correspond to the two columns of the matrix.

If I swap columns, I end up with two different characteristic polynomials, and thus different sets of eigenvalues (this is the problem). How do I know which variable should correspond to the first column, and which variable should correspond to the second column?
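
For concreteness, here's the kind of construction I mean (the coefficients p and q are just placeholders): starting from

[tex]y'' + p\,y' + q\,y = 0[/tex]

and letting v = dy/dt, the system is y' = v and v' = -q y - p v, or in matrix form

[tex]
\frac{d}{dt}
\left(
\begin{array}{c}
y \\
v
\end{array}
\right)
=
\left(
\begin{array}{cc}
0 & 1 \\
-q & -p
\end{array}
\right)
\left(
\begin{array}{c}
y \\
v
\end{array}
\right)
[/tex]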
 
  • #2
If you switch the columns, you switch the order of the variables.
 
  • #3
You spent a course learning linear algebra, so use it. :wink:

Swapping columns is a matrix operation: if you started with the equation

[tex]A x = y[/tex]

and you want to replace A with AC, where C is the column swap elementary matrix... i.e.

[tex]
C = \left(
\begin{array}{cc}
0 & 1 \\
1 & 0
\end{array}
\right)
[/tex]

Can you think of any way to modify the equation Ax=y (without changing its validity!) to make a C appear somewhere in it? (Preferably just to the right of the A)
 
  • #4
Thanks for the replies :)

Hurkyl said:
You spent a course learning linear algebra, so use it. :wink:

Swapping columns is a matrix operation: if you started with the equation

[tex]A x = y[/tex]

and you want to replace A with AC, where C is the column swap elementary matrix... i.e.

[tex]
C = \left(
\begin{array}{cc}
0 & 1 \\
1 & 0
\end{array}
\right)
[/tex]

Can you think of any way to modify the equation Ax=y (without changing its validity!) to make a C appear somewhere in it? (Preferably just to the right of the A)

My math professor helped me out this morning, and I found out that if I have a second-order differential equation for a function y(t), and I let dy/dt = v, then the "rule of thumb" is to let the first column of the coefficient matrix A correspond to the variable y and the second column correspond to its derivative v (this way I know which coefficients in A to subtract [tex]\lambda[/tex] from). But now I want to know what you're talking about! So I'll take a shot at your question :)
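
Using the placeholder example from my first post with that ordering, subtracting [tex]\lambda[/tex] down the diagonal gives

[tex]
\det
\left(
\begin{array}{cc}
-\lambda & 1 \\
-q & -p-\lambda
\end{array}
\right)
= \lambda^2 + p\lambda + q = 0,
[/tex]

which is exactly the characteristic equation of the original second-order equation.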

I believe your question is, how can I place the column swap matrix C into the equation Ax = y, without the equation losing its validity. To do this, I believe I would multiply both sides of Ax = y by C, such that C appears between A and x, and to the left of y. So I would have ACx= Cy. I worked this out on paper and I see that this does indeed swap the columns in A and the rows in y, which wouldn't change my answer, so it works out. Please let me know if I'm mistaken.
 
  • #5
hotcommodity said:
I believe your question is, how can I place the column swap matrix C into the equation Ax = y, without the equation losing its validity. To do this, I believe I would multiply both sides of Ax = y by C, such that C appears between A and x, and to the left of y. So I would have ACx= Cy. I worked this out on paper and I see that this does indeed swap the columns in A and the rows in y, which wouldn't change my answer, so it works out. Please let me know if I'm mistaken.
That doesn't quite work -- remember that the operations you have available are "left multiply" and "right multiply"... so you can't just insert it anywhere you want.

The method I was hinting at is this idea: If I want to right-multiply A by C, I can undo that by right-multiplying by C-inverse. In this case, the inverse of C is itself, so I have the following derivation:

Ax = y
A (CC) x = y
(AC) (Cx) = y

The procedure you used shouldn't work in general: if Ax=y, then usually ACx=Cy will be false. In fact, that implies:

ACx = Cy
ACx = C(Ax)
ACx = CAx
(AC - CA) x = 0

so it will only work when the solution for x happens to be a nullvector of (AC - CA).
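
If you want to check this numerically, here is a throwaway Python/numpy sketch (the matrix and vector are arbitrary, not taken from your problem):

[code]
import numpy as np

# An arbitrary 2x2 system A x = y (numbers chosen for illustration only).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, 5.0])
y = A @ x

# Right-multiplying by C swaps the columns of A.
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Valid rewrite: insert CC = I between A and x, then regroup.
print(np.allclose((A @ C) @ (C @ x), y))   # True: (AC)(Cx) = y

# The attempted rewrite ACx = Cy fails for this choice of A and x...
print(np.allclose(A @ C @ x, C @ y))       # False in general

# ...because this x is not a nullvector of (AC - CA).
print((A @ C - C @ A) @ x)                 # nonzero vector
[/code]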
 
  • #6
Hurkyl said:
Ax = y
A (CC) x = y
(AC) (Cx) = y

I see, so this works out because C*C = I, where I would be the 2 X 2 identity matrix, and AI = A.
 
  • #7
hotcommodity said:
I see, so this works out because C*C = I, where I would be the 2 X 2 identity matrix, and AI = A.
Right. And if you did a more complicated column operation to A, you can hopefully work out what happens to x to neutralize it, as in the example below.
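
For example (a made-up operation, not one from this thread): if you multiplied the second column of A by a nonzero constant k, i.e. replaced A with AD where

[tex]
D = \left(
\begin{array}{cc}
1 & 0 \\
0 & k
\end{array}
\right),
[/tex]

then since Ax = AD(D<sup>-1</sup>x), the change is neutralized by dividing the second entry of x by k.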
 

What are eigenvalues?

Eigenvalues are the scalars by which a matrix stretches its eigenvectors: if multiplying a matrix by a nonzero vector gives back that same vector scaled by a number, that number is an eigenvalue. They are important in many areas of mathematics and science, including linear algebra, differential equations, and quantum mechanics.

Why are eigenvalues important?

Eigenvalues have many important applications, such as determining stability in differential equations, analyzing patterns in data, and solving quantum mechanical systems. They also have practical applications in fields such as engineering and economics.

How do you find eigenvalues?

To find eigenvalues, you solve the characteristic equation of the matrix: subtract the unknown eigenvalue from each entry of the main diagonal, take the determinant of the resulting matrix, and set it equal to zero. The roots of this polynomial equation are the eigenvalues of the matrix.
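
For example, for the matrix with rows (2, 1) and (1, 2), the characteristic equation is

[tex]
\det\left(
\begin{array}{cc}
2-\lambda & 1 \\
1 & 2-\lambda
\end{array}
\right)
= (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0,
[/tex]

which gives the eigenvalues [tex]\lambda = 1[/tex] and [tex]\lambda = 3[/tex].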

Can a matrix have complex eigenvalues?

Yes, a matrix can have complex eigenvalues, even when all of its entries are real; this happens, for example, for matrices that describe rotations or oscillations. Complex eigenvalues arise frequently in fields such as quantum mechanics and electrical engineering.
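
For example, the real rotation matrix with rows (0, -1) and (1, 0) has characteristic equation [tex]\lambda^2 + 1 = 0[/tex], so its eigenvalues are [tex]\lambda = i[/tex] and [tex]\lambda = -i[/tex].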

What is the relationship between eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are closely related. Each eigenvalue is the scaling factor of its corresponding eigenvector: multiplying an eigenvector by the matrix produces a new vector that is parallel to the original eigenvector, stretched by the eigenvalue. When the matrix has a full set of linearly independent eigenvectors (as a symmetric matrix always does), the eigenvectors form a basis for the space, so any vector in that space can be written as a linear combination of eigenvectors.
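
For example, the matrix with rows (2, 1) and (1, 2) used above has eigenvectors (1, 1) and (1, -1): the matrix sends (1, 1) to (3, 3) = 3(1, 1) and sends (1, -1) to itself, and these two eigenvectors form a basis for the plane.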
