How can I analyse and classify a matrix?

In summary, the conversation discussed the properties of a matrix, arising from an ODE, that yields complex eigenvalues and eigenvectors and is therefore not Hermitian. The stability and instability properties indicated by the eigenvalues were discussed, along with the idea of treating the two operators as x^2 and x and solving the resulting quadratic equation, which did not provide much information. The conversation also covered examining the dynamic modes, separability, and condition number of the matrix; since the matrix is not normal, the ratio of eigenvalue magnitudes does not give the condition number, but the singular values obtained from an SVD do.
  • #1
SeM
Hi, I have the matrix of an ODE which yields complex eigenvalues and eigenvectors. It is therefore not Hermitian. How can I further analyse the properties of the matrix in a Hilbert space?
The idea is to reveal the stability and instability properties of the matrix. ##D_2## and ##D_1## are the second- and first-order derivatives respectively, and a and b are real numbers.
I thought of treating the two operators as ##x^2## and ##x## and solving the resulting quadratic equation; however, this does not really give much more information.
 
  • #2
The stability properties are indicated by the position of the eigenvalues in the complex plane. For a discrete iteration, a modulus greater than 1 indicates exponential growth along the associated eigenvector; for a continuous-time ODE, it is a positive real part that indicates growth. If an eigenvalue is complex, its argument indicates a rotation and a cyclic behavior.
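For a concrete check, here is a minimal NumPy sketch (the small numerical matrix A is a hypothetical stand-in for a discretised operator, not the matrix from the thread):

```python
import numpy as np

# Hypothetical numerical system matrix; substitute your own discretised operator.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])

eigvals, eigvecs = np.linalg.eig(A)

for lam in eigvals:
    # Continuous-time ODE x' = A x: the mode grows if Re(lambda) > 0.
    ode_mode = "growing" if lam.real > 0 else "decaying/neutral"
    # Discrete iteration x_{k+1} = A x_k: the mode grows if |lambda| > 1.
    iter_mode = "growing" if abs(lam) > 1 else "non-growing"
    # A nonzero imaginary part indicates rotation / cyclic behavior.
    osc = "oscillatory" if abs(lam.imag) > 1e-12 else "non-oscillatory"
    print(f"lambda = {lam:.3f}: ODE mode {ode_mode}, iteration mode {iter_mode}, {osc}")
```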
 
  • #3
Thanks. Does this imply that there is no further classification I can actually do, given that the eigenvalues are complex and the matrix is neither Hermitian, unitary, normal, nor skew-Hermitian?
 
  • #4
It sounds like you are looking for a name to describe your matrix. I'm not aware of any others to try.

If you want to describe the nature of the matrix, you should describe it in terms of the individual eigenvalues and associated eigenvectors. You can describe the dynamic modes of the system. You could examine whether the matrix is separable (see https://en.wikipedia.org/wiki/Singular-value_decomposition ). You can also check whether the matrix is ill-conditioned. If it were normal, the condition number would be the ratio of the moduli of the largest and smallest eigenvalues. But since it is not normal, I am not sure how to determine the condition number. (See https://en.wikipedia.org/wiki/Condition_number )
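As an illustration of the SVD-based checks mentioned above, a minimal NumPy sketch might look like this (the matrix A is a hypothetical stand-in, not the operator from the thread):

```python
import numpy as np

# Hypothetical non-normal matrix standing in for the discretised operator.
A = np.array([[1.0, 100.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A)   # singular values s are sorted, largest first

# Rough separability check: how many singular values are significantly nonzero?
numerical_rank = int(np.sum(s > 1e-12 * s[0]))

# 2-norm condition number from the singular values (works for non-normal matrices).
cond = s[0] / s[-1]

# Normality check: A is normal iff A A* = A* A.
is_normal = np.allclose(A @ A.conj().T, A.conj().T @ A)

print("singular values:", s)
print("numerical rank :", numerical_rank)
print("normal?        :", is_normal)
print("cond = sigma_max/sigma_min:", cond, " np.linalg.cond:", np.linalg.cond(A))
```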
 
  • #5
Thanks FactChecker, this was very clear. I have tried to derive the eigenvalues, but the matrix contains two numbers and two operators, like:

$$\begin{bmatrix} D_4 & D_1 \\ 5i & 6 \end{bmatrix}$$

where ##D_4## is the fourth derivative and ##D_1## is the first derivative, I end up with a secular equation whose roots contain the operators rather than numbers. Is this the regular procedure for solving a secular equation with operators and numbers in the elements, and if so, how can one use this result to say something about the matrix? I have only worked with numbers in the elements before, so I am not sure here. Sorry!

In this case, parts of the solution of the secular equation look like:

##D_4(x_1 x_2)##, i.e. the fourth derivative of the product of the two eigenvector coordinates. Does one treat that as ##D_4 x^2##, which is 0?
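A minimal sympy sketch of this purely formal calculation (treating ##D_4## and ##D_1## as commuting symbols, which is an assumption, since they are really differential operators acting on functions) would be:

```python
import sympy as sp

# Formal calculation only: the operators are treated as commuting symbols,
# as in the "x^2 and x" idea earlier in the thread.
D4, D1, lam = sp.symbols('D_4 D_1 lambda')

M = sp.Matrix([[D4, D1],
               [5*sp.I, 6]])

# Secular (characteristic) equation: det(M - lambda*I) = 0.
char_poly = sp.expand((M - lam * sp.eye(2)).det())
roots = sp.solve(sp.Eq(char_poly, 0), lam)

print("characteristic polynomial:", char_poly)
for r in roots:
    print("root:", sp.simplify(r))
```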

Thanks
 
  • #6
FactChecker said:
...(see https://en.wikipedia.org/wiki/Singular-value_decomposition ). You can also check whether the matrix is ill-conditioned. If it were normal, the condition number would be the ratio of the moduli of the largest and smallest eigenvalues. But since it is not normal, I am not sure how to determine the condition number. (See https://en.wikipedia.org/wiki/Condition_number )

You mentioned SVD -- the connection just needs to be made here. As long as we're using an L2/Euclidean norm for measuring variations, use the singular values to measure the condition number. In the special case of a normal matrix, the singular values are exactly the magnitudes of the eigenvalues. In general, for square matrices, the largest singular value is always ##\geq## the magnitude of the largest eigenvalue, and the smallest singular value is always ##\leq## the magnitude of the smallest eigenvalue. (Why?)
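A minimal NumPy sketch of these relations on a hypothetical non-normal matrix:

```python
import numpy as np

# Hypothetical non-normal matrix; the inequalities below hold for any square matrix.
A = np.array([[1.0, 10.0],
              [0.0, 2.0]])

sing = np.linalg.svd(A, compute_uv=False)                # sorted, largest first
eig_mags = np.sort(np.abs(np.linalg.eigvals(A)))[::-1]   # sorted, largest first

print("singular values:", sing)
print("|eigenvalues|  :", eig_mags)
print("sigma_max >= |lambda|_max:", sing[0] >= eig_mags[0])
print("sigma_min <= |lambda|_min:", sing[-1] <= eig_mags[-1])
print("2-norm condition number  :", sing[0] / sing[-1])
```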
 

Related to How can I analyse and classify a matrix?

1. How do I determine the dimensions of a matrix?

To determine the dimensions of a matrix, you need to count the number of rows and columns. The number of rows is the first number and the number of columns is the second number in the matrix notation. For example, a matrix with 3 rows and 4 columns would be represented as a 3x4 matrix.
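For example, with NumPy (illustrative sketch):

```python
import numpy as np

A = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [9, 10, 11, 12]])

rows, cols = A.shape
print(f"{rows}x{cols} matrix")  # 3x4: 3 rows, 4 columns
```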

2. What are the different types of matrices?

There are several types of matrices, including square matrices, rectangular matrices, symmetric matrices, identity matrices, and diagonal matrices. Square matrices have an equal number of rows and columns, rectangular matrices have different numbers of rows and columns, symmetric matrices are equal to their own transpose (the entries are mirrored across the main diagonal), identity matrices have 1s on the main diagonal and 0s everywhere else, and diagonal matrices have non-zero values only on the main diagonal.
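An illustrative NumPy sketch that checks some of these properties (the classify helper is hypothetical, not a library function):

```python
import numpy as np

def classify(A, tol=1e-12):
    """Return a list of simple structural labels for a matrix (illustrative only)."""
    m, n = A.shape
    labels = ["square" if m == n else "rectangular"]
    if m == n:
        if np.allclose(A, A.T, atol=tol):
            labels.append("symmetric")
        if np.allclose(A, np.diag(np.diag(A)), atol=tol):
            labels.append("diagonal")
        if np.allclose(A, np.eye(m), atol=tol):
            labels.append("identity")
    return labels

print(classify(np.eye(3)))                        # ['square', 'symmetric', 'diagonal', 'identity']
print(classify(np.array([[2., 1.], [1., 3.]])))   # ['square', 'symmetric']
```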

3. How can I perform basic operations on matrices?

To perform basic operations on matrices, such as addition, subtraction, and multiplication, you need to make sure that the matrices have compatible dimensions. For example, to add two matrices, they must have the same number of rows and columns. To multiply two matrices, the number of columns in the first matrix must match the number of rows in the second matrix.
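A small NumPy sketch of these compatibility rules (illustrative only):

```python
import numpy as np

A = np.ones((2, 3))   # 2x3
B = np.ones((2, 3))   # same shape as A, so A + B is allowed
C = np.ones((3, 4))   # 3 rows match A's 3 columns, so A @ C is allowed

print((A + B).shape)  # (2, 3): addition is elementwise, shapes must match
print((A @ C).shape)  # (2, 4): inner dimensions (3 and 3) must match
# A @ B would raise a ValueError: the inner dimensions 3 and 2 do not match.
```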

4. How do I determine the type of matrix through analysis?

To determine the type of matrix, you can analyze its properties, such as its dimensions, symmetry, and values. For example, a square matrix with 1s on the main diagonal and 0s everywhere else is an identity matrix, and a matrix with non-zero values only on the main diagonal is a diagonal matrix. By examining these properties, you can determine the type of matrix.

5. What is the importance of matrix analysis and classification?

Matrix analysis and classification are important in various fields of science and mathematics, including statistics, physics, computer science, and engineering. Matrices are used to represent and solve systems of equations, perform transformations, and analyze data. By understanding the properties and types of matrices, scientists can effectively use them in their research and calculations.
