Linear Algebra - Find an orthogonal matrix P

In summary, the conversation is about solving a problem from a Linear Algebra course using Mathematica. The original poster checked that the matrix A was symmetric and computed its eigenvalues from the characteristic polynomial, using Mathematica to evaluate the determinant and solve for its roots. They then computed the eigenvectors and verified them against Mathematica's built-in commands, applied the Gram-Schmidt orthogonalization process to the eigenvectors, and normalized the results. Finally, they attempted to compute the diagonal matrix but got an unexpected result because MatrixForm had been used in a definition. After removing MatrixForm, the problem was solved.
  • #1
Lelouch
A problem that I have to solve for my Linear Algebra course is the following

1.png

We are supposed to use Mathematica.

First, I checked that ##A## is symmetric, i.e. that ##A = A^T##, which is obvious by inspection.

Next, I computed the eigenvalues of ##A##. The characteristic polynomial is ##\det(A - \lambda I) = \lambda(4+\lambda)^3##. Solving ##\lambda(4+\lambda)^3 = 0## then yields the eigenvalues ##\lambda_1 = 0## and ##\lambda_2 = -4##. I used Mathematica to compute the determinant instead of doing it by hand. I also know that one could have used the Eigenvalues[] command in Mathematica; however, in order to report the step-by-step procedure in my report later, I decided not to rely on the built-in commands.
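For reference, a minimal Mathematica sketch of this step. The matrix ##A## itself appears only in the attached image, so the definition of a below is an assumption, reconstructed from the eigendata stated in this thread (it is the unique symmetric ##4 \times 4## matrix with these eigenvalues and eigenvectors: ##-3## on the diagonal, ##1## everywhere else). Lowercase names are used to avoid clashing with built-in symbols.

    a = {{-3, 1, 1, 1}, {1, -3, 1, 1}, {1, 1, -3, 1}, {1, 1, 1, -3}};  (* assumed from the stated eigendata *)
    a == Transpose[a]                              (* True: A is symmetric *)
    charPoly = Det[a - lambda IdentityMatrix[4]];
    Factor[charPoly]                               (* lambda (4 + lambda)^3 *)
    Solve[charPoly == 0, lambda]                   (* lambda -> -4 (three times) and lambda -> 0 *)
    Eigenvalues[a]                                 (* cross-check: {-4, -4, -4, 0} *)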

Next, I computed the eigenvectors for each eigenvalue, i.e. I solved ##(A - \lambda_i I)\vec x = \vec 0## for ##i = 1, 2##. I used Mathematica to solve these systems, which yields the following eigenvectors.
For ## \lambda_1 = 0 ## we have ## \vec v_1 = (1, 1, 1, 1) ##.
For ## \lambda_2 = -4 ## we have ## \vec v_2 = (-1, 1, 0, 0), \vec v_3 = (-1, 0, 1, 0), \vec v_4 = (-1, 0, 0, 1)##.
I also checked the built in command Eigenvectors[] from Mathematica and it yields the same solution.
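A sketch of this step in Mathematica, assuming the same matrix a as in the sketch above: NullSpace[] gives a basis for each eigenspace.

    NullSpace[a - 0 IdentityMatrix[4]]   (* {{1, 1, 1, 1}}: the lambda = 0 eigenvector *)
    NullSpace[a + 4 IdentityMatrix[4]]   (* a basis for the lambda = -4 eigenspace *)
    Eigenvectors[a]                      (* built-in cross-check *)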

Next, what I suppose had to be done is to apply Gram-Schmidt orthogonalization to ##\vec v_2, \vec v_3, \vec v_4##. This step I did not do by hand; instead I used the command Orthogonalize[], which gave the vectors ##\vec w_2 = (-\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}, 0, 0)##, ##\vec w_3 = (-\frac{1}{\sqrt{6}}, -\frac{1}{\sqrt{6}}, \sqrt{\frac{2}{3}}, 0)##, ##\vec w_4 = (-\frac{1}{2\sqrt{3}}, -\frac{1}{2\sqrt{3}}, -\frac{1}{2\sqrt{3}}, \frac{\sqrt{3}}{2})##.
I also normalized the vector ## \vec v_1 = (1, 1, 1, 1) ## which is then ## \vec w_1 = (\frac{1}{2}, \frac{1}{2}, \frac{1}{2}, \frac{1}{2}) ##.
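A sketch of the orthonormalization, assuming the eigenvectors computed above; Orthogonalize[] applies Gram-Schmidt and normalizes by default.

    w1 = Normalize[{1, 1, 1, 1}]   (* {1/2, 1/2, 1/2, 1/2} *)
    {w2, w3, w4} = Orthogonalize[{{-1, 1, 0, 0}, {-1, 0, 1, 0}, {-1, 0, 0, 1}}]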

Then, I formed the matrix ##P## such that
##P =
\begin{pmatrix}
\frac{1}{2} & -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{6}} & -\frac{1}{2\sqrt{3}} \\
\frac{1}{2} & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{6}} & -\frac{1}{2\sqrt{3}} \\
\frac{1}{2} & 0 & \sqrt{\frac{2}{3}} & -\frac{1}{2\sqrt{3}} \\
\frac{1}{2} & 0 & 0 & \frac{\sqrt{3}}{2}
\end{pmatrix}
##.
After that, I checked whether ##P## is an orthogonal matrix, i.e. whether ##P^T = P^{-1}##. This indeed turns out to be the case, and the Mathematica command OrthogonalMatrixQ[] also returns True.
2.png
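A sketch of this check, assuming the vectors w1, ..., w4 from the sketches above. ##P## has them as its columns, so the list of row vectors is transposed.

    p = Transpose[{w1, w2, w3, w4}];                 (* columns are the orthonormal eigenvectors *)
    Simplify[Transpose[p] . p] == IdentityMatrix[4]  (* True, i.e. P^T = P^-1 *)
    OrthogonalMatrixQ[p]                             (* True *)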


Lastly, I wanted to compute the diagonal matrix ##D## via ##P^T A P = D##. But this gives a rather strange result in Mathematica:

3.png


I am not sure what to make of this last result. I suppose I have made a mistake somewhere in the Gram-Schmidt orthogonalization process. However, I cannot seem to find the mistake, if there is one. Also note that this is my first day working with Mathematica and the Gram-Schmidt process.
 

  • #2
phyzguy
You want to use . to multiply matrices in Mathematica, not *. Try that.
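For a concrete illustration of the difference (not from the original post): * multiplies lists element by element, while . (Dot) is the matrix product.

    m = {{1, 2}, {3, 4}};
    m * m   (* elementwise: {{1, 4}, {9, 16}} *)
    m . m   (* matrix product: {{7, 10}, {15, 22}} *)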
 
  • #3
Lelouch
phyzguy said:
You want to use . to multiply matrices in Mathematica, not *. Try that.

I just did that and it gives the following result:

4.png
 

  • #4
Lelouch
Oh my... I just used the following instead of PTranspose

5.png


It seems correct, since the eigenvalues are on the main diagonal. However, why can't I define/use the name PTranspose to equal the transpose of P?
 

  • #5
Mark44
@Lelouch, in future posts, please don't delete the homework template. Its use is required.
 
  • #6
phyzguy
Lelouch said:
Oh my... I just used the following instead of PTranspose

5.png

It seems correct, since the eigenvalues are on the main diagonal. However, why can't I define/use the name PTranspose to equal the transpose of P?

The problem is that you used MatrixForm when you defined PTranspose. MatrixForm converts it to a different form which looks nice, but which isn't the same as a matrix. If you eliminate the MatrixForm in your definition of PTranspose, it will work.
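A sketch of the corrected computation, under the same assumptions as the sketches above (the exact cells from the screenshots are not reproduced here): define the transpose without MatrixForm, multiply with Dot, and apply MatrixForm only at display time. The result d is named in lowercase because D is a built-in Mathematica symbol.

    pTranspose = Transpose[p];          (* no MatrixForm in the definition *)
    d = Simplify[pTranspose . a . p];   (* use Dot (.), not * *)
    d // MatrixForm                     (* diagonal entries: 0, -4, -4, -4 *)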
 
  • #7
Lelouch
Mark44 said:
@Lelouch, in future posts, please don't delete the homework template. Its use is required.

I apologize for this. It was around 3 AM when I wrote this thread; I was tired and banging my head against a wall over this problem. I had been re-doing the theory, the examples, and the proofs in the book by hand, and was mad that I couldn't see where the mistake was. I was even more mad when I noticed that it was such a silly mistake.

phyzguy said:
The problem is that you used MatrixForm when you defined PTranspose. MatrixForm converts it to a different form which looks nice, but which isn't the same as a matrix. If you eliminate the MatrixForm in your definition of PTranspose, it will work.

Thank you. Also thanks for pointing out that I should use . instead of * for matrix multiplication.
 

1. What is an orthogonal matrix?

An orthogonal matrix is a square matrix whose columns and rows are orthonormal vectors. In other words, the dot product of any two distinct columns (or rows) is equal to 0, and the magnitude of each column (or row) is equal to 1.
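As a concrete illustration (an assumed example, not part of the original answer), this condition can be checked by forming all pairwise dot products of the columns:

    q = {{0, -1}, {1, 0}};                             (* a 90-degree rotation *)
    Table[q[[All, i]] . q[[All, j]], {i, 2}, {j, 2}]   (* {{1, 0}, {0, 1}}: columns are orthonormal *)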

2. How is an orthogonal matrix used in linear algebra?

An orthogonal matrix is used in linear algebra to perform transformations, such as rotations and reflections, that preserve the lengths of vectors and the angles between them. It is also used in solving systems of linear equations and in finding eigenvalues and eigenvectors.

3. How do you find an orthogonal matrix?

To find an orthogonal matrix, you can use the Gram-Schmidt process or the QR decomposition. The Gram-Schmidt process involves orthogonalizing a set of linearly independent vectors, while the QR decomposition involves decomposing a matrix into an orthogonal matrix and an upper triangular matrix.
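As an illustration (an assumed example, not from the original answer), Mathematica's QRDecomposition returns a pair {q, r} in which the rows of q are orthonormal and ConjugateTranspose[q].r reproduces the input:

    m = {{1, 1, 0}, {1, 0, 1}, {0, 1, 1}};
    {q, r} = QRDecomposition[m];
    Simplify[ConjugateTranspose[q] . r] == m   (* True: m = Q^T.R *)
    OrthogonalMatrixQ[q]                       (* True: q has orthonormal rows and columns *)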

4. What is the significance of an orthogonal matrix being square?

An orthogonal matrix being square means that it has an equal number of rows and columns. This is significant because the defining condition ##P^T = P^{-1}## requires an inverse, and only a square matrix can be invertible. A non-square matrix can still have orthonormal columns, but it is not called an orthogonal matrix.

5. Can you give an example of an orthogonal matrix?

One example of an orthogonal matrix is the rotation matrix in two dimensions. It has the form
##\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}##
where ##\theta## is the angle of rotation. This matrix is orthogonal because its columns and rows are orthogonal unit vectors.
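A quick symbolic check of this example in Mathematica (illustrative):

    rot[theta_] := {{Cos[theta], -Sin[theta]}, {Sin[theta], Cos[theta]}};
    Simplify[Transpose[rot[t]] . rot[t]]   (* {{1, 0}, {0, 1}} for every angle t *)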
