Proving a matrix is orthogonal


Homework Statement


Show that the matrix ##P = \big[ p_{ij} \big]## is orthogonal.

Homework Equations


##P \vec{v} = \vec{v}'## where each vector is in ##\mathbb{R}^3## and ##P## is a ##3 \times 3## matrix. So I guess ##P## is a transformation matrix taking ##\vec{v}## to ##\vec{v}'##. I also know ##\vec{v} = v_i \hat{e}_i## where ##\hat{e}_i## is the ##i##th unit vector.

The Attempt at a Solution


Orthogonal implies ##P P^t = I##. ##P P^t## can be written in component form as ##p_{ij} p_{ji}##. I believe I want to show that ##p_{ij} p_{ji} = \delta_{ij}##. After this I'm not really sure how to proceed. Any ideas?
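(As a quick numerical sanity check of what the condition ##P P^t = I## means in practice, here is a short numpy sketch using a sample rotation matrix purely as an illustration; it is not the ##P## from the problem.)

```python
import numpy as np

# Illustrative example only: a rotation about the z-axis.
# Rotation matrices are a standard example of orthogonal matrices.
theta = 0.7
P = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Orthogonality: P P^t should be the 3x3 identity (up to rounding error).
print(np.allclose(P @ P.T, np.eye(3)))       # True

# Equivalent geometric statement: P preserves lengths, |P v| = |v|.
v = np.array([1.0, 2.0, 3.0])
print(np.isclose(np.linalg.norm(P @ v), np.linalg.norm(v)))  # True
```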
 
joshmccraney said:
##p_{ij} p_{ji} = \delta_{ij}##.
Hi josh:

The quoted equation is wrong. You want something similar, but one that shows the index over which you sum when multiplying a row of ##P## by a column of ##P^t##.

joshmccraney said:
I'm not really sure how to proceed.
From your problem statement, I am guessing that you were not given a particular matrix that you had to show is orthogonal, but rather were asked to show a method that can be used to verify that any given orthogonal matrix is in fact orthogonal. If that is the case, I think your attempted solution (with the correction) is all you need.
Hope this helps,

Regards,
Buzz
 
Thanks for taking the time to reply, Buzz! But when you say

Buzz Bloom said:
Hi josh:
The quoted equation is wrong.
I don't think I wrote what you quoted; I didn't multiply ##\delta_{ij}## by ##p##. Perhaps you quoted me while I was editing? But I do agree that what I wrote was wrong.

Buzz Bloom said:
You want something similar, but one that shows the index over which you sum when multiplying a row of ##P## by a column of ##P^t##.

Ok, so to demonstrate that ##P## is orthogonal, would we have to show ##p_{ki}p_{kj} = \delta_{ij}##?
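(For reference, a sketch of the index bookkeeping behind that condition, written in the same summation convention: the ##(i,j)## entry of ##P^t P## is
$$(P^t P)_{ij} = (P^t)_{ik}\, p_{kj} = p_{ki}\, p_{kj},$$
so requiring ##P^t P = I## entry by entry is exactly ##p_{ki} p_{kj} = \delta_{ij}##, with the sum running over ##k##.)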

And yeah, come to think of it, I do think ##P## is a general matrix.
 
joshmccraney said:

Homework Statement


Show that the matrix ##P = \big[ p_{ij} \big]## is orthogonal.

Homework Equations


##P \vec{v} = \vec{v}'## where each vector is in ##\mathbb{R}^3## and ##P## is a ##3 \times 3## matrix. So I guess ##P## is a transformation matrix taking ##\vec{v}## to ##\vec{v}'##. I also know ##\vec{v} = v_i \hat{e}_i## where ##\hat{e}_i## is the ##i##th unit vector.
I'm confused as to what the actual problem is. Did you put part of the problem statement in the relevant equations? If not, the problem statement, as written, is false.
An arbitrary matrix is not orthogonal.

Also, what does this mean -- ##\vec{v} = v_i \hat{e}_i##? In the context of vectors in ##\mathbb{R}^3##, it would make more sense to write ##\vec{v} = v_1 \hat{e}_1 + v_2 \hat{e}_2 + v_3\hat{e}_3##
joshmccraney said:

The Attempt at a Solution


Orthogonal implies ##P P^t = I##. ##P P^t## can be written in component form as ##p_{ij} p_{ji}##. I believe I want to show that ##p_{ij} p_{ji} = \delta_{ij}##. After this I'm not really sure how to proceed. Any ideas?
 
Mark44 said:
I'm confused as to what is the actual problem. Did you put part of the problem statement in the relevant equations?
Yes I did, I'm sorry about that!

Mark44 said:
Also, what does this mean -- ##\vec{v} = v_i \hat{e}_i##? In the context of vectors in ##\mathbb{R}^3##, it would make more sense to write ##\vec{v} = v_1 \hat{e}_1 + v_2 \hat{e}_2 + v_3\hat{e}_3##
I was using Einstein notation, so it means exactly the sum that you wrote out at the end.
 
Mark44 said:
Also, what does this mean -- ##\vec{v} = v_i \hat{e}_i##? In the context of vectors in ##\mathbb{R}^3##, it would make more sense to write ##\vec{v} = v_1 \hat{e}_1 + v_2 \hat{e}_2 + v_3\hat{e}_3##
joshmccraney said:
I was using Einstein notation, so it means exactly the sum that you wrote out at the end.
Wouldn't the right side be shown in brackets, like this?
##[ v_i \hat{e}_i]##
This is similar to the shorthand notation ##[p_{ij}]## that you used in the OP to represent all of the entries of matrix P.

In any case, what is the exact problem statement? From what you've provided so far, I don't see how one can show that an arbitrary matrix is orthogonal.
 
Mark44 said:
I'm confused as to what is the actual problem.
This was also given, but I didn't include it because it seemed like it was of no help:

If a vector ##\vec{v}## has coordinates ##v_i## with respect to a basis ##\vec{e_i}##, the transformation rule will tell us the coordinates of the same vector ##\vec{v}## with respect to a different basis ##\vec{e_i}'##. Let ##v_i'## denote the coordinates of ##\vec{v}## with respect to ##\vec{e_i}'##. Our goal is to find the transformation rule governing ##v_i'## and ##v_i##.

Since ##\vec{e_i}## is a basis, it is possible to find a unique set of 9 numbers, ##p_{ij}##, such that ##\vec{e_i}' = p_{ij}\vec{e_j}##.
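(A sketch of how this could lead to orthogonality, under the assumption, not stated explicitly in the notes, that both bases ##\vec{e_i}## and ##\vec{e_i}'## are orthonormal: then
$$\delta_{ij} = \vec{e_i}' \cdot \vec{e_j}' = \big(p_{ik}\vec{e_k}\big) \cdot \big(p_{jl}\vec{e_l}\big) = p_{ik}\, p_{jl}\, \delta_{kl} = p_{ik}\, p_{jk},$$
which is the component form of ##P P^t = I##.)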
 
Mark44 said:
In any case, what is the exact problem statement? From what you've provided so far, I don't see how one can show that an arbitrary matrix is orthogonal.
I have posted the notes that correspond to ##P##.
 
From post #1:
joshmccraney said:
Orthogonal implies ##P P^t = I##.
This also implies that ##P^{-1} = P^t##.

Maybe I'm missing something, but I don't see anything in the problem description that would lead me to believe that the matrix is orthogonal. You have ##P\vec{v} = \vec{v}'##, but you don't show anything about ##\vec{v}'##, other than that it is a vector in ##\mathbb{R}^3##.
 
Mark44 said:
From post #1: This also implies that ##P^{-1} = P^t##.

Maybe I'm missing something, but I don't see anything in the problem description that would lead me to believe that the matrix is orthogonal. You have ##P\vec{v} = \vec{v}'##, but you don't show anything about ##\vec{v}'##, other than that it is a vector in ##\mathbb{R}^3##.
I totally agree. To me it looks as though we are given a ##3 \times 3## matrix and asked to show this property is true. I'll ask the professor about it; I just wanted to see if anyone else picked up on something I did not. Thanks for your help, Mark44!
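(A minimal numerical sketch of that reading, assuming hypothetically that both bases are orthonormal, which the notes do not state: build the coefficients ##p_{ij} = \vec{e_i}' \cdot \vec{e_j}## from two orthonormal bases and check ##P P^t = I##.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard basis: the rows of the identity are e_1, e_2, e_3.
e = np.eye(3)

# A second orthonormal basis e_i': QR of a random matrix gives Q with
# orthonormal columns, so the rows of Q.T form an orthonormal basis.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
e_prime = Q.T

# Change-of-basis coefficients p_ij = e_i' . e_j  (so that e_i' = p_ij e_j).
P = e_prime @ e.T

# If both bases are orthonormal, P should satisfy P P^t = I.
print(np.allclose(P @ P.T, np.eye(3)))   # True
```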
 