Coordinate transformation matrix?

Summary
The discussion focuses on understanding the definition of an orthogonal transformation matrix, specifically the relationship A(i,j)A(k,j)=q(i,k), where q is the Kronecker delta. It explains that this relationship indicates that the product of the matrix with its transpose results in the identity matrix, confirming that the inverse of an orthogonal matrix is equal to its transpose. Participants clarify the notation used in matrix multiplication and encourage working through examples to grasp the concepts better. The conversation highlights the importance of understanding matrix entries and their relationships in transformations. Overall, it emphasizes the foundational principles of orthogonal matrices and their properties.
Will_C
Can anyone tell me:
1) How should I understand the definition of an orthogonal transformation matrix?
Definition: A(i,j)A(k,j) = q(i,k), where q is the Kronecker delta.
2) Why is the inverse of this orthogonal matrix equal to its transpose?

Will.
 
In short, if we want to find the (p,q)'th entry of a product of matrices, then

(AB)_{p,q}= \sum_r A_{p,r}B_{r,q}:=A_{p,r}B_{r,q}

the convention being that whenever we see a repeated index, we sum over it.

What does A(i,j)A(k,j) mean? Well, the (k,j)'th entry of A is the (j,k)'th entry of A transpose, so what you've written is the same as (AA^t)(i,k), and it states that

"the ik'th entry of AA^t is 1 if i=k, and zero otherwise"#

which is exactly what it means to be the identity matrix.

Thus 1 and 2 are exactly the same thing.
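
To make this concrete, here is a quick NumPy check (just an illustration; the rotation matrix and the angle are arbitrary choices of mine, not from the posts above) that a rotation matrix satisfies the definition and that its transpose is its inverse:

import numpy as np

# A 2x2 rotation matrix is a standard example of an orthogonal matrix.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The definition A_{ij} A_{kj} = delta_{ik} sums over the repeated index j,
# which is exactly the (i,k) entry of A A^T.
product = A @ A.T
print(np.allclose(product, np.eye(2)))     # True: A A^T is the identity

# Consequently the transpose is the inverse.
print(np.allclose(A.T, np.linalg.inv(A)))  # True: A^T equals A^{-1}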
 
Excuse me, matt grime,
I don't quite understand what you mentioned above.
Would you mind making it simpler or explaining it more?
BTW, I don't know how to input math symbols (such as the summation sign, subscripts...) in the thread.

Thx,
Will.
 
Let's try and see where the problem is:

Have you met the notation that

A_{i,j}

is the entry in row i column j of a matrix?

Click on the maths to see how to typeset it.

Did you try to work through some small examples, such as 2x2 matrices, to see how this notation does indeed show how they multiply together?

If A and B are 2x2 matrices then, as we all know,

(AB)_{1,1} = A_{1,1}B_{1,1} + A_{1,2}B_{2,1}

which is exactly what I wrote with the summation sign. You've done summation signs, right?


The entry in row i, column j of A^t is the same as the entry in row j, column i of A.

Do you see that?
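
If it helps, here is a small NumPy sketch of the 2x2 case (again just an illustration with arbitrarily chosen matrices; note that NumPy indexes from 0, so the (1,1) entry in the formula above is [0, 0] in code):

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# (AB)_{1,1} = A_{1,1} B_{1,1} + A_{1,2} B_{2,1}
entry = A[0, 0] * B[0, 0] + A[0, 1] * B[1, 0]
print(entry, (A @ B)[0, 0])   # both give 19.0

# The entry in row i, column j of A^T is the entry in row j, column i of A.
print(A.T[0, 1], A[1, 0])     # both give 3.0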
 
