Similarity transformation, basis change and orthogonality

  • #1
Azad Koshur
I have a transformation ##T## represented by an orthogonal matrix ##A##, so ##A^TA=I##. This transformation leaves the norm unchanged.

I do a basis change using a matrix ##B## which isn't orthogonal; then the representation of the transformation changes to ##B^{-1}AB## in the new basis (a similarity transformation).

Since we have only changed our representation of the transformation ##T##, the matrix ##B^{-1}AB## should also leave the norm unchanged, which means that ##B^{-1}AB## should be orthogonal.

Therefore ##B^{-1}AB\,[B^{-1}AB]^T=I##.

This suggests that ##B^TB=I##, which means ##B## is orthogonal, but that is a contradiction.

Can anyone tell me what I did wrong?
Thank you.
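A quick numerical check of the question (a Python/NumPy sketch with an arbitrary rotation ##A## and a deliberately non-orthogonal ##B##, both made up purely for illustration) shows that ##B^{-1}AB## is in general not orthogonal in the usual ##M^TM=I## sense, so the starting equation above need not hold:

```python
import numpy as np

# A: a 2D rotation by 30 degrees, so A^T A = I (orthogonal).
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# B: an arbitrary invertible but non-orthogonal change-of-basis matrix.
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

M = np.linalg.inv(B) @ A @ B   # the similar matrix B^{-1} A B

print(np.allclose(A.T @ A, np.eye(2)))   # True:  A is orthogonal
print(np.allclose(M.T @ M, np.eye(2)))   # False: B^{-1} A B is not
```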
 

Answers and Replies

  • #2
martinbn
Science Advisor
I have a transformation ##T## represented by an orthogonal matrix ##A##, so ##A^TA=I##. This transformation leaves the norm unchanged.

I do a basis change using a matrix ##B## which isn't orthogonal; then the representation of the transformation changes to ##B^{-1}AB## in the new basis (a similarity transformation).

Since we have only changed our representation of the transformation ##T##, the matrix ##B^{-1}AB## should also leave the norm unchanged, which means that ##B^{-1}AB## should be orthogonal.

Therefore ##B^{-1}AB\,[B^{-1}AB]^T=I##.

This suggests that ##B^TB=I##, which means ##B## is orthogonal, but that is a contradiction.

Can anyone tell me what I did wrong?
Thank you.
You say it suggests ##B^TB=I##, but does it imply it?
 
  • #3
Azad Koshur
You say it suggests ##B^TB=I##, but does it imply it?
##B^TB=I## is a solution but I'm not sure it's the only one.
 
  • #4
hutchphd
Science Advisor
Homework Helper
2022 Award
For instance let ##\mathbb A=\mathbb 1##. Then there are no additional requirements upon ##\mathbb B## and your supposition is manifestly incorrect.
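A minimal check of this counterexample (a sketch, with an arbitrary non-orthogonal ##B## picked only for illustration): with ##A = I## the similar matrix ##B^{-1}AB = I## satisfies the orthogonality equation even though ##B^TB \neq I##, so the equation cannot force ##B## to be orthogonal.

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [0.0, 3.0]])      # invertible, clearly not orthogonal
A = np.eye(2)                   # the identity is orthogonal

M = np.linalg.inv(B) @ A @ B    # equals the identity for any invertible B

print(np.allclose(M @ M.T, np.eye(2)))   # True:  the equation holds...
print(np.allclose(B.T @ B, np.eye(2)))   # False: ...yet B is not orthogonal
```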
 
  • #5
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
##B^TB=I## is a solution but I'm not sure it's the only one.
Let's look at your equation:
$$B^{-1}AB(B^{-1}AB)^T = I \ \Leftrightarrow \ AB(B^TA^T(B^{-1})^T) = B$$
$$\Leftrightarrow \ ABB^TA^T(B^T)^{-1} = B \ \Leftrightarrow \ ABB^TA^T = BB^T$$
And we can see that if ##A## is orthogonal, then this equation holds whenever ##BB^T## is invariant under the transformation ##X \ \rightarrow \ AXA^T##.

So, perhaps your assumption that ##B^{-1}AB## is orthogonal is false? Did you try to prove it?
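An illustration of that last equivalence (a sketch with made-up matrices): if ##BB^T## happens to be invariant under ##X \rightarrow AXA^T## (for example, ##B## a scaled rotation, so ##BB^T## is a multiple of the identity), then ##B^{-1}AB## comes out orthogonal even though ##B## itself is not; for a generic ##B## it does not.

```python
import numpy as np

def rot(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

A = rot(np.pi / 6)              # orthogonal

def is_orth(M):
    return np.allclose(M.T @ M, np.eye(len(M)))

# Case 1: B = 2 * (a rotation).  B is not orthogonal (its columns have norm 2),
# but BB^T = 4I is invariant under X -> A X A^T, so B^{-1}AB is orthogonal.
B1 = 2.0 * rot(np.pi / 5)
print(is_orth(B1), is_orth(np.linalg.inv(B1) @ A @ B1))   # False, True

# Case 2: a generic shear.  BB^T is not invariant, and B^{-1}AB is not orthogonal.
B2 = np.array([[1.0, 2.0],
               [0.0, 1.0]])
print(is_orth(B2), is_orth(np.linalg.inv(B2) @ A @ B2))   # False, False
```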
 
  • #6
Azad Koshur
Let's look at your equation:
$$B^{-1}AB(B^{-1}AB)^T = I \ \Leftrightarrow \ AB(B^TA^T(B^{-1})^T) = B$$
$$\Leftrightarrow \ ABB^TA^T(B^T)^{-1} = B \ \Leftrightarrow \ ABB^TA^T = BB^T$$
And we can see that if ##A## is orthogonal, then this equation holds whenever ##BB^T## is invariant under the transformation ##X \ \rightarrow \ AXA^T##.

So, perhaps your assumption that ##B^{-1}AB## is orthogonal is false? Did you try to prove it?
Wikipedia says:
"In linear algebra, two n-by-n matrices A and B are called similar if there exists an invertible n-by-n matrix P such that

##B=P^{-1}AP.##

Similar matrices represent the same linear map under two (possibly) different bases, with P being the change of basis matrix."
In our case ##A## was the transformation and ##B^{-1}AB## was the transformation in another basis. Since the underlying transformation preserves the norm, ##B^{-1}AB## has to preserve it as well. But all norm-preserving matrices are orthogonal, so ##B^{-1}AB## has to be orthogonal as well.
Where does this argument go wrong?

Link: https://en.m.wikipedia.org/wiki/Matrix_similarity
 
  • #7
Azad Koshur
For instance let ##\mathbb A=\mathbb 1##. Then there are no additional requirements upon ##\mathbb B## and your supposition is manifestly incorrect.
But I don't see where the argument goes wrong.
 
  • #8
hutchphd
Science Advisor
Homework Helper
2022 Award
You have assumed the matrix ##B## is not orthogonal, but then you only show that it can be orthogonal. That is not the same as showing it must be orthogonal, and so it does not give you a contradiction.
 
  • #9
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Wikipedia says:
"In linear algebra, two n-by-n matrices A and B are called similar if there exists an invertible n-by-n matrix P such that

##B=P^{-1}AP.##

Similar matrices represent the same linear map under two (possibly) different bases, with P being the change of basis matrix."
In our case ##A## was the transformation and ##B^{-1}AB## was the transformation in another basis. Since the underlying transformation preserves the norm, ##B^{-1}AB## has to preserve it as well. But all norm-preserving matrices are orthogonal, so ##B^{-1}AB## has to be orthogonal as well.
Where does this argument go wrong?

Link: https://en.m.wikipedia.org/wiki/Matrix_similarity
An orthogonal transformation may not be represented by an orthogonal matrix in a general basis, although it always is in an orthonormal basis. Your transformation may be norm-preserving, but its matrix representation in a general basis may not be orthogonal.
 
  • #11
wrobel
Science Advisor
Insights Author
Since we have only changed our representation of the transformation ##T##, the matrix ##B^{-1}AB## should also leave the norm unchanged
Yes, but how is this norm represented in the new basis?

In general, if ##G## is the matrix of the inner product, then orthogonality of a matrix ##A## means that ##A^TGA=G##.
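A numerical sketch of this (with an arbitrary rotation ##A## and a non-orthogonal ##B##, chosen only for illustration): in the new basis the inner product has Gram matrix ##G = B^TB##, and the transformed matrix ##M = B^{-1}AB## satisfies ##M^TGM = G## even though ##M^TM \neq I##.

```python
import numpy as np

theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal: A^T A = I

B = np.array([[1.0, 2.0],
              [0.0, 1.0]])                        # non-orthogonal basis change

G = B.T @ B                    # Gram matrix of the inner product in the new basis
M = np.linalg.inv(B) @ A @ B   # the transformation written in the new basis

print(np.allclose(M.T @ G @ M, G))        # True:  M preserves the inner product
print(np.allclose(M.T @ M, np.eye(2)))    # False: M is not orthogonal w.r.t. I
```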
 
  • #12
Azad Koshur
An orthogonal transformation may not be represented by an orthogonal matrix in a general basis, although it always is in an orthonormal basis. Your transformation may be norm-preserving, but its matrix representation in a general basis may not be orthogonal.
Yes, I get it now.
 
  • #13
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
 
  • #14
Azad Koshur
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
I think you would convert them to orthonormal basis coordinates and then calculate the dot product in the usual way.
 
  • #15
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
I think you would convert them to orthonormal basis coordinates and then calculate the dot product in the usual way.

That's right. So when you say a matrix is orthogonal, you are making a claim that its rows and columns are orthonormal (notice that multiplying the matrix by its transpose computes the dot products of those rows and columns). But if you have the matrix represented in the new basis, you would first have to transform the matrix back to the old basis before computing all those dot products to decide whether it is orthogonal.
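A sketch of that recipe (again with matrices made up only for illustration): take the matrix ##M## given in the new basis, map it back to the old orthonormal basis as ##BMB^{-1}##, and only then test it with the ordinary dot product.

```python
import numpy as np

theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # the map in the orthonormal basis
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])                        # non-orthogonal change of basis

M = np.linalg.inv(B) @ A @ B     # the same map, written in the new basis

# Testing M directly with ordinary dot products is misleading...
print(np.allclose(M.T @ M, np.eye(2)))            # False

# ...so first transform it back to the old basis, then test.
M_old = B @ M @ np.linalg.inv(B)                  # recovers A
print(np.allclose(M_old.T @ M_old, np.eye(2)))    # True
```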
 
  • #16
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
Using the inner product of the new basis vectors, of course!
 
  • #17
Kashmir
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
Here ##x_i## is the coefficient of the new basis vector ##e'_i##?
 
  • #18
Kashmir
Using the inner product of the new basis vectors, of course!
We can't use ##x^T x## as the definition of the norm here, I think.
 
  • #19
Azad Koshur
Yes, but how is this norm represented in the new basis?

In general, if ##G## is the matrix of the inner product, then orthogonality of a matrix ##A## means that ##A^TGA=G##.
Yes, I was thinking the norm would still be computed the same way I did it in the old basis. In this new basis I would convert the vectors back to the standard basis by a matrix ##P##; then the norm is just ##x^Tx## (considering real entries only).
 
  • #20
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
Here ##x_i## is the coefficient of the new basis vector ##e'_i##?

That's correct.

We can't use ##x^T x## as the definition of the norm here, I think.

This is also right. You have to transform the coordinates to a basis where the dot product is valid.
 
  • #21
WWGD
Science Advisor
Gold Member
As I understand it, orthogonal matrices do not just preserve the norm, they preserve the inner product. Edit: meaning ##\langle x,y\rangle = \langle Tx, Ty\rangle## for ##T## orthogonal and ##\langle\cdot,\cdot\rangle## an inner product. Note that angles between vectors are also preserved.
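A quick check of that (a sketch with an arbitrary rotation and two arbitrary vectors, all chosen for illustration): the dot product, and hence the angle, is unchanged by an orthogonal ##T##.

```python
import numpy as np

theta = 0.7
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])

print(np.isclose(x @ y, (T @ x) @ (T @ y)))       # True: <x,y> = <Tx,Ty>

angle = lambda u, v: np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
print(np.isclose(angle(x, y), angle(T @ x, T @ y)))   # True: angles preserved
```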
 
