# Similarity transformation, basis change and orthogonality

In summary: an orthogonal transformation is represented by an orthogonal matrix only in an orthonormal basis. In an arbitrary basis the matrix representation may fail to be orthogonal, so although the transformation is norm-preserving, the argument below breaks down.
I have a transformation ##T## represented by an orthogonal matrix ##A##, so ##A^TA=I##. This transformation leaves the norm unchanged.

I do a change of basis using a matrix ##B## which isn't orthogonal; the transformation then takes the form ##B^{-1}AB## in the new basis (a similarity transformation).

Since we only changed our representation of the transformation ##T##, the matrix ##B^{-1}AB## should also leave the norm unchanged, which means that ##B^{-1}AB## should be orthogonal.

Therefore ##(B^{-1}AB)(B^{-1}AB)^T=I##.

This suggests that ##B^TB=I##, which would mean ##B## is orthogonal, and that is a contradiction.

Can anyone tell me what I did wrong?
Thank you.

You say it suggests ##B^TB=I##, but does it imply it?

martinbn said:
You say it suggests ##B^TB=I##, but does it imply it?
##B^TB=I## is a solution but I'm not sure it's the only one.

For instance let ##\mathbb A=\mathbb 1##. Then there are no additional requirements upon ##\mathbb B## and your supposition is manifestly incorrect.

Kashmir said:
##B^TB=I## is a solution but I'm not sure it's the only one.
$$B^{-1}AB(B^{-1}AB)^T = I \ \Leftrightarrow \ AB(B^TA^T(B^{-1})^T) = B$$
$$\Leftrightarrow \ ABB^TA^T(B^T)^{-1} = B \ \Leftrightarrow \ ABB^TA^T = BB^T$$
And we can see that if ##A## is orthogonal, then this equation holds whenever ##BB^T## is invariant under the transformation ##X \ \rightarrow \ AXA^T##.

So, perhaps your assumption that ##B^{-1}AB## is orthogonal is false? Did you try to prove it?
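PeroK's chain of equivalences can be checked numerically. A quick numpy sketch (the rotation angle and the matrices below are arbitrary examples, not taken from the thread):

```python
import numpy as np

# A: a 2x2 rotation, hence orthogonal (A^T A = I)
t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

def is_orthogonal(M, tol=1e-10):
    """Check M M^T = I."""
    return np.allclose(M @ M.T, np.eye(len(M)), atol=tol)

# Generic non-orthogonal B: both sides of the equivalence are false together
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])
M = np.linalg.inv(B) @ A @ B
lhs = is_orthogonal(M)                          # (B^-1 A B)(B^-1 A B)^T = I ?
rhs = np.allclose(A @ B @ B.T @ A.T, B @ B.T)   # A B B^T A^T = B B^T ?
print(lhs, rhs)   # False False

# B with B B^T invariant under X -> A X A^T (here B B^T = 4I): both true
B2 = 2.0 * np.eye(2)
M2 = np.linalg.inv(B2) @ A @ B2
print(is_orthogonal(M2), np.allclose(A @ B2 @ B2.T @ A.T, B2 @ B2.T))   # True True
```

In both cases the two conditions agree, which is exactly the equivalence derived above.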

PeroK said:
So, perhaps your assumption that ##B^{-1}AB## is orthogonal is false? Did you try to prove it?
Wikipedia says:
"In linear algebra, two n-by-n matrices A and B are called similar if there exists an invertible n-by-n matrix P such that

##B=P^{-1}AP.##

Similar matrices represent the same linear map under two (possibly) different bases, with P being the change of basis matrix."
In our case ##A## was the transformation and ##B^{-1}AB## was the same transformation in another basis. Since the underlying transformation preserves the norm, ##B^{-1}AB## has to preserve it as well. But all norm-preserving matrices are orthogonal, so ##B^{-1}AB## has to be orthogonal too.
Where does this argument go wrong?

hutchphd said:
For instance let ##\mathbb A=\mathbb 1##. Then there are no additional requirements upon ##\mathbb B## and your supposition is manifestly incorrect.
But I don't see where the argument went wrong?

You have assumed the matrix ##B## to not be orthogonal, but then shown that it can be orthogonal. That is not the same as showing it must be orthogonal, so it does not give a contradiction.

Kashmir said:
Where does this argument go wrong?

An orthogonal transformation may not be represented by an orthogonal matrix in a general basis - although always in an orthonormal basis. Your transformation may be norm-preserving, but the matrix representation in a general basis may not be orthogonal.
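A concrete numerical illustration of this point (numpy; the rotation and the shear ##B## are arbitrary choices):

```python
import numpy as np

# A rotation: orthogonal in the standard, orthonormal basis
t = np.pi / 6
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# A non-orthogonal change-of-basis matrix (a shear)
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The same map in the new basis
M = np.linalg.inv(B) @ A @ B
orthogonal = np.allclose(M.T @ M, np.eye(2))
print(orthogonal)     # False: the representing matrix is not orthogonal

# Yet the underlying map still preserves norms: take new-basis coordinates
# xp, convert to standard coordinates with B, and compare lengths
xp = np.array([3.0, -2.0])
v_before = B @ xp          # the actual vector
v_after  = B @ (M @ xp)    # equals A @ v_before
print(np.isclose(np.linalg.norm(v_before), np.linalg.norm(v_after)))   # True
```

So the map is norm-preserving even though its matrix in the new basis is not orthogonal.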

Kashmir said:
Since we only changed our representation of the transformation, the transformation should also leave the norm unchanged.
Yes, but how is this norm expressed in this new basis?

In general, if ##G## is the matrix of the inner product, then orthogonality of a matrix ##A## means that ##A^TGA=G##.
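In coordinates attached to a basis with change-of-basis matrix ##B##, the inner product has Gram matrix ##G=B^TB##, and the representing matrix ##M=B^{-1}AB## satisfies ##M^TGM=G## rather than ##M^TM=I##. A numpy check (the matrices are arbitrary examples):

```python
import numpy as np

# Orthogonal A (a rotation) and a non-orthogonal change of basis B
t = 0.5
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])

G = B.T @ B                      # Gram matrix: inner product of new coords is x^T G y
M = np.linalg.inv(B) @ A @ B     # the map in the new basis

g_orthogonal   = np.allclose(M.T @ G @ M, G)        # orthogonal with respect to G
std_orthogonal = np.allclose(M.T @ M, np.eye(2))    # orthogonal with respect to I
print(g_orthogonal, std_orthogonal)   # True False
```

The matrix is "orthogonal with respect to ##G##" even though it fails the ordinary test ##M^TM=I##.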

PeroK said:
An orthogonal transformation may not be represented by an orthogonal matrix in a general basis - although always in an orthonormal basis. Your transformation may be norm-preserving, but the matrix representation in a general basis may not be orthogonal.
Yes got it now.

I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?

Office_Shredder said:
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
I think we represent those in orthonormal basis coordinates and then calculate the norm in the usual way.

Kashmir said:
I think we represent those in orthonormal basis coordinates and then calculate the norm in the usual way.

That's right. So when you say a matrix is orthogonal, you are making a claim about its rows and columns being orthonormal (the entries of ##MM^T## are dot products of pairs of rows, and the entries of ##M^TM## are dot products of pairs of columns). But if you have the matrix represented in the new basis, you would first have to transform it back to the old basis before computing all those dot products to decide whether it's orthogonal.
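The pitfall in numbers (numpy; the basis matrix is an arbitrary example): the coordinate tuples can look orthogonal while the vectors themselves are not.

```python
import numpy as np

# Columns of B: a non-orthogonal basis
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Coordinates of two vectors in that basis
xp = np.array([1.0, 0.0])
yp = np.array([0.0, 1.0])

naive = xp @ yp          # dot product of the coordinate tuples
print(naive)             # 0.0 -- looks orthogonal

# Transform back to the standard basis first, then take the dot product
x = B @ xp               # (1, 0)
y = B @ yp               # (1, 2)
true_dot = x @ y
print(true_dot)          # 1.0 -- the vectors are not orthogonal
```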

Office_Shredder said:
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
Using the inner product of the new basis vectors, of course!

Office_Shredder said:
I think the really interesting question here is, given two vectors ##x=(x_1,...,x_n)## and ##y=(y_1,...,y_n)## in the new basis representation, how do you decide if they are orthogonal?
Here ##x_i## is the coefficient of the new basis vector ##e'_i##?

PeroK said:
Using the inner product of the new basis vectors, of course!
We can't use ##x^Tx## as the definition of the norm here, I think.

wrobel said:
Yes, but how is this norm expressed in this new basis?

In general, if ##G## is the matrix of the inner product, then orthogonality of a matrix ##A## means that ##A^TGA=G##.
Yes, I was thinking the norm will still be given the same way as in the old basis. In this new basis I would convert the vectors back to the standard basis by a matrix ##P##, and then the norm will be just ##x^Tx## (considering real entries only).

Kashmir said:
Here ##x_i## is the coefficient of the new basis vector ##e'_i##?

That's correct.

Kashmir said:
We can't use ##x^T x## as the definition of norm here I think.

This is also right. You have to transform the coordinates back to an orthonormal basis, where the usual dot product computes the inner product.

As I understand it, orthogonal matrices do not just preserve the norm, they preserve the inner product: ##\langle x,y\rangle=\langle Tx,Ty\rangle## for ##T## orthogonal and ##\langle\cdot,\cdot\rangle## an inner product. Note that angles between vectors are also preserved.
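A quick numerical check of this (numpy; the angle and the vectors are arbitrary):

```python
import numpy as np

# An orthogonal T (a rotation) preserves the inner product, hence norms and angles
t = 1.1
T = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])

same_ip = np.isclose(x @ y, (T @ x) @ (T @ y))
print(same_ip)   # True: <x, y> = <Tx, Ty>
```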


## 1. What is a similarity transformation?

In linear algebra, a similarity transformation replaces a square matrix ##A## by ##P^{-1}AP##, where ##P## is an invertible matrix. The two matrices represent the same linear map in different bases, so everything intrinsic to the map, such as eigenvalues, determinant, trace and rank, is unchanged.
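As a sketch of this invariance (numpy; the matrices are arbitrary examples), similar matrices share their eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])          # any invertible change-of-basis matrix
S = np.linalg.inv(P) @ A @ P        # a matrix similar to A

ev_A = np.sort(np.linalg.eigvals(A))
ev_S = np.sort(np.linalg.eigvals(S))
print(np.allclose(ev_A, ev_S))      # True: same spectrum
```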

## 2. How is a basis change different from a similarity transformation?

A change of basis is what a similarity transformation encodes: if ##P## is the change-of-basis matrix, a linear map represented by ##A## in one basis is represented by ##P^{-1}AP## in the other. Every similarity transformation can be read as a change of basis, and conversely; the matrix ##P## may itself describe a rotation, a scaling, a shear, or any other invertible map.

## 3. What is the significance of orthogonality in similarity transformations?

Orthogonality matters because a norm-preserving (orthogonal) transformation is represented by an orthogonal matrix only in an orthonormal basis. If the change-of-basis matrix ##P## is itself orthogonal, then ##P^{-1}AP=P^TAP## and orthogonality of the representation is preserved; for a general invertible ##P## it is not, which is exactly the resolution of the question in this thread.

## 4. How are similarity transformations used in real-world applications?

Similarity transformations are used in a variety of fields, including computer graphics, physics, and engineering. In computer graphics, they are used to rotate, scale, and translate objects on a screen. In physics, they are used to transform vectors and matrices to simplify calculations. In engineering, they are used to analyze and solve problems involving transformations of objects or systems.

## 5. Can similarity transformations be applied to non-linear objects?

Similarity transformations, in the matrix sense used here, apply to linear maps: a non-linear map is not represented by a single matrix, so it cannot be transformed as ##P^{-1}AP##. Describing how the representation of a non-linear object changes generally requires richer tools, such as affine transformations or fully non-linear changes of variables.
