- #1
tom08
Homework Statement
I encounter a strange problem.
Let A = [ 1.0000   0        0
          0        0        0
          0        0        0.4472
          0        0.3162   0
          0        0.9487   0
          0        0        0.8944 ]
I am surprised to find that A'*A = I, but A*A' ≠ I. Can anyone give me an explanation?
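A quick numerical check makes the asymmetry concrete. Below is a small sketch in Python/NumPy (my choice of language here; the matrix in the post looks like MATLAB output), using the values exactly as posted:

```python
import numpy as np

# The 6x3 matrix from the original post (values as posted,
# so only accurate to about 4 decimal places).
A = np.array([
    [1.0000, 0.0,    0.0   ],
    [0.0,    0.0,    0.0   ],
    [0.0,    0.0,    0.4472],
    [0.0,    0.3162, 0.0   ],
    [0.0,    0.9487, 0.0   ],
    [0.0,    0.0,    0.8944],
])

AtA = A.T @ A   # 3x3: pairwise dot products of the COLUMNS of A
AAt = A @ A.T   # 6x6: pairwise dot products of the ROWS of A

print(np.allclose(AtA, np.eye(3), atol=1e-3))  # True: the columns are orthonormal
print(np.allclose(AAt, np.eye(6), atol=1e-3))  # False: AAt has rank at most 3
```

Note that A'*A and A*A' are not even the same size (3×3 versus 6×6), so there is no reason to expect them to behave the same way.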
tom08 said: Thanks for your kind reply, but must A and B be square for the following equation to hold?
(A'B)'=B'*A
I think that if A'*A = I (whether A is square or not), we can take the transpose of both sides of the equation and obtain
(A'*A)' = I'
then
A*A' = I
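Note that transposing reverses the order of a product: (X*Y)' = Y'*X'. So (A'*A)' = A'*(A')' = A'*A, not A*A', and transposing A'*A = I just gives back A'*A = I. A small NumPy sketch (random matrices, chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))  # arbitrary tall matrices, purely for illustration
B = rng.standard_normal((6, 3))

# Transposition reverses the order of a product: (A'B)' = B'A.
# This holds whenever the product is defined; no squareness is needed.
print(np.allclose((A.T @ B).T, B.T @ A))  # True

# In particular (A'A)' = A'A, which is 3x3, while A A' is 6x6 --
# transposing A'A can never produce A A'.
print((A.T @ A).shape, (A @ A.T).shape)  # (3, 3) (6, 6)
```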
When A' is multiplied by A, the (i, j) entry of the result is the dot product of column i of A with column j of A. Here the three columns of A each have unit length and are mutually orthogonal, i.e. they are orthonormal, so every diagonal entry of A'*A is 1 and every off-diagonal entry is 0. Therefore A'*A is the 3×3 identity.
When A is multiplied by A', the result instead contains the dot products of the rows of A. It is a symmetric 6×6 matrix, but it cannot be the 6×6 identity, because rank(A*A') = rank(A) ≤ 3 < 6. The slip in the transpose argument above is that (A'*A)' = A'*(A')' = A'*A, not A*A' (transposition reverses the order of a product), so transposing A'*A = I only recovers A'*A = I and says nothing about A*A'. What A*A' actually is here is the orthogonal projection onto the column space of A.
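For the matrix in this thread one can check directly that A*A' behaves like a projection rather than the identity (a Python/NumPy sketch, values as posted):

```python
import numpy as np

A = np.array([
    [1.0000, 0.0,    0.0   ],
    [0.0,    0.0,    0.0   ],
    [0.0,    0.0,    0.4472],
    [0.0,    0.3162, 0.0   ],
    [0.0,    0.9487, 0.0   ],
    [0.0,    0.0,    0.8944],
])

P = A @ A.T  # 6x6 and symmetric

# P is (numerically) idempotent: P @ P == P, the hallmark of a projection.
print(np.allclose(P @ P, P, atol=1e-3))    # True

# Its rank and trace are 3, not 6, so it cannot be the 6x6 identity.
print(np.linalg.matrix_rank(P, tol=1e-3))  # 3
print(round(np.trace(P), 2))               # 3.0
```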
No, A'*A = I exactly when the columns of A are orthonormal; A does not have to be square. (Note "orthonormal", not just "orthogonal": the columns must also have unit length.) If the columns of A are not orthonormal, then A'*A will not equal I.
Yes, A*A' can differ from I even when A'*A = I. A*A' equals I only when A is square with orthonormal columns, i.e. when A is an orthogonal matrix. For a tall matrix like the one above, A*A' is a projection: its trace equals the number of columns (here 3, not 6), so not all of its diagonal entries can be 1, and it cannot equal the 6×6 identity.
A matrix satisfying both A'*A = I and A*A' = I is called an orthogonal matrix; this forces A to be square, with columns (and rows) forming an orthonormal set. A tall matrix with orthonormal columns, like the one in this thread, satisfies only A'*A = I and is sometimes called semi-orthogonal. Orthogonal and semi-orthogonal matrices have many important applications in mathematics, physics, and engineering, including in linear algebra (e.g. the QR factorization and the SVD), geometry, and signal processing.