Is the identity matrix basis dependent?

SUMMARY

The discussion clarifies that only the matrix ##\begin{pmatrix}1&0\\0&1\end{pmatrix}## is an identity matrix, while ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}## is a rotation matrix. The identity matrix represents the identity linear transformation across all bases, while the second matrix's representation changes with a change of basis. The conversation emphasizes the importance of understanding linear transformations and the role of basis in matrix representation.

PREREQUISITES
  • Understanding of linear transformations in vector spaces
  • Familiarity with 2x2 matrices and their properties
  • Knowledge of basis change and its effects on matrix representation
  • Concept of isomorphism in algebraic structures
NEXT STEPS
  • Study the properties of linear transformations in vector spaces
  • Learn about matrix representations of linear transformations
  • Explore the concept of basis change and its implications in linear algebra
  • Investigate the role of isomorphism in algebraic structures, particularly in rings
USEFUL FOR

Mathematicians, physics students, and anyone interested in linear algebra, particularly those studying transformations and matrix theory.

zonde
Gold Member
To me this seems a basic, even obvious, question, but as I am not a mathematician I would rather check.
Is it true that these two matrices are both identity matrices: ##\begin{pmatrix}1&0\\0&1\end{pmatrix}## and ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}##? It's just that one of them would become diagonal under the right change of basis, and the diagonal one would no longer be diagonal under that same change of basis, right?
 
No. Only the first is an identity matrix. The second is a rotation matrix.
 
As soon as you write a matrix and interpret it as a linear function, you already have fixed a basis! Two, to be exact.
 
zonde said:
To me this seems a basic, even obvious, question, but as I am not a mathematician I would rather check.
Is it true that these two matrices are both identity matrices: ##\begin{pmatrix}1&0\\0&1\end{pmatrix}## and ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}##? It's just that one of them would become diagonal under the right change of basis, and the diagonal one would no longer be diagonal under that same change of basis, right?

Your question is based on a fundamental misunderstanding. The set of 2x2 matrices is a well-defined ring. It has a multiplicative identity element, which is:

##\begin{pmatrix}1&0\\0&1\end{pmatrix} ##

Now, the set of linear transformations of ##\mathbb{R}^2## is also a ring, and it is isomorphic to the ring of 2x2 matrices. Given any basis for ##\mathbb{R}^2##, every linear transformation maps to a specific 2x2 matrix; the mapping depends on the basis. But, as with all ring isomorphisms, the identity linear transformation must map to the identity matrix. In other words, the identity linear transformation is represented by the identity matrix in every basis.
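A minimal numerical sketch of this point, using numpy (the change-of-basis matrix ##C## here is an arbitrary invertible choice for illustration): conjugating the identity by any invertible matrix returns the identity, while the rotation matrix's representation changes.

```python
import numpy as np

I = np.eye(2)                                   # the 2x2 identity matrix
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],  # the rotation matrix from the question
              [np.sin(theta),  np.cos(theta)]])

C = np.array([[2.0, 1.0],                       # an arbitrary invertible change-of-basis matrix
              [1.0, 1.0]])
Cinv = np.linalg.inv(C)

# The identity transformation is represented by I in *every* basis...
assert np.allclose(Cinv @ I @ C, I)
# ...while the rotation's matrix representation depends on the basis.
assert not np.allclose(Cinv @ R @ C, R)
```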
 
Thanks Orodruin, fresh_42 and PeroK for your answers.
I suppose my misunderstanding comes from thinking about matrices as a kind of set of vectors. For now I will try to absorb what you said.
 
In order for an arbitrary matrix A to be an identity matrix we must have the condition that for all vectors x

Ax = xA = x.

The first is clearly the 2x2 identity matrix; the second isn't, since ##Ax \neq x##.
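A quick check of the ##Ax = x## condition with numpy, applying both matrices to column vectors (the test vector ##x## is an arbitrary choice):

```python
import numpy as np

s = 1 / np.sqrt(2)
I = np.array([[1.0, 0.0], [0.0, 1.0]])  # the identity matrix
R = np.array([[s, -s], [s, s]])         # the 45-degree rotation matrix

x = np.array([1.0, 2.0])                # an arbitrary test vector
assert np.allclose(I @ x, x)            # Ix = x holds for every x
assert not np.allclose(R @ x, x)        # Rx != x, so R is not an identity
```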
 
Your equations do not make any sense whatsoever if x is not a square matrix of the same dimension as A.
 
Orodruin said:
Your equations do not make any sense whatsoever if x is not a square matrix of the same dimension as A.

True, I should have specified that x must be of the correct dimension so that multiplication by A is possible; x would have to be a square 2x2 matrix for the commutativity shown above to apply. I.e., if A is an identity matrix then

Ax = xA = x iff dimensions of A= dimensions of x.

Thanks for adding the clarification regarding the dimensions of x.
 
"... the diagonal one would not be diagonal any more with that change of basis, right?"

Interesting question raised here.

As you may know: say that some vector space ##V## with ##\dim(V) = n## has a specified basis ##A = \{a_1, \dots, a_n\}##, and suppose we have a linear transformation

##L : V \to V##

that is represented by a square ##n \times n## matrix ##M## acting on column vectors placed to its right, so that the ##k##th column of ##M##, which is the transpose of ##(m_{k1}, \dots, m_{kn})##, represents the vector

##L(a_k) = m_{k1} a_1 + \dots + m_{kn} a_n.##

(Note that it is necessary to specify these things before we can make sense of questions about matrices, linear transformations, and change-of-basis matrices.)

Now suppose ##B = \{b_1, \dots, b_n\}## is another basis for ##V##, and that we would now like to re-express the same linear transformation ##L## in terms of this new basis.

Question 1: How do we do this? Answer: First we need to write each basis vector of the new basis ##B## as a linear combination of the basis vectors of the old basis ##A##. There is always some way to do this. That will give ##n## linear coefficients of the ##A## vectors for each one of the ##n## ##B## vectors, and these need to be arranged in a matrix: call it ##C_{AB}##. (Sub-question: exactly how should these ##n^2## numbers be arranged?) Now let ##C_{AB}^{-1}## be the inverse of the matrix ##C_{AB}##.

Then with respect to the new basis ##B##, the linear transformation ##L## can be expressed as the product of the three matrices

##C_{AB}^{-1} M C_{AB}##

in that order. Now, if the matrix ##C_{AB}## is arranged correctly, we can apply this product to a column vector denoting an element ##v## of ##V## expressed with respect to the new basis ##B##, and the result will be the column vector for ##L(v)##, also expressed in terms of the new basis ##B##.
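The recipe above can be sketched numerically with numpy. This uses one common convention for the sub-question (the ##k##th column of the change-of-basis matrix holds the coordinates of ##b_k## in the old basis); the specific bases and map here are hypothetical choices for illustration:

```python
import numpy as np

# Old basis A = standard basis of R^2; new basis B given by its old-basis coordinates.
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
C = np.column_stack([b1, b2])   # k-th column = b_k written in the old basis
Cinv = np.linalg.inv(C)

M = np.array([[2.0, 0.0],       # some linear map L, expressed in the old basis
              [0.0, 3.0]])

M_new = Cinv @ M @ C            # L expressed in the new basis B

# Consistency check: take v with new-basis coordinates (1, 0), i.e. v = b1.
v_new = np.array([1.0, 0.0])
Lv_old = M @ (C @ v_new)        # apply L in old-basis coordinates
# Converting M_new's output back to the old basis must give the same vector:
assert np.allclose(C @ (M_new @ v_new), Lv_old)
```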

Question 2: Suppose ##D## is any diagonal ##n \times n## matrix such that there exists some ##n \times n## invertible matrix ##C## with the property that

##C^{-1} D C##

is not diagonal. What can be said about the matrix ##D##?

(Perhaps it's easier to think of the complementary question: suppose ##D## has the property that for every invertible ##n \times n## matrix ##C##,

##C^{-1} D C##

is again a diagonal matrix. For which diagonal matrices ##D## is this true?)
 
zonde said:
To me it seems basic question or even obvious but as I am not mathematician I would rather like to check.
Is it true that these two matrices are both identity matrices: ##\begin{pmatrix}1&0\\0&1\end{pmatrix} ## and ##\begin{pmatrix}\frac{1}{\sqrt2}&-\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}&\frac{1}{\sqrt2}\end{pmatrix}##? It's just that one of them would become diagonal with the right change of basis and the diagonal one would not be diagonal any more with that change of basis, right?
If you think of the matrix as two row vectors written out in terms of a basis, then rewriting the rows of the second matrix in terms of the two vectors in that matrix would give you the identity back again. The first matrix, though, would no longer be the identity matrix.

If you think of a matrix as describing a linear transformation, then the identity transformation will be represented by the identity matrix in every basis: this is because the identity is the identity on every vector.
 
lavinia said:
If you think of the matrix as two row vectors written out in terms of a basis, then rewriting the rows of the second matrix in terms of the two vectors in that matrix would give you the identity back again. The first matrix, though, would no longer be the identity matrix.

If you think of a matrix as describing a linear transformation, then the identity transformation will be represented by the identity matrix in every basis: this is because the identity is the identity on every vector.
Thanks for your answer. I get the idea that, taking these matrices as transformation matrices, there is only one identity matrix. I have a feeling that in the context where I came across them, they were not exactly transformation matrices (they are density matrices in quantum mechanics). Rather, they describe averages of squares of eigenvectors over a set of complex vectors. Well, something like that.
 
Consider a change of basis applied to the identity matrix ##I## (in whatever basis or representation you are using), producing a matrix ##I'## via an invertible matrix ##B##. Then:

##I' = BIB^{-1} = IBB^{-1} = I##

I hope I did not misunderstand your question.
 
