Proving Det(tA)=Det(A) for a Unitary Matrix


Homework Help Overview

The discussion revolves around proving that the determinant of a unitary matrix is either +1 or -1, specifically focusing on real unitary matrices and their properties in relation to bilinear forms and transposes.

Discussion Character

  • Conceptual clarification, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants explore the relationship between the determinant of a unitary matrix and its transpose, questioning the assumptions regarding the nature of the transpose in different contexts.
  • Some participants discuss the implications of the determinant being equal to ±1 versus its absolute value being equal to 1.
  • There are inquiries about the validity of using the standard transpose in various inner product spaces and whether the results hold for complex unitary matrices.

Discussion Status

The conversation includes attempts to clarify the properties of real unitary matrices and their determinants, with some participants providing proofs and others offering conceptual insights. There is an ongoing exploration of the implications of different types of bilinear forms on the transpose operation.

Contextual Notes

Participants note the distinction between real and complex unitary matrices, with specific emphasis on the nature of the transpose and the assumptions made regarding bilinear forms. There is also mention of the potential limitations of the discussion based on the definitions of the inner products used.

Enjoicube

Homework Statement


Prove that the determinant of a unitary matrix is +/-1

Homework Equations


<Av,Aw>=<v,w>
Det(AB)=Det(A)*Det(B)

The Attempt at a Solution


Alright, I am aware that <Av,Aw>=<v,w> implies A(tA)=I and (tA)A=I, so Det(A(tA))=Det(I)=1 and thus
Det(A)*Det(tA)=1. This is where I am stuck. I know that in some cases Det(tA)=Det(A), but that is not obvious here, because I cannot take tA to be the elementary transpose obtained by switching rows and columns. How can I prove that Det(tA)=Det(A) ONLY from the fact that <Aw,v>=<w,(tA)v>? This inner product can be absolutely any symmetric bilinear form, so do not assume I mean the dot product.
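(Aside, not part of the original post: a quick numerical sanity check of the statement in the real case. This is only an illustration with numpy, not a proof.)

```python
# Numerical sanity check (not a proof): for a real matrix Q with Q^T Q = I,
# det(Q)^2 = 1, so det(Q) should come out as +1 or -1.
import numpy as np

rng = np.random.default_rng(0)
# A random orthogonal matrix from the QR factorization of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

print(np.allclose(Q.T @ Q, np.eye(5)))  # True: Q^T Q = I
print(np.linalg.det(Q))                 # approximately +1 or -1
```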
 


I guess it should be |det(A)|=1, rather than det(A)=+/-1.
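(Aside, not from the thread: in the complex case the same determinant argument only pins down the modulus, because taking the adjoint conjugates the determinant. A short sketch:)

```latex
% Complex case: U^\dagger U = I only forces |\det U| = 1, not \det U = \pm 1.
\[
1 = \det(U^{\dagger}U) = \overline{\det U}\,\det U = |\det U|^{2}
\quad\Longrightarrow\quad |\det U| = 1 .
\]
% Example: U = \operatorname{diag}(1, i) is unitary with \det U = i, which is not \pm 1.
```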
 


If you are trying to prove that the determinant of a unitary matrix is +/-1 then you MUST mean REAL unitary. It's not true for complex unitary. That means your tA is really just a plain old transpose.
 


Even for some random inner product? I think this is what I don't understand: why must it be the normal transpose for a completely arbitrary product? It only seems true for R^n with the standard dot product. For example, take the space to be Pn(x) and <,> to be the integral of f*g.
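(Aside, not from the thread: a small numerical sketch of exactly this worry. On P2 with <f,g> = integral of f*g over [0,1] and the non-orthonormal monomial basis {1, x, x^2}, the matrix of an operator's adjoint is B^-1 * tA * B, where B is the Gram matrix, and this generally differs from the plain transpose tA. The differentiation operator is used here purely as an example.)

```python
# Sketch: on P_2 with <f, g> = integral_0^1 f*g dx and the monomial basis
# {1, x, x^2}, the adjoint of an operator has matrix B^-1 A^T B (B = Gram
# matrix), which generally differs from the plain transpose A^T.
import numpy as np

# Gram matrix of {1, x, x^2} on [0, 1]: B[i, j] = integral of x^(i+j) dx = 1/(i+j+1).
B = np.array([[1.0 / (i + j + 1) for j in range(3)] for i in range(3)])

# Matrix of d/dx in the monomial basis: d/dx(1) = 0, d/dx(x) = 1, d/dx(x^2) = 2x.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])

# Adjoint with respect to <.,.>: from c_f^T A^T B c_g = c_f^T B (B^-1 A^T B) c_g.
A_adj = np.linalg.solve(B, A.T @ B)

print(np.allclose(A_adj, A.T))  # False: the adjoint is not the plain transpose here
```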
 


Thank you for your responses. I think I have proved that for real unitary matrices, the adjoint tU really is just the plain transpose. Here is my proof, if anyone can check it or is interested.

Lemma: If U is real unitary (i.e., it preserves the nondegenerate bilinear form B), then U commutes with the matrix representation of B.
Pf: Let U be unitary and let B be a nondegenerate bilinear form. Then

B(Uw,Uv) = t(Uw)*B'*(Uv) for the matrix representation B' of B,

so B(Uw,Uv) = (tw*tU)*B'*(Uv) = (tw)*(tU*B'*U)*(v).

On the other hand, B(Uw,Uv) = B(w,v) = (tw)*B'*(v) for all v and w, thus tU*B'*U = B'.

However, since U is unitary, tU = U^-1, so (U^-1)*B'*U = B', and multiplying both sides on the left by U gives B'*U = U*B'. Thus U commutes with B'.

So if B is a symmetric bilinear form, tU really does equal the plain transpose.

Pf: Consider B(Uw,v) = B(v,Uw) = (tv)*B'*(Uw) = (tv)*(B'U)*w = (tv)*(UB')*w = t(tU*v)*B'*w = B((tU)v,w) = B(w,(tU)v), so tU is the adjoint of U with respect to B.

Sorry about the horrible notation due to my not using LaTeX. Again, thank you, Dick, for pointing me toward this proof.
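(Aside: since the notation above is hard to read, here is one possible LaTeX rendering of the same argument; same content, just typeset.)

```latex
% Lemma: if $U^{\top} = U^{-1}$ and $U$ preserves the nondegenerate bilinear form $B$
% with matrix $B'$, then $U$ commutes with $B'$.
\[
B(Uw,Uv) = (Uw)^{\top} B' (Uv) = w^{\top}\bigl(U^{\top} B' U\bigr) v
\qquad\text{and}\qquad
B(Uw,Uv) = B(w,v) = w^{\top} B' v ,
\]
so $U^{\top} B' U = B'$; using $U^{\top} = U^{-1}$ and multiplying on the left by $U$ gives $B'U = UB'$.

% If $B$ is also symmetric, the $B$-adjoint of $U$ is the ordinary transpose:
\[
B(Uw,v) = B(v,Uw) = v^{\top} B' U w = v^{\top} U B' w
        = (U^{\top} v)^{\top} B' w = B(U^{\top} v, w) = B(w, U^{\top} v).
\]
```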
 


Ok, you're welcome! Glad you convinced yourself. I would conceptualize it by picking an orthonormal basis for the form using Gram-Schmidt, so <ei,ej>=delta_ij. So the matrix of U is U_ij=<ei,Uej>. So tU_ij=<ei,tUej>=<Uei,ej>=<ej,Uei>=U_ji. Or did I get my indices backwards? Doesn't matter, you get my point, right? So in the complex case it's pretty obvious the adjoint is the conjugate transpose.
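(Aside, not from the thread: a quick numerical check of this picture, assuming the form is symmetric positive definite so that an orthonormal basis exists. Cholesky is used below in place of Gram-Schmidt; both produce a B-orthonormal basis. In that basis the matrix of the B-adjoint is the plain transpose.)

```python
# Numerical check (assumes B is symmetric positive definite): in a B-orthonormal
# basis the matrix of the B-adjoint of A equals the plain transpose of the matrix of A.
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A random symmetric positive-definite form B and a random operator A.
M = rng.standard_normal((n, n))
B = M @ M.T + n * np.eye(n)
A = rng.standard_normal((n, n))

# B-adjoint of A in the original coordinates: <Ax, y>_B = <x, A*y>_B  =>  A* = B^-1 A^T B.
A_star = np.linalg.solve(B, A.T @ B)

# A B-orthonormal basis from the Cholesky factorization B = L L^T:
# the columns of Q = L^-T satisfy Q^T B Q = I (Gram-Schmidt would work too).
L = np.linalg.cholesky(B)
Q = np.linalg.inv(L.T)

# Matrices of A and its B-adjoint in the B-orthonormal basis.
A_on = np.linalg.inv(Q) @ A @ Q
A_star_on = np.linalg.inv(Q) @ A_star @ Q

print(np.allclose(Q.T @ B @ Q, np.eye(n)))  # True: basis is B-orthonormal
print(np.allclose(A_star_on, A_on.T))       # True: the adjoint becomes the plain transpose
```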
 


Dick said:
Ok, you're welcome! Glad you convinced yourself. I would conceptualize it by picking an orthonormal basis for the form using Gram-Schmidt, so <ei,ej>=delta_ij. So the matrix of U is U_ij=<ei,Uej>. So tU_ij=<ei,tUej>=<Uei,ej>=<ej,Uei>=U_ji. Or did I get my indices backwards? Doesn't matter, you get my point, right? So in the complex case it's pretty obvious the adjoint is the conjugate transpose.

Yes, I think this revealed a lot to me. I think that I was just being very cautious, because the notation was very suggestive, and so I went into some sort of lockdown mode. I do like the way of conceptualizing this that you presented, and now I do understand that in the complex case it is conjugate transpose, so thank you much.
 
