
Homework Help: Unitary Matrices and Their Entry Values Proof

  1. Sep 19, 2015 #1


    Gold Member

    1. The problem statement, all variables and given/known data

    Show that ##|A_{ij}| \le 1## for every entry ##A_{ij}## of a unitary matrix ##A##.

    2. Relevant equations

    A matrix ##A## is unitary when ##A^\dagger A = I##,
    where ##\dagger## is the Hermitian conjugate, meaning you transpose and take the complex conjugate,
    and ##I## is the identity matrix.

    3. The attempt at a solution
    I'm having a hard time getting started on this one.
    The claim seems plausible, since the entries somehow have to combine to give the identity matrix, but I'm not sure how to turn the unitarity condition into a statement about the individual entries.

    Any pointers would help.
  3. Sep 19, 2015 #2


    Science Advisor
    Homework Helper

    The index notation for an identity matrix ##I## is ##I_{ij} = \delta_{ij}##. So, for a unitary matrix ##A##, ##\sum_j A_{ij}A_{kj}^* = \delta_{ik}##. What can you conclude from this relation?
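    As a quick numeric sanity check (my addition, not part of the original exchange, using NumPy; taking the Q factor of a QR decomposition is just one convenient way to generate a unitary matrix):

    ```python
    import numpy as np

    # Build a (pseudo)random unitary matrix: the Q factor of the QR
    # decomposition of a complex matrix is unitary.
    rng = np.random.default_rng(42)
    M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    Q, _ = np.linalg.qr(M)

    # Check sum_j Q_ij * conj(Q_kj) = delta_ik, i.e. Q Q^dagger = I.
    print(np.allclose(Q @ Q.conj().T, np.eye(3)))   # True

    # And indeed every entry has modulus at most 1.
    print(bool(np.all(np.abs(Q) <= 1 + 1e-12)))     # True
    ```
    
    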
  4. Sep 19, 2015 #3
    Write the entries of the resultant matrix ##I = A^\dagger A## as ##\delta_{ij} = \sum_k \left[A^\dagger\right]_{ik}\cdot \left[A\right]_{kj}##.

    Here I used that ##\left[ I\right]_{ij} = \delta_{ij}##.
    Now you want to get rid of the hermitian conjugation, using what you said already.

    Does that help?

    Edit: Oops too slow
  5. Sep 19, 2015 #4


    Gold Member

    Well, let's see.

    So we have the notation for the identity matrix.
    We use the summation formula for the components.

    We can apply the definition of the Hermitian conjugation to the sum.

    We see that ##\sum_k A_{ki}^* A_{kj}## equals the delta element of the identity matrix.

    Since we are looking at the ij'th component, the delta element = 0:

    ##\sum_k A_{ki}^* A_{kj} = 0##
    ##A_{kj}^* = 0## ??
  6. Sep 19, 2015 #5
    You want to look at the elements ##\delta_{ii}=1##.

    The zeroes (##\delta_{ij}\text{ with } i\neq j##) are quite useless in this case.
  7. Sep 19, 2015 #6


    Gold Member

    Hm, I see.

    This reminds me of the form ##e^{-i\theta}## for representing a complex number.
    Since ##A_{kj}## is the kj'th element of the matrix, it is a number, a complex number that can be represented by ##e^{-i\theta}## in this case due to the conjugation.

    This means ##A_{kj}^* = 1 = \cos\theta - i\sin\theta##.
    Since we have no imaginary part here, ##\sin\theta = 0##, and ##\cos\theta## is bounded by 1.

    Is this the correct way to go about this proof, albeit needing a bit of polishing?
  8. Sep 19, 2015 #7
    What you get is the following expression
    ##1 = \sum_k a^*_{ki}\cdot a_{ki} = \sum_k |a_{ki}|^2##

    Here I used that the modulus squared of a complex number ##c## is given by ##|c|^2 = c^* c##.
    So we know that the sum above equals one and that the terms are non-negative.
    Can you finish this reasoning?
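    A minimal numeric illustration of this identity (my addition, assuming NumPy; the rotation matrix below is just one example of a unitary matrix):

    ```python
    import numpy as np

    # Example 2x2 unitary (here real, i.e. orthogonal) matrix: a rotation.
    theta = 0.7
    A = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    # sum_k |a_ki|^2 over each column i equals 1 ...
    col_sums = np.sum(np.abs(A) ** 2, axis=0)
    print(np.allclose(col_sums, 1.0))               # True

    # ... so each non-negative term |a_ki|^2 is at most 1.
    print(bool(np.all(np.abs(A) ** 2 <= 1 + 1e-12)))  # True
    ```
    
    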
  9. Sep 19, 2015 #8


    Gold Member

    Hm, what's throwing me off is the indexing. How did it become ##{}_{ki}## for both of them? I can see the transpose definition turned ##a_{ik}## into ##a_{ki}##, but why does the other index go from ##a_{kj}## to ##a_{ki}##?

    If ##1 = \sum_k |a_{ki}|^2##,

    then, since all the terms are non-negative, it's clear that each term must be less than or equal to one for the sum to equal one.
  10. Sep 19, 2015 #9
    Because we are looking at elements on the diagonal.
    We started from ##\delta_{ii}##. Naturally this should be adopted in the sum as well.

    The conclusion is right by the way.
  11. Sep 19, 2015 #10


    Gold Member

    Okay, I just wrote it down. I am starting to understand it more clearly now. It still is the most foreign one on this assignment, but let's see.

    1. Write down as entries of the matrix.
    ##\delta_{ij} = \sum_k \left[A^\dagger\right]_{ik}\cdot \left[A\right]_{kj}##.

    2. Next, note that we are looking at the interesting elements of the matrix: the diagonal, where the delta is not equal to 0 but instead equal to 1.
    ##\delta_{ii} = \sum_k a^*_{ki}\cdot a_{ki}##

    3. Note that ##\delta_{ii} = 1## and apply the property of complex conjugates, so that the summand becomes the modulus squared of the ##a_{ki}##'th component.
    ##1 = \sum_k |a_{ki}|^2##

    4. Now reason as follows: we are adding up non-negative terms ##|a_{ki}|^2##, and the total equals one, so each term satisfies ##|a_{ki}|^2 \le 1##, and hence ##|a_{ki}| \le 1##.


    A pretty smooth proof.

    What threw me off was the indexing of the summation. I understand the delta summation part well. I'm still foggy on the summation process. So we are summing from k=1 to k=i? The rows? The columns?
  12. Sep 19, 2015 #11
    From k=1 to k=n. Here n is the number of rows and columns our square matrix has.

    With regards to the confusion, are you using the Einstein summation convention?
    Because we don't use it here.

    The sum expression is how we can define matrix multiplication. You can try it for small matrices if you still think it's fishy.
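    Trying it for a small matrix, the sum definition ##C_{ij} = \sum_{k=1}^n A_{ik} B_{kj}## can be written out explicitly (a sketch of my own, using NumPy only to cross-check the result):

    ```python
    import numpy as np

    A = [[1, 2], [3, 4]]
    B = [[5, 6], [7, 8]]
    n = 2

    # C_ij = sum over k = 1..n of A_ik * B_kj (indices 0-based here).
    C = [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]

    print(C)                                    # [[19, 22], [43, 50]]
    print(np.allclose(C, np.array(A) @ np.array(B)))  # True
    ```
    
    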
  13. Sep 19, 2015 #12


    Gold Member

    Oh, no, I am using the same summation convention as you are, I believe.

    From k=1 to k=n. Here n is the number of rows and columns our square matrix has.

    This helped me out.
    I think I simply need more practice with summation definitions of matrix elements.

    Thank you kindly for your helping hand here.