
Basic Questions in linear algebra and group theory

  1. Nov 30, 2014 #1
    1. How can one infer from the determinant of a matrix whether the matrix is real or complex?
    2. Can we have tensors in an N-dimensional space with more than N indices?
     
  3. Nov 30, 2014 #2

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    1. I don't think you can. Is there a reason why you're asking?

    2. No. The components of a tensor are the output you get when you take basis vectors as input. So the indices label basis vectors as well as tensor components. For example, the components of the metric tensor at a point p are given by ##g_{ij}(p)=g_p(e_i,e_j)##. (The metric tensor field ##g## takes each point ##p## to a "metric at p", which I'm denoting by ##g_p##).
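    To make the recipe "components = tensor evaluated on basis vectors" concrete, here is a minimal numpy sketch. The metric ##g(u,v) = u_1 v_1 + 4 u_2 v_2## is a made-up example chosen just for illustration:

    ```python
    import numpy as np

    # Hypothetical metric on R^2 given as a bilinear map:
    # g(u, v) = u1*v1 + 4*u2*v2 (a fixed, made-up metric for illustration)
    def g(u, v):
        return u[0]*v[0] + 4.0*u[1]*v[1]

    e = np.eye(2)  # standard basis vectors e_1, e_2

    # Components g_ij = g(e_i, e_j), exactly as described above
    g_ij = np.array([[g(e[i], e[j]) for j in range(2)] for i in range(2)])
    # g_ij == [[1, 0], [0, 4]]
    ```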
     
  4. Nov 30, 2014 #3
    Thanks for the reply.
    Concerning the first question: in A. Zee's book "Quantum Field Theory in a Nutshell", it is stated that "any orthogonal matrix can be written as ##O=e^A##; from the conditions that ##O^TO=1## and ##\det(O)=1##, we can infer that A is real and anti-symmetric." From the first condition I deduced anti-symmetry, but I didn't see how to deduce from the second that A is real. That's why I asked the question.
    Concerning the second question: can't I define a rank-3 tensor, which takes 3 basis vectors as input, in a two-dimensional space? I would get two indices with the same value, but what's wrong with that (assuming the tensor is not anti-symmetric)?
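    For what it's worth, the easy direction of Zee's claim (real anti-symmetric A gives an orthogonal ##e^A## with determinant 1) is simple to check numerically. A small sketch, using a truncated Taylor series for the matrix exponential and an arbitrary angle:

    ```python
    import numpy as np
    from math import factorial

    def expm_series(A, terms=30):
        """Matrix exponential via truncated Taylor series (fine for small matrices)."""
        result = np.zeros_like(A, dtype=float)
        power = np.eye(A.shape[0])
        for k in range(terms):
            result += power / factorial(k)
            power = power @ A
        return result

    theta = 0.5
    A = np.array([[0.0, -theta],
                  [theta, 0.0]])   # real and anti-symmetric: A.T == -A

    O = expm_series(A)             # e^A is the rotation by theta
    # O.T @ O ≈ identity, det(O) ≈ 1
    ```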
     
  5. Dec 1, 2014 #4

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    I'm puzzled by this too. The term "orthogonal" is only used for matrices whose components are real numbers. So wouldn't a theorem that says that there's an A such that ##O=e^A## mean that the components of A are real? When we're dealing with a real vector space, every matrix has real components unless we explicitly say otherwise.

    If the theorem says that A may be complex, then it makes sense to try to prove that the conditions imply that A is real. But I don't see how to prove it either.

    You can. You will have three indices that each take values in the set {1,2}.
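    To make that concrete, here is a minimal numpy sketch with a made-up multilinear map ##T(u,v,w) = (u\cdot v)(a\cdot w)## (the covector ##a## is chosen arbitrarily for the example):

    ```python
    import numpy as np

    # Hypothetical rank-3 tensor on R^2: T(u, v, w) = (u . v) * (a . w)
    # for a fixed covector a -- a made-up multilinear map for illustration
    a = np.array([3.0, -1.0])

    def T(u, v, w):
        return np.dot(u, v) * np.dot(a, w)

    e = np.eye(2)  # basis vectors e_1, e_2

    # 2**3 = 8 components; each index takes values in {1, 2}, and repeated
    # index values such as T_{112} are perfectly legitimate
    comps = np.array([[[T(e[i], e[j], e[k])
                        for k in range(2)]
                       for j in range(2)]
                      for i in range(2)])
    # comps[i, j, k] = delta_ij * a_k, e.g. comps[0, 0, 0] == 3.0
    ```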
     
  6. Dec 1, 2014 #5

    dextercioby

    User Avatar
    Science Advisor
    Homework Helper

    1. No. Check out the matrix diag(##e^{i\phi}, e^{-i\phi}##), ##\phi\in\mathbb{R}^*##. It has non-zero complex entries and determinant 1.
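    A quick numerical check of this counterexample (a short sketch; the angle ##\phi## is arbitrary):

    ```python
    import numpy as np

    phi = 0.7  # any nonzero real angle
    Q = np.diag([np.exp(1j*phi), np.exp(-1j*phi)])

    det = np.linalg.det(Q)   # e^{i phi} * e^{-i phi} = 1
    # det ≈ 1, yet Q has genuinely complex entries, so the determinant
    # alone cannot tell you whether a matrix is real or complex
    ```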
     
  7. Dec 1, 2014 #6
    So in a sense we don't need the second condition to show that A is real; the fact that we used the transpose of the matrix is enough. If the matrix were complex, we would have used its adjoint (conjugate transpose) instead, right?
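    The diagonal matrix from post #5 illustrates the transpose-versus-adjoint distinction directly (a small numpy sketch):

    ```python
    import numpy as np

    phi = 0.7
    U = np.diag([np.exp(1j*phi), np.exp(-1j*phi)])  # unitary, but not orthogonal

    adjoint_check = U.conj().T @ U   # = I: the adjoint is the right tool for complex matrices
    transpose_check = U.T @ U        # = diag(e^{2i phi}, e^{-2i phi}), NOT the identity
    ```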
     
  8. Dec 1, 2014 #7

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    The adjoint is more useful when we're working with complex matrices, but we can certainly transpose a complex matrix if we want to. However, the Wikipedia definition of "orthogonal" is

    In linear algebra, an orthogonal matrix is a square matrix with real entries whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors), i.e.
    $$Q^TQ=QQ^T=I,$$ where ##I## is the identity matrix.
    Unfortunately this doesn't imply that A is real in any obvious way. But it makes me wonder if the theorem that Zee is using to rewrite ##O## as ##e^A## really says that we may need an A that isn't real. It seems that this would have been stated explicitly in the theorem, and in that case, Zee should have stated it explicitly too.
     
  9. Dec 1, 2014 #8

    ShayanJ

    User Avatar
    Gold Member

    I don't think Zee is that careful about the mathematics, so maybe reading another book that covers the same subject would help.
     
  10. Dec 1, 2014 #9

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    Theorem 2.9 in "Lie groups, Lie algebras, and representations" by Brian C. Hall says that for every invertible matrix M, there's a complex matrix A such that ##M=e^A##. I'm going to take a look at the proof and see if I can understand what's going on there.
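    The content of that theorem can be checked numerically for a diagonalizable example (a sketch under that assumption, with a made-up matrix M, not Hall's actual proof). Note M below has negative determinant, so no *real* A can work, since ##\det e^A = e^{\operatorname{tr} A} > 0## for real A; a complex logarithm exists nonetheless:

    ```python
    import numpy as np
    from math import factorial

    def expm_series(A, terms=40):
        """Matrix exponential via truncated Taylor series (adequate for small norms)."""
        result = np.zeros_like(A)
        power = np.eye(A.shape[0], dtype=A.dtype)
        for k in range(terms):
            result += power / factorial(k)
            power = power @ A
        return result

    # Invertible, but det M = -2 < 0, so no real logarithm exists
    M = np.diag([-1.0, 2.0])

    # Build a COMPLEX logarithm via the eigendecomposition (M is diagonalizable here)
    w, V = np.linalg.eig(M.astype(complex))
    A = V @ np.diag(np.log(w)) @ np.linalg.inv(V)   # np.log(-1+0j) = i*pi

    # expm_series(A) ≈ M, and A is genuinely complex
    ```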
     