Basic Questions in linear algebra and group theory

Heisenberg1993
1- How can one infer from the determinant of a matrix whether the latter is real or complex?
2- Can we have tensors in an N-dimensional space with more indices than N?
 
1. I don't think you can. Is there a reason why you're asking?

2. No. The components of a tensor are the output you get when you take basis vectors as input. So the indices label basis vectors as well as tensor components. For example, the components of the metric tensor at a point p are given by ##g_{ij}(p)=g_p(e_i,e_j)##. (The metric tensor field ##g## takes each point ##p## to a "metric at p", which I'm denoting by ##g_p##).
 
Fredrik said:
1. I don't think you can. Is there a reason why you're asking?

2. No. The components of a tensor are the output you get when you take basis vectors as input. So the indices label basis vectors as well as tensor components. For example, the components of the metric tensor at a point p are given by ##g_{ij}(p)=g_p(e_i,e_j)##. (The metric tensor field ##g## takes each point ##p## to a "metric at p", which I'm denoting by ##g_p##).
Thanks for the reply
Concerning the first question: in A. Zee's book "Quantum Field Theory in a Nutshell", it is stated that "any orthogonal matrix can be written as ##O=e^A##. From the conditions that ##O^TO=1## and ##\det(O)=1##, we can infer that A is real and anti-symmetric." From the first condition, I deduced anti-symmetry. However, I didn't know how to deduce that A is real from the second. That's why I asked the question.
Concerning the second question: can't I define a rank-3 tensor, which takes 3 basis vectors as input, in a two-dimensional space? I will get two indices with the same value, but what's wrong with that (assuming the tensor is not anti-symmetric)?
 
Heisenberg1993 said:
Thanks for the reply
Concerning the first question: in A. Zee's book "Quantum Field Theory in a Nutshell", it is stated that "any orthogonal matrix can be written as ##O=e^A##. From the conditions that ##O^TO=1## and ##\det(O)=1##, we can infer that A is real and anti-symmetric." From the first condition, I deduced anti-symmetry. However, I didn't know how to deduce that A is real from the second. That's why I asked the question.
I'm puzzled by this too. The term "orthogonal" is only used for matrices whose components are real numbers. So wouldn't a theorem that says that there's an A such that ##O=e^A## mean that the components of A are real? When we're dealing with a real vector space, every matrix has real components unless we explicitly say otherwise.

If the theorem says that A may be complex, then it makes sense to try to prove that the conditions imply that A is real. But I don't see how to prove it either.
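The converse direction is at least easy to check numerically: exponentiating a real anti-symmetric matrix does produce a special orthogonal matrix. Here's a minimal sketch in pure Python (the helper names `mat_mul` and `mat_exp` are mine, and `mat_exp` is just a truncated power series):

```python
import math

def mat_mul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(A, terms=40):
    """Truncated power series e^A = sum_n A^n / n! for a 2x2 matrix."""
    result = [[1.0, 0.0], [0.0, 1.0]]  # start with the identity (n = 0 term)
    power = [[1.0, 0.0], [0.0, 1.0]]
    fact = 1.0
    for n in range(1, terms):
        power = mat_mul(power, A)
        fact *= n
        for i in range(2):
            for j in range(2):
                result[i][j] += power[i][j] / fact
    return result

theta = 0.3
A = [[0.0, -theta], [theta, 0.0]]  # real and anti-symmetric
O = mat_exp(A)

# O should be the rotation matrix [[cos t, -sin t], [sin t, cos t]],
# which satisfies O^T O = I and det(O) = 1.
det = O[0][0] * O[1][1] - O[0][1] * O[1][0]
```

So a real anti-symmetric A always lands in SO(n); the open question in this thread is whether the two conditions force A to be real in the first place.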

Heisenberg1993 said:
Concerning the second question, can't I define a 3-rank tensor, which takes 3 basis vectors as an input, in a two dimensional space?
You can. You will have three indices that each take values in the set {1,2}.
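To make that concrete, here's a quick sketch of a rank-3 tensor on a 2-dimensional space, stored as a nested list. Each of the three indices runs over {0, 1}, giving ##2^3 = 8## components, and repeated index values are perfectly fine:

```python
n = 2  # dimension of the space

# A rank-3 tensor on a 2-dimensional space has n**3 = 8 components
# T[i][j][k], with each index taking values in {0, 1}.
T = [[[0.0 for _ in range(n)] for _ in range(n)] for _ in range(n)]

# Nothing stops two indices from taking the same value:
T[0][0][1] = 3.5

# Count the independent component slots:
count = sum(1 for i in range(n) for j in range(n) for k in range(n))
```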
 
Heisenberg1993 said:
1- How can one infer from the determinant of a matrix whether the latter is real or complex?
[...]

1. You can't. Check out the matrix ##\operatorname{diag}(e^{i\phi}, e^{-i\phi})##, ##\phi\in\mathbb{R}^*##. It has non-real entries but determinant 1, so the determinant alone can't tell you whether the matrix is real.
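This counterexample is easy to verify numerically with the standard library's `cmath` module, a quick sketch:

```python
import cmath

phi = 0.7  # any nonzero real angle
M = [[cmath.exp(1j * phi), 0], [0, cmath.exp(-1j * phi)]]

# det M = e^{i phi} * e^{-i phi} = 1, yet the entries are not real:
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
```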
 
Fredrik said:
I'm puzzled by this too. The term "orthogonal" is only used for matrices whose components are real numbers. So wouldn't a theorem that says that there's an A such that ##O=e^A## mean that the components of A are real? When we're dealing with a real vector space, every matrix has real components unless we explicitly say otherwise.
If the theorem says that A may be complex, then it makes sense to try to prove that the conditions imply that A is real. But I don't see how to prove it either.

You can. You will have three indices that each take values in the set {1,2}.

So in a sense we don't need the second condition to show that A is real; just the fact that we used the transpose of the matrix is enough. If the matrix were complex, then we should have used its conjugate transpose (adjoint) instead, right?
 
The adjoint is more useful when we're working with complex matrices, but we can certainly transpose a complex matrix if we want to. However, the Wikipedia definition of "orthogonal" is

In linear algebra, an orthogonal matrix is a square matrix with real entries whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors), i.e.
$$Q^TQ=QQ^T=I,$$ where ##I## is the identity matrix.
Unfortunately this doesn't imply that A is real in any obvious way. But it makes me wonder if the theorem that Zee is using to rewrite ##O## as ##e^A## really says that we may need an A that isn't real. It seems that this would have been stated explicitly in the theorem, and in that case, Zee should have stated it explicitly too.
 
I don't think Zee is that careful when talking about math, so maybe reading another book that covers the same subject would help.
 
Theorem 2.9 in "Lie groups, Lie algebras, and representations" by Brian C. Hall says that for every invertible matrix M, there's a complex matrix A such that ##M=e^A##. I'm going to take a look at the proof and see if I can understand what's going on there.
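For diagonal matrices the theorem reduces to taking a complex logarithm of each entry, and this simple case already shows why A may have to be complex even when M is real. A minimal sketch (the variable names are mine):

```python
import cmath

# Hall's Theorem 2.9: every invertible M equals e^A for some complex A.
# For a diagonal M this reduces to a complex log of each diagonal entry:
M = [[-2.0, 0.0], [0.0, 3.0]]
A = [[cmath.log(M[0][0]), 0], [0, cmath.log(M[1][1])]]

# cmath.log(-2) = ln 2 + i*pi, so A is genuinely complex even though M is real.
# Exponentiating entrywise recovers M:
back = [[cmath.exp(A[0][0]), 0], [0, cmath.exp(A[1][1])]]
```

Note that ##-2## has no real logarithm, so no real diagonal A works here; the general (non-diagonal) case in Hall's proof is of course more involved.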
 