Finding principal axes of electromagnetic stress tensor

Dazed&Confused

Homework Statement


In a certain system of units the electromagnetic stress tensor is given by $$M_{ij} = E_iE_j + B_i B_j - \frac12 \delta_{ij} ( E_kE_k + B_kB_k)$$
where ##E_i## and ##B_i## are components of the first-order tensors representing the electric and magnetic fields ##\bar{E}## and ##\bar{B}##, respectively.

b) For ##|E| = |B|## (but ##\bar{E} \neq \bar{B}##):
show that ##\bar{E} \pm \bar{B}## are principal axes of the tensor ##M##.

Homework Equations

The Attempt at a Solution



I get that this is related to diagonalisation of matrices, but I am not sure how to apply that knowledge in this case. The lecture notes I have make no mention of principal axes or diagonalisation, so I'm not at all sure how this is to be done. Any hints?
 
To say that ##\bar{E} \pm \bar{B}## are principal axes of ##M## means they are eigenvectors of ##M##. So what you have to prove is that there exist scalars ##\lambda_1,\lambda_2## such that:

$$M(\bar{E}+\bar{B})=\lambda_1(\bar{E}+\bar{B})$$
and
$$M(\bar{E}-\bar{B})=\lambda_2(\bar{E}-\bar{B})$$

By the way, your formula for ##M_{ij}## is wrong. It uses a subscript ##k## that is undefined.
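The eigenvector condition above is easy to check numerically. A minimal sketch with numpy; the random test vectors (rescaled so that ##|E| = |B|##, the case in part b) are my own construction, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random E; random B rescaled so |E| = |B|, the case in part (b).
E = rng.standard_normal(3)
B = rng.standard_normal(3)
B *= np.linalg.norm(E) / np.linalg.norm(B)

# M_ij = E_i E_j + B_i B_j - (1/2) delta_ij (E_k E_k + B_k B_k)
M = np.outer(E, E) + np.outer(B, B) - 0.5 * (E @ E + B @ B) * np.eye(3)

# E + B and E - B should each be mapped to a scalar multiple of themselves.
for v in (E + B, E - B):
    Mv = M @ v
    lam = (v @ Mv) / (v @ v)  # Rayleigh quotient; equals the eigenvalue if v is an eigenvector
    assert np.allclose(Mv, lam * v)
```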
 
andrewkirk said:
By the way, your formula for ##M_{ij}## is wrong. It uses a subscript ##k## that is undefined.
There's probably an implied summation since ##k## appears twice in each term.
 
Thanks. Yes, it's an implied summation.

For ## \bar{E} + \bar{B} ##, the ## \lambda## I got was ##E_kB_k##.
 
For ## \bar{E} - \bar{B}##, the eigenvalue was ##-E_kB_k##. I'll post the other parts of the question:

c) determine the third principal axis and

d) find all principal values

I did part d) by finding the trace of the tensor, which is the sum of the eigenvalues (I think!), so the last principal value is ##E_kE_k##. Is this correct?
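The first two principal values found above, ##\pm E_kB_k##, can be confirmed numerically. A quick sketch (the random test vectors, rescaled to satisfy ##|E| = |B|##, are my own test case):

```python
import numpy as np

rng = np.random.default_rng(1)
E = rng.standard_normal(3)
B = rng.standard_normal(3)
B *= np.linalg.norm(E) / np.linalg.norm(B)  # enforce |E| = |B|

M = np.outer(E, E) + np.outer(B, B) - 0.5 * (E @ E + B @ B) * np.eye(3)

# Principal values for E+B and E-B are +E.B and -E.B respectively.
assert np.allclose(M @ (E + B),  (E @ B) * (E + B))
assert np.allclose(M @ (E - B), -(E @ B) * (E - B))
```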
 
Dazed&Confused said:
the last principal value is ##E_kE_k##. Is this correct?
The matrix ##M## and the first two eigenvectors and eigenvalues are all symmetric between ##\bar{B}## and ##\bar{E}##, so it seems unlikely that the third eigenvalue would break that symmetry: any derivation that gives ##E_kE_k## could equally give ##B_kB_k##.

For (c) you could use the fact that eigenvectors of a symmetric linear operator with distinct eigenvalues must be mutually orthogonal.
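The orthogonality hint can be checked directly: ##M## is symmetric, and ##\bar{E}\times\bar{B}## is orthogonal to both ##\bar{E}+\bar{B}## and ##\bar{E}-\bar{B}##. A sketch (random test vectors are my assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
E = rng.standard_normal(3)
B = rng.standard_normal(3)
B *= np.linalg.norm(E) / np.linalg.norm(B)

M = np.outer(E, E) + np.outer(B, B) - 0.5 * (E @ E + B @ B) * np.eye(3)
C = np.cross(E, B)

assert np.allclose(M, M.T)         # M is symmetric
assert np.isclose(C @ (E + B), 0)  # E x B is orthogonal to E + B
assert np.isclose(C @ (E - B), 0)  # ... and to E - B
```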
 
I used the fact that ##E_kE_k = B_kB_k##, since this course section is mainly about Cartesian tensors.
 
I forgot about that bit. Fair enough then.

A symmetric expression for what you've used is ##\frac{1}{2}(|\bar{E}|^2+|\bar{B}|^2)##.
 
So if I take the cross product of the two vectors and simplify, I get a multiple of ##\bar{E} \times \bar{B}##. I'm not getting the previous eigenvalue with this.
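The simplification referred to here is the identity ##(\bar{E}+\bar{B})\times(\bar{E}-\bar{B}) = -2\,\bar{E}\times\bar{B}##, which holds for any pair of vectors; a one-line numerical check (random vectors are my test case):

```python
import numpy as np

rng = np.random.default_rng(3)
E = rng.standard_normal(3)
B = rng.standard_normal(3)

# (E+B) x (E-B) = -E x B + B x E = -2 (E x B), for any E and B.
assert np.allclose(np.cross(E + B, E - B), -2.0 * np.cross(E, B))
```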
 
  • #10
Are you sure? The cross product works for me. When I left-multiply it by M, the eigenvalue isn't exactly what you wrote in post 5, but it's not very different.
 
  • #11
Well I used the expression ## (\bar{E} \times \bar{B})_j = \epsilon_{jlm}E_l B_ m##.

 
  • #12
Strange, but I got ## -\frac16 ( |E|^2 + |B|^2) ##
 
  • #13
I got ##-\frac{1}{2}(|\bar{E}|^2+|\bar{B}|^2)##. When we calculate the first component of the matrix-vector product, we get that eigenvalue times the first component of ##\bar{E}\times\bar{B}## from the ##\delta## term in ##M##, and twelve other terms from the other two terms in ##M##, which match up neatly into six pairs, each of which cancels out. Since both the cross product and the matrix are symmetric between the three dimensions, the same must happen for the other two components.
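The cancellation described here can be checked numerically. Note that no ##|E| = |B|## assumption is needed for this eigenpair, since ##\bar{E}\cdot(\bar{E}\times\bar{B}) = \bar{B}\cdot(\bar{E}\times\bar{B}) = 0## already kills the first two terms of ##M##; a sketch with random vectors (my own test case):

```python
import numpy as np

rng = np.random.default_rng(4)
E = rng.standard_normal(3)  # no |E| = |B| restriction is needed for this eigenpair
B = rng.standard_normal(3)

M = np.outer(E, E) + np.outer(B, B) - 0.5 * (E @ E + B @ B) * np.eye(3)
C = np.cross(E, B)

# E.C = B.C = 0 kills the first two terms of M C, leaving only the delta term.
assert np.allclose(M @ C, -0.5 * (E @ E + B @ B) * C)
```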
 
  • #14
I'll write it out: ##(E_iE_j + B_iB_j) \epsilon_{jlm}E_lB_m -\frac12 \epsilon_{ilm}(E_jE_j + B_jB_j)E_lB_m ##. I multiplied this by ##\epsilon_{ilm}## to get ##2(E_jE_j + B_jB_j)E_lB_m - 3 (E_jE_j + B_jB_j)E_lB_m##. The right-hand side equals ##6\lambda E_lB_m##.
 
  • #15
I've never used ##\epsilon_{jlm}## notation. I find it unnecessarily confusing, and it doesn't seem to shorten things in a helpful way.

I just explicitly wrote out the first component of the result as ##M_{11}C_1+M_{12}C_2+M_{13}C_3## where ##\bar{C}## is the cross product, writing ##C_1=E_2B_3-E_3B_2## etc, inserted the explicit expressions for each ##M_{1j}## in terms of E and B, expanded it using the distributive law and then started cancelling. It didn't take long.
 
  • #16
By the way, it should be ##-\frac{1}{2}(|\bar{E}|^2+|\bar{B}|^2)##; I made a mistake working out the trace. Your method of finding the last eigenvalue looks like a nightmare to me, but I can't find the mistake in mine.

It should be ##M_{ij}x_j = \lambda x_i##, right?
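With the corrected third value, the full spectrum and the trace identity can be verified numerically; a sketch (random test vectors with ##|E| = |B|## are my own construction):

```python
import numpy as np

rng = np.random.default_rng(5)
E = rng.standard_normal(3)
B = rng.standard_normal(3)
B *= np.linalg.norm(E) / np.linalg.norm(B)  # |E| = |B|

M = np.outer(E, E) + np.outer(B, B) - 0.5 * (E @ E + B @ B) * np.eye(3)

# Spectrum should be {+E.B, -E.B, -1/2(|E|^2 + |B|^2)}, summing to the trace.
expected = np.sort(np.array([E @ B, -(E @ B), -0.5 * (E @ E + B @ B)]))
assert np.allclose(np.linalg.eigvalsh(M), expected)  # eigvalsh sorts ascending
assert np.isclose(np.trace(M), expected.sum())
```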
 
  • #17
Something must be wrong with my previous method, but I can rewrite ##(E_iE_j + B_iB_j) \epsilon_{jlm}E_lB_m## as ##E_i B_m \epsilon_{mjl} E_jE_l + B_i E_l \epsilon_{lmj} B_m B_j##, which equals ##\bar{E} (\bar{B} \cdot (\bar{E} \times \bar{E})) + \bar{B} (\bar{E} \cdot (\bar{B} \times \bar{B})) = 0##.
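That the first term vanishes identically can be confirmed with an explicit Levi-Civita contraction; a sketch using `np.einsum` (the explicit construction of ##\epsilon## and the random vectors are mine):

```python
import numpy as np

# Levi-Civita symbol eps[i, j, k]: +1 for even permutations, -1 for odd.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

rng = np.random.default_rng(6)
E = rng.standard_normal(3)
B = rng.standard_normal(3)

# (E_i E_j + B_i B_j) eps_{jlm} E_l B_m vanishes: eps_{jlm} E_j E_l and
# eps_{jlm} B_j B_m are contractions of antisymmetric with symmetric factors.
term = np.einsum('ij,jlm,l,m->i', np.outer(E, E) + np.outer(B, B), eps, E, B)
assert np.allclose(term, 0.0)
```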
 
  • #18
Dazed&Confused said:
It should be ##M_{ij}x_j = \lambda x_i##, right?
That's right.

Does it all work out now?
 
  • #19
I would still like to understand where my previous attempt went wrong, but I'll ask the lecturer this. Thanks for all the help.
 