# Irreducible representation of tensor field

su-ki
In Mark Srednicki's book *Quantum Field Theory*, he says that a tensor field $B^{αβ}$ with no particular symmetry can be written as

$B^{αβ} = A^{αβ} + S^{αβ} + (1/4) g^{αβ} T(x)$ (Eq. 33.6)

where $A^{αβ}$ is antisymmetric, $S^{αβ}$ is symmetric, and $T(x)$ is the trace of $B^{αβ}$.

Is there any reason for the explicit addition of the trace term? Generally we split things into symmetric and antisymmetric parts, and the trace is included in the symmetric part.

If you're talking about the general linear group GL(n), the irreducible representations are the tensors whose indices have been symmetrized in a particular way. When you restrict to the orthogonal group, there are fewer transformations in the group, and some of these representations are no longer irreducible: the operation of contraction (forming a trace with the metric) commutes with orthogonal transformations, so the trace can be split off as an invariant piece of its own.
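To illustrate the point numerically, here is a minimal sketch (Euclidean case, n = 3, where trace means $\delta_{ab} B^{ab}$) checking that the trace of a rank-2 tensor is invariant under an orthogonal transformation; all names and the example matrices are ad hoc, not from the thread:

```python
import math

# Illustrative check: contraction (the trace) commutes with orthogonal
# transformations, so the pure-trace part of a rank-2 tensor is an
# invariant subspace under O(3).

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(X):
    n = len(X)
    return [[X[j][i] for j in range(n)] for i in range(n)]

def trace(X):
    return sum(X[i][i] for i in range(len(X)))

# A rotation about the z-axis: an element of the orthogonal group O(3).
t = 0.7
R = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t),  math.cos(t), 0.0],
     [0.0,          0.0,         1.0]]

# An arbitrary rank-2 tensor with no particular symmetry.
B = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 9.0]]

# Transform B as a rank-2 tensor: B' = R B R^T.
Bp = matmul(matmul(R, B), transpose(R))

# trace(B') equals trace(B), up to floating-point error.
print(trace(B), trace(Bp))
```

The same check works for the symmetric/antisymmetric split: the antisymmetric part of $R B R^T$ equals $R$ applied to the antisymmetric part of $B$, so the decomposition is preserved by the group action.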

su-ki said:
In Mark Srednicki's book *Quantum Field Theory*, he says that a tensor field $B^{αβ}$ with no particular symmetry can be written as

$B^{αβ} = A^{αβ} + S^{αβ} + (1/4) g^{αβ} T(x)$ (Eq. 33.6)

where $A^{αβ}$ is antisymmetric, $S^{αβ}$ is symmetric, and $T(x)$ is the trace of $B^{αβ}$.

Is there any reason for the explicit addition of the trace term? Generally we split things into symmetric and antisymmetric parts, and the trace is included in the symmetric part.

This is true only if the symmetric part is traceless. In general, for any rank-2 tensor, we write
$$B^{ ab } = B^{ (ab) } + B^{ [ab] } .$$
Then we take the symmetric part and decompose it as
$$B^{ (ab) } = \left( B^{ (ab) } - \frac{ 1 }{ 4 } g^{ ab } B \right) + \frac{ 1 }{ 4 } g^{ ab } B ,$$
where $B \equiv \mbox{ Tr } ( B^{ ab } ) = g_{ ab } B^{ (ab) }$ is the trace. The tensor in parentheses on the right-hand side (call it $S^{ ab }$) is symmetric and traceless, because contracting it with $g_{ ab }$ gives
$$g_{ ab } S^{ ab } = B - \frac{ 1 }{ 4 } ( g_{ ab } g^{ ab } ) B = B - B = 0 ,$$
using $g_{ ab } g^{ ab } = 4$ in four spacetime dimensions. So, our original tensor can now be written as
$$B^{ ab } = A^{ ab } + S^{ ab } + \frac{ 1 }{ 4 } g^{ ab } B ,$$
where $A^{ ab } = - A^{ ba } \equiv B^{ [ab] }$.
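To make the algebra concrete, here is a minimal numerical sketch of this decomposition in four dimensions, assuming the metric convention diag(+1, -1, -1, -1); the function names and the sample tensor are illustrative, not Srednicki's:

```python
# Sketch of Eq. 33.6: decompose a rank-2 tensor B^{ab} into an antisymmetric
# part A, a traceless symmetric part S, and a pure-trace part (1/4) g^{ab} T.
# For a diagonal metric with entries +/-1, g_{ab} and g^{ab} coincide
# numerically, so a single list of diagonal entries suffices.

G = [1.0, -1.0, -1.0, -1.0]  # diagonal entries of g_{ab} (= g^{ab} here)

def decompose(B):
    """Return (A, S, T) with B^{ab} = A^{ab} + S^{ab} + (1/4) g^{ab} T."""
    n = len(B)
    A = [[(B[a][b] - B[b][a]) / 2 for b in range(n)] for a in range(n)]
    sym = [[(B[a][b] + B[b][a]) / 2 for b in range(n)] for a in range(n)]
    # Trace T = g_{ab} B^{ab}; only diagonal terms survive for a diagonal
    # metric (the antisymmetric part drops out of the contraction).
    T = sum(G[a] * B[a][a] for a in range(n))
    # Subtract the trace part so that S is traceless: g_{ab} S^{ab} = 0.
    S = [[sym[a][b] - (G[a] if a == b else 0.0) * T / 4 for b in range(n)]
         for a in range(n)]
    return A, S, T

# An arbitrary tensor with no particular symmetry.
B = [[1.0, 2.0, 0.0, 0.0],
     [3.0, 4.0, 0.0, 0.0],
     [0.0, 0.0, 5.0, 1.0],
     [0.0, 0.0, 2.0, 6.0]]
A, S, T = decompose(B)

# g_{ab} S^{ab} vanishes, confirming the tracelessness argument above.
trace_S = sum(G[a] * S[a][a] for a in range(4))
```

Reassembling $A^{ab} + S^{ab} + \frac{1}{4} g^{ab} T$ recovers $B^{ab}$ exactly, and each of the three pieces transforms into itself under Lorentz transformations.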

See posts #24 and 25 in

Sam

Yeah, I got it now, thank you :)

## 1. What is an irreducible representation of a tensor field?

An irreducible representation of a tensor field describes a component of the tensor that transforms only into itself under the transformations of a given symmetry group and cannot be decomposed further. Breaking a tensor into such components (for a rank-2 tensor: the antisymmetric part, the traceless symmetric part, and the trace) yields pieces that each behave in a predictable, self-contained manner.

## 2. How is an irreducible representation different from a reducible representation?

A representation is reducible if it contains a smaller subspace that is invariant under the group, so it can be decomposed into smaller representations; an irreducible representation admits no such decomposition. Irreducible representations are preferred as building blocks because every tensor representation of the groups considered here can be assembled from them.

## 3. What is the significance of irreducible representations in tensor analysis?

Irreducible representations matter in tensor analysis because they allow a systematic treatment of tensors: once a tensor is broken into irreducible components, each component can be analyzed independently, complicated operations simplify, and the tensor's transformation behavior becomes transparent.

## 4. How are irreducible representations determined?

Irreducible representations are determined using group theory: one symmetrizes and antisymmetrizes indices (for GL(n), via Young symmetrizers) and, for the orthogonal group, additionally subtracts traces formed with the metric, continuing until the remaining components cannot be decomposed further.

## 5. Can an irreducible representation be unique?

Yes. For a given tensor and symmetry group, the decomposition into irreducible components is essentially unique. However, different tensors with similar symmetry properties can contain the same irreducible representations, so the tensor's rank and symmetries must be considered when determining its decomposition.
