
Representations, states and tensors

by spookyfish
Tags: lie algebra, representation, tensor
spookyfish
#1
Jul11-14, 11:53 AM
P: 48
Hi. I am currently studying about representations of Lie algebras. I have two questions:

1. As I understand it, when we say a "representation" in the context of Lie algebras, we don't mean the matrices (satisfying the appropriate Lie algebra) but rather the states on which they act. But then the states seem to contain less information than the matrices, since several representations (of a given dimension) could act on the same states.

I am trying to understand why these states contain the same information as the matrices. Is it because by "states" we actually mean weight vectors, which carry not only the vectors themselves but also their eigenvalues (weights), so that we could build up the matrices from them, and finding them is therefore equivalent to finding the generator matrices?

2. When we speak about "tensor representations" (for example, SU(n) tensors, which can also be described by Young Tableaux), what is their relation to the representations I mentioned in question 1 above?

Thanks.
samalkhaiat
#2
Jul11-14, 02:16 PM
Sci Advisor
P: 892
If I write
[tex][2] \otimes [2] = [3] \oplus [1][/tex]
Do you know the meaning of those numbers and their connection to "representation spaces", "states", "matrices" and "tensor representation" of [itex]SU(2)[/itex] and its algebra?
spookyfish
#3
Jul11-14, 03:25 PM
P: 48
I know the connection to states: the LHS is the direct product of two 2-dimensional state spaces, and the RHS is the 3-dimensional and 1-dimensional state spaces corresponding to the irreps.

samalkhaiat
#4
Jul11-14, 05:33 PM
Sci Advisor
P: 892

Ok, [itex][2][/itex] is the smallest-dimensional (defining) representation space, i.e., a 2-dimensional linear vector space [itex]V^{ (2) }[/itex]. The elements of such a space are 2-component vectors [itex]v_{ i }[/itex]. The [itex]2 \times 2[/itex] matrices of [itex]SU(2)[/itex] act naturally on [itex]v_{ i }[/itex], mixing its components (states).
So, [itex][2] \otimes [2][/itex] is the 4-dimensional (reducible) representation space [itex]V^{ (4) }[/itex], which contains 2 invariant (irreducible) subspaces [itex]V^{ (3) }[/itex] and [itex]V^{ (1) }[/itex]. Again, you can think of the elements of [itex][3][/itex] (3 states) as 3-component vectors or as a rank-2 symmetric tensor, and the elements of the space [itex][1][/itex] (one state) as a 1-component scalar or an anti-symmetric rank-2 tensor.
So, we can translate [itex]V^{ (4) } = V^{ (3) } \oplus V^{ (1) }[/itex] into a statement about reducing the tensor product [itex]u_{ i } v_{ j } \equiv T_{ i j } \in V^{ (4) }[/itex] into irreducible tensors. Simply, decompose the tensor product into symmetric and anti-symmetric tensors:
[tex]u_{ i } v_{ j } = \frac{ 1 }{ 2 } ( u_{ i } v_{ j } + u_{ j }v_{ i } ) + \frac{ 1 }{ 2 } ( u_{ i } v_{ j } - u_{ j } v_{ i } ) ,[/tex]
or
[tex]T_{ i j } = T_{ ( i j ) } + T_{ [ i j ] }[/tex]
For SU(2), the (irreducible) symmetric tensor [itex]T_{ ( i j )}[/itex] has 3 independent components (3 states) and the anti-symmetric tensor [itex]T_{ [ i j] }[/itex] has only one component (one state). The (3 + 1) states, vectors or tensors do not mix.
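If it helps to see this concretely, here is a minimal numerical sketch (assuming numpy and scipy are available; the numbers a, b, c are just an arbitrary choice of group element): it exponentiates the spin-1/2 generators to get an SU(2) matrix, forms its action [itex]U \otimes U[/itex] on [itex][2] \otimes [2][/itex], and checks that this action commutes with the projectors onto the symmetric (3-state) and anti-symmetric (1-state) subspaces, i.e. that the two pieces never mix.
[code]
import numpy as np
from scipy.linalg import expm

# Spin-1/2 generators of su(2) in the defining representation [2]
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

# A generic SU(2) element U = exp(i(a*Sx + b*Sy + c*Sz)); a, b, c arbitrary
a, b, c = 0.3, -1.1, 0.7
U = expm(1j * (a * sx + b * sy + c * sz))

# Its action on the 4-dimensional product space [2] (x) [2]
U4 = np.kron(U, U)

# Swap operator P exchanging the two factors (e_i (x) e_j -> e_j (x) e_i),
# and the projectors onto the symmetric and anti-symmetric subspaces
P = np.zeros((4, 4))
for i in range(2):
    for j in range(2):
        P[2 * j + i, 2 * i + j] = 1
P_sym = (np.eye(4) + P) / 2
P_asym = (np.eye(4) - P) / 2

# U4 commutes with both projectors: the symmetric (3-state) and
# anti-symmetric (1-state) pieces are invariant and never mix
print(np.allclose(U4 @ P_sym, P_sym @ U4))    # True
print(np.allclose(U4 @ P_asym, P_asym @ U4))  # True
print(int(round(np.trace(P_sym).real)), int(round(np.trace(P_asym).real)))  # 3 1
[/code]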

Sam
spookyfish
#5
Jul11-14, 10:36 PM
P: 48
Thanks. That answers my second question, i.e. it connects the tensors to the state representations. So the irreducible representations are given by symmetric and antisymmetric tensors. But I still don't see why this holds: I have read that in general the irreps of SU(N) are tensors with definite symmetry or antisymmetry, but I don't know why that is true.

Also, I would still like an answer to my question 1: why are the states the representation in the first place? I thought representations were supposed to be matrices (generators) satisfying the Lie algebra.
The_Duck
#6
Jul11-14, 10:53 PM
P: 844
Quote by spookyfish:
Also, I would still like an answer to my question 1: why are the states the representation in the first place? I thought representations were supposed to be matrices (generators) satisfying the Lie algebra.
We often use the same word "representation" to refer both to a set of matrices that satisfy the Lie algebra commutation relations and to the vector space on which those matrices act.
strangerep
#7
Jul12-14, 04:16 AM
Sci Advisor
P: 1,905
Quote by spookyfish:
why are the states the representation in the first place? I thought representations were supposed to be matrices (generators) satisfying the Lie algebra.
Strictly speaking, the "representation" is the mapping from the abstract generators to specific matrices acting on a linear space.

People often say that the matrices "represent" the abstract group generators. Similarly, the linear space is sometimes said to be the "carrier space" for the representation, or that it "carries the representation".
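To put this in symbols (the standard textbook definition, not anything specific to this thread): a representation of a Lie algebra [itex]\mathfrak{g}[/itex] on a vector space [itex]V[/itex] is a linear map
[tex]\pi : \mathfrak{g} \rightarrow \mathfrak{gl}(V) , \qquad \pi ( [ X , Y ] ) = \pi (X) \, \pi (Y) - \pi (Y) \, \pi (X) \quad \mbox{for all} \ X , Y \in \mathfrak{g} .[/tex]
Here [itex]V[/itex] is the carrier space; choosing a basis of [itex]V[/itex] turns each [itex]\pi (X)[/itex] into a matrix, and the states (weight vectors) live in [itex]V[/itex].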
spookyfish
#8
Jul12-14, 04:57 AM
P: 48
Quote by The_Duck:
We often use the same word "representation" to refer both to a set of matrices that satisfy the Lie algebra commutation relations and to the vector space on which those matrices act.
Why are the states as useful for describing the group structure as the matrices themselves? Is it because it is easy to build the matrices from the states (using weights and roots)?
samalkhaiat
#9
Jul12-14, 01:44 PM
Sci Advisor
P: 892
Quote by spookyfish:
But I still don't see why this holds: I have read that in general the irreps of SU(N) are tensors with definite symmetry or antisymmetry, but I don't know why that is true.
It is easy to check that the (anti-)symmetry of a tensor is an invariant property, i.e., the group maps (anti-)symmetric tensors onto (anti-)symmetric tensors: they never mix.
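Explicitly (a one-line check, with [itex]U[/itex] the group matrix acting on each index): if [itex]T'_{ i j } = U_{ i }{}^{ k } U_{ j }{}^{ l } T_{ k l }[/itex], then relabelling the dummy indices gives
[tex]T'_{ j i } = U_{ j }{}^{ k } U_{ i }{}^{ l } T_{ k l } = U_{ i }{}^{ k } U_{ j }{}^{ l } T_{ l k } = \pm \, U_{ i }{}^{ k } U_{ j }{}^{ l } T_{ k l } = \pm \, T'_{ i j } \quad \mbox{if} \quad T_{ l k } = \pm T_{ k l } ,[/tex]
so a symmetric tensor stays symmetric and an anti-symmetric one stays anti-symmetric under the group action.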

Also, I would still like an answer to my question 1: why are the states the representation in the first place? I thought representations were supposed to be matrices (generators) satisfying the Lie algebra.
I thought I had answered that already! What is the difference between "states" (in the linear Hilbert space) and "vectors" (in the linear representation space)? The group can be represented by unitary operators acting on the Hilbert (representation) space. In QM-1 you learn that you can (almost always) set up a one-to-one correspondence between states and vectors on one hand, and between operators and matrices on the other.
In general, a Lie algebra admits matrix as well as operator representations.

Sam
spookyfish
#10
Jul12-14, 03:32 PM
P: 48
Quote by samalkhaiat:
I thought I had answered that already! What is the difference between "states" (in the linear Hilbert space) and "vectors" (in the linear representation space)? The group can be represented by unitary operators acting on the Hilbert (representation) space. In QM-1 you learn that you can (almost always) set up a one-to-one correspondence between states and vectors on one hand, and between operators and matrices on the other.
In general, a Lie algebra admits matrix as well as operator representations.

Sam
Unfortunately that was not my question. I know that states are vectors. My question was: why do we look for the vectors and not the matrices? I thought that our goal in representation theory was to find matrix representations of groups.
samalkhaiat
#11
Jul12-14, 05:47 PM
Sci Advisor
P: 892
No: as well as the generators (matrices or operators), you also need the eigenvalues and eigenvectors of the invariant Casimir operators of the group. The Pauli matrices alone do not let you formulate a theory for a spin-1/2 particle. You also need the simultaneous eigenvectors of [itex]S^{ 2 }[/itex] and [itex]S_{ z }[/itex].
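As a concrete illustration (a minimal numerical sketch, assuming numpy is available and using the standard spin-1/2 conventions with [itex]\hbar = 1[/itex]): the Casimir [itex]S^{ 2 }[/itex] built from the Pauli matrices equals [itex]s ( s + 1 ) = 3/4[/itex] times the identity, and the eigenvectors of [itex]S_{ z }[/itex] are automatically simultaneous eigenvectors of [itex]S^{ 2 }[/itex].
[code]
import numpy as np

# Spin-1/2 generators S_i = sigma_i / 2 (hbar = 1)
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

# Casimir operator S^2 = Sx^2 + Sy^2 + Sz^2 = s(s+1) * identity, with s = 1/2
S2 = sx @ sx + sy @ sy + sz @ sz
print(np.allclose(S2, 0.75 * np.eye(2)))  # True

# Simultaneous eigenvectors of S^2 and S_z: the states |1/2, m>, m = -1/2, +1/2
vals, vecs = np.linalg.eigh(sz)
for m, v in zip(vals, vecs.T):
    s2_val = np.real(v.conj() @ S2 @ v)
    print(f"m = {m:+.1f},  <S^2> = {s2_val:.2f}")
[/code]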
spookyfish
#12
Jul12-14, 11:00 PM
P: 48
Quote by samalkhaiat:
No: as well as the generators (matrices or operators), you also need the eigenvalues and eigenvectors of the invariant Casimir operators of the group. The Pauli matrices alone do not let you formulate a theory for a spin-1/2 particle. You also need the simultaneous eigenvectors of [itex]S^{ 2 }[/itex] and [itex]S_{ z }[/itex].
Fine, but that still does not answer my question. I would appreciate it if someone could take the time to answer my previous question.


