States and Group: Eigenvectors Represent One Dimension

In summary, there is a one-to-one correspondence between a set of basis vectors being eigenvectors of an operator and their providing one-dimensional representations of that operator in the vector space. If the operator is self-adjoint, it defines a unitary representation of a one-parameter Lie group. However, the simple spectral decomposition over eigenvectors is only valid for operators with a purely discrete spectrum; for self-adjoint operators with a continuous spectrum, a representation involving integrals over generalized eigenvectors is needed.
  • #1
kent davidge
Suppose a set of basis vectors are eigenvectors of some operator. So they will provide a one dimensional representation of that operator in the vector space?
 
  • #2
If the operator ##\hat{A}## is self-adjoint, then it defines a unitary representation of a one-parameter Lie group via
$$\hat{U}=\exp(-\mathrm{i} \lambda \hat{A}).$$
It's not clear to me what you mean by "representation of that operator in vector space". A representation of an operator is usually the realization of the operator in a concrete realization of the (up to isomorphy unique) separable Hilbert space. E.g., if you choose the position representation, you work with position-wave functions, ##\psi(\vec{x})## and the representation of the components of the momentum are the derivatives ##\hat{p}_j=-\mathrm{i} \hbar \partial_{j}##.

If you work in the "##A## representation", i.e., with the wave functions that are the "components" with respect to the complete orthonormal set of eigenvectors (or generalized eigenvectors, if the spectrum has continuous parts or is even entirely continuous), ##\psi(a)=\langle a|\psi \rangle##, then the representation of ##\hat{A}## is simply multiplication by ##a##, since
$$\hat{A} \psi(a)=\langle a |\hat{A} \psi \rangle = \langle \hat{A} a |\psi \rangle=a \langle a|\psi \rangle.$$
That's of course also the case in the position representation. The components of the position vector are represented by multiplication of the wave function with the component, ##\hat{x}_j \psi(\vec{x})=x_j \psi(\vec{x})##.
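The claim that a self-adjoint ##\hat{A}## generates a one-parameter unitary group can be checked numerically. As a minimal sketch (the 2×2 Hermitian matrix below is an arbitrary illustration; since it squares to the identity, ##\exp(-\mathrm{i}\lambda A)=\cos\lambda\,I-\mathrm{i}\sin\lambda\,A## in closed form):

```python
import cmath

# Illustrative Hermitian matrix A = [[0,1],[1,0]]; A^2 = I, so
# U(lam) = exp(-i*lam*A) = cos(lam)*I - i*sin(lam)*A exactly.
def u_of(lam):
    c, s = cmath.cos(lam), cmath.sin(lam)
    return [[c, -1j * s],
            [-1j * s, c]]

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(X):
    return [[X[j][i].conjugate() for j in range(2)] for i in range(2)]

U = u_of(0.7)
UdU = mat_mul(dagger(U), U)          # unitarity: U^dagger U = identity
assert all(abs(UdU[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))

# One-parameter group property: U(a) U(b) = U(a + b)
lhs = mat_mul(u_of(0.3), u_of(0.4))
rhs = u_of(0.7)
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

Both the unitarity and the additive group law ##U(\lambda_1)U(\lambda_2)=U(\lambda_1+\lambda_2)## come out to machine precision.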
 
  • #3
kent davidge said:
Suppose a set of basis vectors are eigenvectors of some operator. So they will provide a one dimensional representation of that operator in the vector space?
Then on each 1-dimensional eigenspace one has a 1-dimensional representation.
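A tiny numerical sketch of this point (matrix and vector chosen purely for illustration): restricted to the span of an eigenvector, the operator acts as multiplication by the eigenvalue, i.e., a one-dimensional representation.

```python
# Diagonal matrix, so the eigenvectors are the coordinate axes.
A = [[2.0, 0.0],
     [0.0, 3.0]]
v = [1.0, 0.0]            # eigenvector with eigenvalue a = 2

Av = [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]
assert Av == [2.0 * x for x in v]   # on span{v}, A is just "times 2"
```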
 
  • #4
vanhees71 said:
If the operator ##\hat{A}## is self-adjoint, then it defines a unitary representation of a one-parameter Lie group via
$$\hat{U}=\exp(-\mathrm{i} \lambda \hat{A}).$$
Isn't that the case only if the group is additive?
 
  • #5
Isn't this implied by the definition of "one-parameter Lie group"?
 
  • #6
vanhees71 said:
Isn't this implied by the definition of "one-parameter Lie group"?
Oh yes, sorry.
 
  • #7
kent davidge said:
Suppose a set of basis vectors are eigenvectors of some operator. So they will provide a one dimensional representation of that operator in the vector space?
I'm guessing your underlying question is whether the operator can be represented in terms of the eigenvectors. For a self-adjoint operator ##A## on a Hilbert space, with eigenvectors ##|a\rangle##, where ##a## is an eigenvalue of ##A##, one can represent $$A ~=~ \sum_a a\, |a\rangle\langle a| ~.$$ This is a simple version of "the spectral theorem". More general versions are available for more general operators, i.e., operators that are not self-adjoint but satisfy certain other properties. I could sketch more detail if indeed that was what you were really asking. (?)
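This spectral decomposition is easy to verify numerically for a small Hermitian matrix. A hedged sketch (the matrix below is an arbitrary illustration whose eigenpairs are known in closed form):

```python
import math

# A = [[0,1],[1,0]] has eigenvalues +1, -1 with normalized
# eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2).
A = [[0.0, 1.0],
     [1.0, 0.0]]
s = 1.0 / math.sqrt(2.0)
eigs = [(+1.0, [s, s]),
        (-1.0, [s, -s])]

# Spectral theorem: rebuild A as sum_a a |a><a|
rebuilt = [[sum(a * v[i] * v[j] for a, v in eigs) for j in range(2)]
           for i in range(2)]
assert all(abs(rebuilt[i][j] - A[i][j]) < 1e-12
           for i in range(2) for j in range(2))

# Completeness of the eigenkets: sum_a |a><a| = identity
complete = [[sum(v[i] * v[j] for _, v in eigs) for j in range(2)]
            for i in range(2)]
assert all(abs(complete[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
```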
 
  • #8
You mean
$$\hat{A}=\sum_a a |a \rangle \langle a|.$$
Your expression, by the completeness of the eigenkets, is just
$$\sum_a |a \rangle \langle a|=\hat{1}.$$
 
  • #9
@strangerep that's not exactly what I meant to ask, but I will come up with a better description of my doubts in a future thread.
 
  • #10
vanhees71 said:
You mean [...]
Oops! Yes -- thank you.

(Post corrected.)
 
  • #11
strangerep said:
I'm guessing your underlying question is whether the operator can be represented in terms of the eigenvectors. For a self-adjoint operator ##A## on a Hilbert space, with eigenvectors ##|a\rangle##, where ##a## is a eigenvalue of ##A##, one can represent $$A ~=~ \sum_a a\, |a\rangle\langle a| ~.$$ This is a simple version of "the spectral theorem".
This is valid only for self-adjoint operators with a purely discrete spectrum. For Hermitian operators with a continuous spectrum there are no eigenvectors in the Hilbert space. Instead one needs to embed the Hilbert space into the dual of a nuclear space and obtain an analogous representation involving distribution-valued bras and kets, with integrals in place of the sum. In the mixed-spectrum case one needs a combination of sums and integrals.
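For reference, the mixed-spectrum case can be written in the standard spectral-theorem form, with a sum over the discrete part and an integral over the continuous part,
$$\hat{A} ~=~ \sum_{a \in \sigma_{\mathrm{disc}}} a\, |a\rangle\langle a| ~+~ \int_{\sigma_{\mathrm{cont}}} \mathrm{d}a\; a\, |a\rangle\langle a| ~,$$
of which the position operator is the purely continuous example,
$$\hat{x} ~=~ \int_{-\infty}^{\infty} \mathrm{d}x\; x\, |x\rangle\langle x| ~,$$
with ##|x\rangle## the distribution-valued generalized eigenkets.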
 
  • #12
A. Neumaier said:
This is valid for self-adjoint operators with a purely discrete spectrum only. [...]
Yes, I know. The intent of my (incomplete) post was merely to try and zero in on what the OP was really asking about.
 

1. What is an eigenvector?

An eigenvector of a linear transformation is a nonzero vector that is only scaled, not rotated into a different direction, when the transformation is applied: ##A v = \lambda v##, where the scaling factor ##\lambda## is the eigenvalue.

2. How is an eigenvector related to a state or group?

Eigenvectors are often used to represent states or groups in linear algebra. In this context, they represent the "stable" or "fixed" components of a system that do not change under a given transformation.

3. What does it mean for an eigenvector to have one dimension?

Each eigenvector spans a one-dimensional subspace (its eigenspace, when the eigenvalue is non-degenerate) on which the operator acts simply as multiplication by the eigenvalue. In this sense an eigenvector provides a one-dimensional representation of the operator.

4. How are eigenvectors calculated?

Eigenvectors are calculated by first finding the eigenvalues as the roots of the characteristic equation ##\det(A - \lambda I) = 0##, and then solving the linear system ##(A - \lambda I)v = 0## for nonzero ##v##. This is typically done using standard linear algebra techniques.
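This procedure can be sketched for a 2×2 symmetric matrix (the matrix is chosen for illustration; for it, the characteristic equation is an explicitly solvable quadratic):

```python
import math

# A = [[2,1],[1,2]]: solve det(A - lam*I) = 0, then (A - lam*I) v = 0.
a, b, c, d = 2.0, 1.0, 1.0, 2.0

# Characteristic polynomial: lam^2 - (a+d)*lam + (a*d - b*c) = 0
tr, det = a + d, a * d - b * c
disc = math.sqrt(tr * tr - 4.0 * det)
lam1, lam2 = (tr + disc) / 2.0, (tr - disc) / 2.0

# For lam1: (a - lam1)*x + b*y = 0  =>  v = (b, lam1 - a), unnormalized
v = [b, lam1 - a]
Av = [a * v[0] + b * v[1], c * v[0] + d * v[1]]
assert all(abs(Av[i] - lam1 * v[i]) < 1e-12 for i in range(2))  # A v = lam1 v
```

Here the eigenvalues come out as 3 and 1, and the eigenvector for the larger eigenvalue is proportional to (1, 1).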

5. What is the significance of eigenvectors in science?

Eigenvectors have many applications in science, particularly in fields such as physics, engineering, and computer science. They are used to analyze and understand systems that involve linear transformations, such as quantum mechanics and signal processing. They also have practical applications in data analysis and data compression.
