Commutator's matrix representation

Javier2808
TL;DR Summary: Misunderstanding the matrix representation of the [H,x] commutator
Hello!

I have checked the commutator matrix form of $$\vec{p}=im/\hbar [H,\vec{x}]$$ but I realized I don't understand something.

I have $$[H,\vec{x}]=H\vec{x}-\vec{x}H$$ and

$$(H\vec{x})_{i}=H_{ij}x_j \qquad \text{and} \qquad (\vec{x}H)_{i}=x_jH_{ji},$$
but what is the matrix representation of the second term?
$$\vec{x} H=
\left(
\begin{array}{c}
x_{1} \\
x_{2} \\
x_{3}
\end{array}
\right)
\left(
\begin{array}{ccc}
H_{11} & H_{12}& H_{13}\\
H_{21} & H_{22}& H_{23}\\
H_{31} & H_{32}& H_{33}\\
\end{array}
\right) = ?$$

Should I introduce the transposed vector?
$$\vec{u}=\vec{x}^T H=
\left(
\begin{array}{ccc}
x_{1}&x_{2} &x_{3}
\end{array}
\right)
\left(
\begin{array}{ccc}
H_{11} & H_{12}& H_{13}\\
H_{21} & H_{22}& H_{23}\\
H_{31} & H_{32}& H_{33}\\
\end{array}
\right) = \left(
\begin{array}{ccc}
u_{1}&u_{2} &u_{3}
\end{array}
\right)
$$
It is a row vector, but
$$\vec{w}=H \vec{x} =
\left(
\begin{array}{ccc}
H_{11} & H_{12}& H_{13}\\
H_{21} & H_{22}& H_{23}\\
H_{31} & H_{32}& H_{33}\\
\end{array}
\right) \left(
\begin{array}{c}
x_{1} \\
x_{2} \\
x_{3}
\end{array}
\right)= \left(
\begin{array}{c}
w_{1} \\
w_{2} \\
w_{3}
\end{array}
\right)$$
is a column vector, so I cannot sum them. What is wrong?


Thanks for your help
 
You take the H matrix and the x and p vectors. All three are operators and can be expressed as matrices:
$$\mathbf{H}=\int|e\rangle\langle e|\,de$$
$$\mathbf{x}=\int|x\rangle\langle x|\,dx$$
$$\mathbf{p}=\int|p\rangle\langle p|\,dp$$
as an example with continuous eigenvectors.
 
I'm a little confused because we also use $$X_x, X_y, X_z$$. Are they also matrices?
 
I am not sure what you mean by upper- or lower-case letters and suffixes.
As for the three dimensions of x and p:
$$\mathbf{x}=\int|x\rangle\langle x|\,dx$$
$$\mathbf{y}=\int|y\rangle\langle y|\,dy$$
$$\mathbf{z}=\int|z\rangle\langle z|\,dz$$
$$\mathbf{p_x}=\int|p_x\rangle\langle p_x|\,dp_x$$
$$\mathbf{p_y}=\int|p_y\rangle\langle p_y|\,dp_y$$
$$\mathbf{p_z}=\int|p_z\rangle\langle p_z|\,dp_z$$
You can take each of them as an operator, which can also be written as a matrix.

Javier2808 said:
I have checked the commutator matrix form of $$\vec{p}=im/\hbar [H,\vec{x}]$$
I do not think this formula for momentum holds.

EDIT: As pointed out below, ##|a\rangle\langle a|## in the above formulas is wrong; it should be ##|a\rangle a\langle a|##, with the eigenvalue inserted.
 
mitochan said:
$$\mathbf{x}=\int|x\rangle\langle x|\,dx$$
as an example with continuous eigenvectors.
No, that is the identity operator.
Once you have a set of complete orthonormal states, you can write any operator as ##\hat{A} = \int d \alpha \, d \beta \ |\alpha \rangle \langle \beta | \ A( \alpha , \beta )## in the continuous case, or as ##\hat{A} = \sum_{m,n} |m\rangle \langle n | \ A_{mn}## in the discrete case.
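Spelling this out for the commutator in the discrete case (a short worked step added here, using only the notation just defined): once ##\hat{H}## and ##\hat{x}## are expanded in the same basis, operator products become ordinary products of square (infinite) matrices,
$$\big(\hat{H}\hat{x}\big)_{mn}=\sum_k H_{mk}\,x_{kn}, \qquad \big(\hat{x}\hat{H}\big)_{mn}=\sum_k x_{mk}\,H_{kn},$$
so the commutator has the matrix representation
$$[\hat{H},\hat{x}]_{mn}=\sum_k\big(H_{mk}\,x_{kn}-x_{mk}\,H_{kn}\big).$$
No row or column vectors appear; both terms are matrices of the same shape and can be subtracted.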
 
Thanks for pointing out my mistakes.

In my formulas ##|a\rangle\langle a|## is wrong; it should be ##|a\rangle a\langle a|##, with the eigenvalues inserted.
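Written out explicitly (an added note, assuming for concreteness a purely continuous spectrum for both operators), the corrected spectral decompositions read
$$\hat{x}=\int dx\,|x\rangle\,x\,\langle x|, \qquad \hat{H}=\int dE\,|E\rangle\,E\,\langle E|,$$
while ##\int dx\,|x\rangle\langle x|=\hat{1}## is just the identity, as pointed out above.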
 
I'm a bit puzzled about the somewhat complicated answers. All you need at this level of QM is the Heisenberg algebra of position and momentum operators (just saying that the momenta are the generators of spatial translations),
$$[\hat{x}_j,\hat{p}_k]=\mathrm{i} \hbar \delta_{jk}, \quad [\hat{x}_j,\hat{x}_k]=0, \quad [\hat{p}_j,\hat{p}_k]=0,$$
and the general formula
$$[\hat{A} \hat{B},\hat{C}]=\hat{A}[\hat{B},\hat{C}]+[\hat{A},\hat{C}]\hat{B}.$$
Now you have something like
$$\hat{H}=\frac{1}{2m} \hat{\vec{p}}^2+V(\hat{\vec{x}}).$$
From this (with the Einstein summation convention used, and noting that ##V(\hat{\vec{x}})## commutes with ##\hat{x}_j##)
$$\mathring{\hat{x}}_j=\frac{\mathrm{i}}{\hbar} [\hat{H},\hat{x}_j] = \frac{\mathrm{i}}{2m \hbar} [\hat{p}_k \hat{p}_k,\hat{x}_j] = \frac{\mathrm{i}}{2m \hbar} \left(\hat{p}_k [\hat{p}_k,\hat{x}_j]+[\hat{p}_k,\hat{x}_j]\hat{p}_k \right)=\frac{1}{m} \hat{p}_j,$$
as expected.

Here ##\mathring{\hat{x}}_j=\hat{v}_j## is the operator representing the time derivative of the observable ##x_j##, i.e., the corresponding velocity component.
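As a quick numerical illustration (a sketch added for illustration, not from the original post): for the 1D harmonic oscillator with ##\hbar=m=\omega=1##, the exact matrix elements of ##\hat{x}##, ##\hat{p}## and ##\hat{H}## in a truncated number basis already satisfy ##\hat{p}=\frac{\mathrm{i}m}{\hbar}[\hat{H},\hat{x}]## entry by entry; the choice of system, units, and basis size ##N## are assumptions made only for this check.
```python
import numpy as np

# Check p = (i m / hbar) [H, x] for the 1D harmonic oscillator, using the
# exact matrix elements in a truncated number basis |0>, ..., |N-1>.
# Units hbar = m = omega = 1; N is an arbitrary truncation size.
N = 20
n = np.arange(N)

a = np.diag(np.sqrt(n[1:]), k=1)   # lowering operator: <n-1|a|n> = sqrt(n)
adag = a.conj().T                  # raising operator

x = (a + adag) / np.sqrt(2)        # x_{mn} = <m|x|n>
p = 1j * (adag - a) / np.sqrt(2)   # p_{mn} = <m|p|n>
H = np.diag(n + 0.5)               # H is diagonal in its own eigenbasis, E_n = n + 1/2

comm = H @ x - x @ H               # [H, x] as an ordinary matrix commutator
print(np.allclose(1j * comm, p))   # True: (i m/hbar)[H, x] reproduces p entry by entry
```
Because ##\hat{H}## is diagonal in this basis, ##[\hat{H},\hat{x}]_{mn}=(E_m-E_n)\,x_{mn}##, so the truncation introduces no error in this particular check.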
 
Javier2808 said:
I have checked the commutator matrix form of $$\vec{p}=im/\hbar [H,\vec{x}]$$ but I realized I don't understand something.
Where does this expression come from? Please give a reference. I have never seen vector arrows used like this so my guess is that you didn't transcribe the formula correctly. Given your knowledge level, I don't think that your problem is about three spatial dimensions but about one.

Javier2808 said:
I have $$[H,\vec{x}]=H\vec{x}-\vec{x}H$$ and

$$(H\vec{x})_{i}=H_{ij}x_j$$
No. You are confusing the Hilbert space of QM with the ordinary three-dimensional space. ##\hat X## (as well as ##\hat Y## and ##\hat Z##) and ##\hat H## are operators on the Hilbert space. Already for one spatial dimension the dimension of the Hilbert space is infinite, so neither of them can be represented by a finite matrix.

Also you seem to be confused about which objects are operators and which objects are not. There are different conventions for writing operators: upper case letters (like ##X##), hats (like ##\hat x##) or a combination of both (like ##\hat X##). The first convention is widely used but may sometimes confuse beginners because there are cases like ##H##, which is also used for the classical Hamiltonian function. I recommend always using hats in your own formulas in the beginning (like @vanhees71 did in post #7), but you need to be able to understand the conventions of other people, too.
 
Well, the notation ##\hat{\vec{x}}## is quite common, i.e., you can also write
$$\hat{\vec{v}}=\frac{1}{m} \frac{1}{\mathrm{i} \hbar} [\hat{\vec{x}},\hat{H}].$$
I just added the hats to emphasize that we deal with self-adjoint operators, which represent the observables in the Dirac formalism of QT.
 
Thanks for the information that the notation is indeed used. Since the OP's interpretation of it is wrong, I'm not sure whether I should still expect a transcription error or not.
 
vanhees71 said:
I'm a bit puzzled about the somewhat complicated answers.
The OP seems to confuse the Cartesian index ##i = 1,2,3## with the QM indices ##\alpha \in \mathbb{R}## and/or ##n \in \mathbb{Z}##. This is why I did not reply directly to his posts (even though I actually answered his question).

All you need at this level of QM is the Heisenberg algebra ...
From this
$$\frac{\mathrm{i}}{\hbar} [\hat{H},\hat{x}_j] = \frac{1}{m} \hat{p}_j,$$
as expected.
You have just evaluated the commutator ##\big[ \hat{H} , \hat{x}_{i} \big]##. But this is not what the OP asked you to do. The question was about the relationship between the operator ##\big[ \hat{H} , \hat{x}_{i} \big]## and its matrix representations ##\big[ \hat{H} , \hat{x}_{i} \big](\alpha , \beta)## or ##\big[ \hat{H} , \hat{x}_{i} \big]_{mn}##. The answer follows from the superposition principle, i.e., from the relation between the "vector" ##|\Psi \rangle## and its "components" ##\Psi ( \alpha )## "along" the basis ##|\alpha \rangle##: $$|\Psi \rangle = \int_{\mathbb{R}} d \alpha \ \Psi (\alpha) |\alpha \rangle .$$ From this, you can easily show that any operator ##\hat{A}## can be written (in terms of its matrix representation ##A(\alpha , \beta)##) as $$\hat{A} = \int_{\mathbb{R}^{2}} d \alpha \, d \beta \ A(\alpha , \beta ) |\alpha \rangle \langle \beta |,$$ where ##A(\alpha , \beta) = \langle \alpha | \hat{A} |\beta \rangle##.
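To tie this back to the formula in post #1 (a short added consequence, assuming for concreteness a discrete energy eigenbasis ##\hat{H}|r\rangle=E_r|r\rangle##, with indices ##r,s## chosen so as not to clash with the mass ##m##): the matrix representation of the momentum follows immediately,
$$p_{rs}=\frac{\mathrm{i}m}{\hbar}\,[\hat{H},\hat{x}]_{rs}=\frac{\mathrm{i}m}{\hbar}\,(E_r-E_s)\,x_{rs},$$
since ##\langle r|\hat{H}\hat{x}|s\rangle=E_r x_{rs}## and ##\langle r|\hat{x}\hat{H}|s\rangle=x_{rs}E_s##.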
 
Ok, and this I should have guessed from #1?...
 
