Quantum state representation

I started studying quantum physics a few days ago, and there's something that isn't clear to me. I know that a quantum state is represented by a ray in a Hilbert space (so that ##k \left| X \right\rangle## is the same state as ##\left| X \right\rangle##). Now suppose we have these three states:
##\left| A \right\rangle = \frac{1}{\sqrt{2}} (\left| \uparrow \right\rangle + \left| \downarrow \right\rangle)##
##\left| B \right\rangle = \frac{1}{\sqrt{2}} (\alpha \left| \uparrow \right\rangle + \beta \left| \downarrow \right\rangle)##
##\left| C \right\rangle = \frac{e^{i \phi}}{\sqrt{2}} (\left| \uparrow \right\rangle + \left| \downarrow \right\rangle)##
(where ##\alpha , \beta \in \mathbb{C}## and ##\phi \in \mathbb{R}##)
The question is: do these represent the same quantum state?

Answers and Replies

Khashishi
Science Advisor
It depends on the values of ##\alpha##, ##\beta##, and ##\phi##. Also, in an isolated system, an overall phase doesn't matter. But relative phases between two objects do.
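As a quick numerical check of this point, here is a small numpy sketch (the states and the phase value are my own illustration, not from the thread): an overall phase leaves every measurement probability unchanged, while a relative phase between the two components does not.

```python
import numpy as np

up = np.array([1, 0], dtype=complex)    # |up>,   S_z eigenvector
down = np.array([0, 1], dtype=complex)  # |down>, S_z eigenvector

psi = (up + down) / np.sqrt(2)                                # like |A>
psi_overall = np.exp(1j * 0.7) * psi                          # like |C>: overall phase
psi_relative = (up + np.exp(1j * 0.7) * down) / np.sqrt(2)    # relative phase instead

# Probability of finding spin-up along x: |<+x|state>|^2
plus_x = (up + down) / np.sqrt(2)
def p(state):
    return abs(np.vdot(plus_x, state)) ** 2

print(p(psi))           # 1.0
print(p(psi_overall))   # 1.0  -> the overall phase is unobservable
print(p(psi_relative))  # < 1  -> the relative phase is physical
```

The overall phase drops out of every ##|\langle \phi | \psi \rangle|^2##, which is why ##|A\rangle## and ##|C\rangle## are the same ray.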

Many thanks Khashishi.
I have another question: which of these states (##\left| A \right\rangle##, ##\left| B \right\rangle##, ##\left| C \right\rangle##) best represents a combination of the two states ##\left| \uparrow \right\rangle## and ##\left| \downarrow \right\rangle##?

Khashishi
Science Advisor
They all do.

My apologies, I didn't explain myself correctly.

For example, suppose we have neutrons (spin ##\frac{1}{2}##) in a state ##\left| G \right\rangle## such that ##S_x \left| G \right\rangle = \frac{\hbar}{2} \left| G \right\rangle##,
and I want to write the state ##\left| G \right\rangle## as a combination of the two eigenvectors ##\left| \uparrow \right\rangle## and ##\left| \downarrow \right\rangle## of the operator ##S_z## (where ##S_z \left| \uparrow \right\rangle = \frac{\hbar}{2} \left| \uparrow \right\rangle## and ##S_z \left| \downarrow \right\rangle = -\frac{\hbar}{2} \left| \downarrow \right\rangle##).
In this case, what is the best (or most general) representation of ##\left| G \right\rangle##?
Can all three of the states above (##\left| A \right\rangle##, ##\left| B \right\rangle##, ##\left| C \right\rangle##) correctly represent ##\left| G \right\rangle##?

Khashishi
Science Advisor
I think the most general representation is ##\left|B\right>## with the additional constraints ##\alpha = \beta## (otherwise it is not an ##S_x## eigenvector at all) and ##|\alpha|^2 = |\beta|^2 = 1## (so that it stays normalized). If there's nothing else in the system, we are free to choose the phase, and by convention we would choose ##\left|A\right>##, which is just ##\left|B\right>## with ##\alpha = \beta = 1##. The eigenvector for the other eigenvalue, ##S_x\left|G'\right>=-\frac{\hbar}{2}\left|G'\right>##, then needs to be chosen orthogonal to ##\left|G\right>##; conventionally it is ##\frac{1}{\sqrt{2}}\left(\left|\uparrow\right> - \left|\downarrow\right>\right)##, but it doesn't have to be.
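This constraint can be checked numerically. The following sketch (my own illustration, in units where ##\hbar = 1##, so ##S_x = \frac{1}{2}\sigma_x##) tests whether ##\frac{1}{\sqrt{2}}(\alpha\left|\uparrow\right> + \beta\left|\downarrow\right>)## is a ##+\frac{1}{2}## eigenvector of ##S_x## for various ##\alpha, \beta##:

```python
import numpy as np

# S_x in the S_z eigenbasis, with hbar = 1
S_x = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)

def is_plus_x_eigenvector(alpha, beta):
    """True if (alpha|up> + beta|down>)/sqrt(2) satisfies S_x v = (1/2) v."""
    v = np.array([alpha, beta], dtype=complex) / np.sqrt(2)
    return np.allclose(S_x @ v, 0.5 * v)

print(is_plus_x_eigenvector(1, 1))                              # True:  state |A>
print(is_plus_x_eigenvector(np.exp(0.3j), np.exp(0.3j)))        # True:  state |C>
print(is_plus_x_eigenvector(1, 2))                              # False: generic |B>
print(is_plus_x_eigenvector(1, -1))                             # False: that's |G'>
```

Any ##\alpha = \beta## works (an overall phase), while unequal coefficients give a different ray or no eigenvector at all.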

They don't represent exactly the same thing. Choosing different phase factors affects what the operators look like in that representation; the Pauli matrices, for example, would take different forms. But physically we only care about probabilities, which are squared moduli of inner products of state vectors. So if we don't care about the form of the wave function itself, we are free to choose the complex phase in ##|C\rangle##.

vanhees71
Science Advisor
Gold Member
2019 Award
In post #1, ##|A \rangle## and ##|C \rangle## represent the same state, because the one is just the other multiplied by a complex number; since ##\phi \in \mathbb{R}##, it is even a pure phase factor, which doesn't change the normalization of the state. Since ##|\uparrow \rangle## and ##|\downarrow \rangle## are linearly independent (they are orthogonal to each other), ##|B \rangle## can represent the same state as ##|A \rangle## and ##|C \rangle## only if ##\alpha=\beta##.

Now to the question concerning the eigenvectors of ##\hat{\sigma}_x## (here the spin in units of ##\hbar##, i.e., ##\hat{\sigma}_x = \hat{S}_x/\hbar##). You work in the eigenbasis of ##\hat{\sigma}_z##, where
$$\hat{\sigma}_x=\frac{1}{2} \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$
To solve the eigenvalue problem you start with the characteristic polynomial of this matrix to find the eigenvalues:
$$P(\lambda)=\mathrm{det} (\hat{\sigma}_x-\lambda \hat{1})=\lambda^2-1/4.$$
The eigenvalues are the zeros of this polynomial, i.e., there are 2 different eigenvalues
$$\lambda_{1,2}=\pm \frac{1}{2}.$$
To find the eigenvector for ##\sigma_x=+1/2## you have to solve
$$\hat{\sigma}_x \begin{pmatrix} \alpha \\ \beta \end{pmatrix}=\frac{1}{2} \begin{pmatrix} \alpha \\ \beta \end{pmatrix}.$$
You only need the first component of this equation:
$$\frac{1}{2}\beta=\frac{1}{2} \alpha.$$
Thus you get
$$|\sigma_x=1/2 \rangle=\alpha(|\uparrow \rangle + |\downarrow \rangle).$$
Provided the ##\sigma_z## eigenbasis is orthonormal, you can choose ##\alpha=1/\sqrt{2}## to normalize the eigenvector,
$$\langle \sigma_x | \sigma_x \rangle=1.$$
For the other eigenvector for the eigenvalue ##\sigma_x=-1/2## you should do the calculation yourself.
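The whole calculation above can also be cross-checked numerically. A small sketch (my own illustration, using `numpy.linalg.eigh` on the Hermitian matrix from this post):

```python
import numpy as np

# sigma_x in the sigma_z eigenbasis, with the 1/2 convention used above
sigma_x = 0.5 * np.array([[0.0, 1.0], [1.0, 0.0]])

# eigh returns eigenvalues in ascending order, with orthonormal
# eigenvectors as the columns of the second return value.
eigvals, eigvecs = np.linalg.eigh(sigma_x)
print(eigvals)              # eigenvalues: -1/2 and +1/2

v_plus = eigvecs[:, 1]      # eigenvector for the eigenvalue +1/2
print(v_plus)               # proportional to (1, 1)/sqrt(2), up to sign
print(np.vdot(v_plus, v_plus))  # ~1.0: already normalized
```

The components agree (up to an overall sign, i.e., a phase choice) with the ##\frac{1}{\sqrt{2}}(|\uparrow\rangle + |\downarrow\rangle)## found by hand.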

Thank you all for the help :)