
Intuition behind superposition probabilities?

  1. Aug 22, 2013 #1
    Hi,
This is silly, but I'm confused as to how we use the unit circle as a representation of particle states. I've been given the formula α|0> + β|1> (or, in terms of an angle, sinθ|0> + cosθ|1> or something like that), where the probability of the particle being found in a certain state is the "amplitude" (which is...?) squared, so the probability of state |0> is α^2 as I understand it. At least that's what I think; there is a huge possibility I've got it all wrong, and I apologize if I'm making a horrible blunder of this. Anyway, it's obvious I am seriously lacking intuition about these probability amplitudes, so any help would be appreciated. Thanks!
     
  3. Aug 22, 2013 #2
    I think you are not too far off. This is a question about the very fundamentals of QM, so I apologize if I give you an answer that uses concepts that are too advanced, or if I delve into too much background info.

Let's look at your expression α|0> + β|1>. This describes a system that, when we measure it, will be found either in the state |0> or in the state |1>. As an intuitive example, we might say it is a radioactive atom which can be observed to be either decayed (corresponding to |0>) or not decayed (corresponding to |1>). The objects |0> and |1>, representing these observable states, are like the basis vectors (1,0) and (0,1) in a 2-dimensional space. [To be more precise, they are the eigenvectors of the matrix operator corresponding to measuring the system's state.]

    We call |0> and |1> "eigenstates" to distinguish them from states like (1/√2)|0> + (1/√2)|1>, which are called "superpositions". When a system is in a nontrivial superposition α|0> + β|1> [nontrivial meaning neither α nor β is zero], then when we measure it, it has a nonzero probability of being observed in either the state |0> or the state |1>. A system prepared in an eigenstate, on the other hand, will be observed in that eigenstate with 100% probability, and with 0% probability in the other eigenstate. For example, if the radioactive atom is in the state |0>, we will certainly find that it is decayed, but if it is in the state (1/√2)|0> + (1/√2)|1>, we will have a 50% probability of finding it decayed and a 50% probability of finding it not decayed.
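    If it helps to make this concrete, here is a quick Python sketch (just my own illustration, nothing standard) that simulates repeated measurements of the state (1/√2)|0> + (1/√2)|1>, using the rule that the probability of each outcome is the squared magnitude of its amplitude:

    [code]
    import numpy as np

    # Illustrative sketch: simulate repeated measurements of the state
    # (1/sqrt(2))|0> + (1/sqrt(2))|1> using the Born rule.
    alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # amplitudes
    probs = [abs(alpha)**2, abs(beta)**2]          # probabilities for |0> and |1>

    rng = np.random.default_rng(0)
    outcomes = rng.choice([0, 1], size=10_000, p=probs)

    print("fraction measured as |0>:", np.mean(outcomes == 0))  # ~0.5
    print("fraction measured as |1>:", np.mean(outcomes == 1))  # ~0.5
    [/code]

    Over many runs the observed frequencies settle near 50/50, which is all the superposition is really promising.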

    That takes care of the actual physical interpretation of QM.

Now for the mathematical aspects. First, you said that the state α|0> + β|1> can equally well be expressed as cosθ|0> + sinθ|1>. The reason is that quantum states should be normalized--i.e., they live on the unit circle in the space spanned by the basis vectors |0> and |1>. (Sometimes we work with non-normalized wavefunctions and normalize them afterwards, but for now assume everything is normalized.) I should add that the cosθ/sinθ form is not completely general, because the numbers α and β can be complex.

This normalization procedure ensures that the probabilities for all the states add up to 1. The coefficients α and β are called "amplitudes" to distinguish them from "probabilities": if the state of the system is α|0> + β|1>, then the probability for it to be measured in the state |0> is αα*, and the probability for it to be measured in the state |1> is ββ*, where α* means the complex conjugate of α. [Sometimes you see αα* written as |α|², but be careful, because this is not equal to α² unless α is purely real.] So an amplitude is, loosely speaking, a complex square root of the probability. For the state to be normalized, we need 1 = P(|0>) + P(|1>) = αα* + ββ*.
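    Here is a tiny Python check (again just my own illustration) of the difference between αα* and α² for a complex amplitude, and of the normalization condition:

    [code]
    import numpy as np

    # Illustrative sketch: with complex amplitudes, probabilities are a*conj(a),
    # not a**2, and for a normalized state they sum to 1.
    alpha = (1 + 1j) / 2            # complex amplitude for |0>
    beta = 1j / np.sqrt(2)          # complex amplitude for |1>

    p0 = (alpha * np.conj(alpha)).real   # |alpha|^2 = 0.5
    p1 = (beta * np.conj(beta)).real     # |beta|^2  = 0.5

    print("alpha**2          :", alpha**2)   # complex number, NOT a probability
    print("alpha*conj(alpha) :", p0)
    print("p0 + p1 =", p0 + p1)              # normalization: should be 1
    [/code]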
     
    Last edited: Aug 22, 2013
  4. Aug 22, 2013 #3
    That is a wonderful explanation thank you!

    Another question:

    Why do complex numbers pop up so often when describing particles in Quantum Mechanics?
     
  5. Aug 22, 2013 #4
Complex numbers are fundamental to quantum mechanics. There is an i in the Schrödinger equation! One of the most fundamental equations in QM is the commutation relation [x, p] = iℏ.

    But there really is no definitive answer to that question. Feynman has a nice discussion of how "Why?" questions can land you in an infinite regress.

One way I like to think about why you NEED i in quantum mechanics has to do with the uncertainty relations and momentum-space wavefunctions. (Sorry if this is too advanced for you.) One nice form of the uncertainty principle is ΔxΔp ≥ ℏ/2. Suppose we have a particle with a precisely known momentum: Δp = 0. That forces Δx = ∞, which means we have absolutely no idea where the particle is in space: P(x) = constant. [If it weren't constant, we would at least know SOMETHING about where it is.] If the spatial probability distribution were the most fundamental thing, then a particle of any momentum would have the exact same description! Luckily, the wavefunction encodes exactly what the momentum is in its phase, which does not show up in the probability distribution. A momentum eigenstate (a state with precisely known momentum) has a wavefunction of the form e^(ikx), where k gives the momentum. The k disappears when you compute the spatial probability distribution associated with the momentum eigenstate--but it must be there to carry the momentum information.
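    To see the point numerically, here is a small Python sketch (my own, and ignoring the normalization subtleties of plane waves) showing that e^(ikx) gives exactly the same flat probability density for two very different values of k:

    [code]
    import numpy as np

    # Illustrative sketch: the "wavefunction" exp(i*k*x) for two different k.
    # The spatial probability density |psi(x)|^2 is the same flat constant
    # for both, so the momentum information lives only in the complex phase.
    x = np.linspace(0, 10, 1000)
    for k in (1.0, 5.0):
        psi = np.exp(1j * k * x)
        density = np.abs(psi)**2
        print(f"k = {k}: |psi|^2 ranges from {density.min():.6f} to {density.max():.6f}")
    # Both lines report 1.000000 to 1.000000 -- identical densities, different phases.
    [/code]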
     
  6. Aug 23, 2013 #5

    meBigGuy


    Nice explanations.

I tend to think of complex numbers as being the norm and real numbers as a special case. In electronics they tend to represent phase in one form or another. Anything that has amplitude and phase is easily manipulated or represented in the complex plane. Can one say that QM's momentum and direction are analogous to amplitude and phase?
     
  7. Aug 23, 2013 #6
Thank you Jolb, that was just what I was looking for! I didn't know (if I'm understanding things right) that momentum had "states"--is there a name for that phenomenon so I can look it up some more? It has me curious.

    That's a really neat way of looking at it.

    I'd like to know, that's an interesting thought.
     
  8. Aug 23, 2013 #7
    Actually I should probably start a new thread for this, but how do we even define amplitude and phase in QM? (I know that is a ridiculous question, sorry). I know how we define them in the context of things like water and sound, but how for particles?
     
  9. Aug 23, 2013 #8

    meBigGuy


I was actually asking that question. I really don't know. Maybe I should have said "Can one say that QM's momentum and direction are analogous to *electronics'* amplitude and phase, with respect to being expressed by complex numbers?"
     
  10. Aug 23, 2013 #9
    I hope I haven't confused you guys with the terminology "phase"...

Really the definitions of "amplitude" and "phase" are not too different from those of a boring old wave. For a boring old wave, the amplitude is related to the peak-to-peak height of the wave.

Take the wave f(x) = A sin(kx + θ). The amplitude of the wave would be associated with A. If we used the peak-to-peak definition, its amplitude would be 2A, since sin(x) has 1 as its maximum and -1 as its minimum. k is called the wavenumber, and it is proportional to the (spatial [if x is a space coordinate]) frequency and inversely proportional to the wavelength. Now the word "phase" is a little hard for me to define in language, but the concept is easy to grasp through some examples. If we compared the two waves f(x) = A sin(kx + θ1) and f'(x) = A sin(kx + θ2), we would say that there is a phase difference of θ1 − θ2 between the two waves. [You may want to include a "modulo 2π".] So sin(x) would have a 90° phase difference relative to cos(x), and -sin(x) would have a 180° phase difference relative to sin(x). The "phase" of a wave basically tells you "where along its cycle" the wave is. Notice that phase is typically a relative thing between two waves.

As a purely mathematical fact, you can easily show e^(ikx) = cos(kx) + i sin(kx). So that's very wavelike, but it isn't quite like sin(x) or cos(x) because it never passes through zero. As x increases, e^(ikx) traces out the unit circle in the complex plane. [This gives us the interesting fact that this wave times its complex conjugate is always the constant 1, regardless of k--as in my previous example.] But we would still say that this wave has wavenumber k, and there would be a 180° phase difference between e^(ikx) and e^(i(kx+π)). Basically all the terminology is exactly the same as with sine and cosine: amplitude, wavelength, frequency, etc.

Now in quantum mechanics it turns out that if e^(ikx) is the particle's wavefunction, then the particle has momentum p = ℏk, where ℏ is the reduced Planck constant. When I say that the "wavefunction encodes the momentum in its phase", I'm being a little vague--I would say "the property of the wavefunction that encodes the momentum is how rapidly the wavefunction's phase changes from one point to the next". If I wanted to be more colorful I could say "the property of the wavefunction that encodes the momentum is how rapidly the wavefunction's position on the unit circle in the complex plane changes from point to point." A particle whose wavefunction stays roughly at the same spot on the unit circle from point to point would have a small momentum, whereas one with a wavefunction that rapidly goes around the complex unit circle would have a high momentum. A really important point, though, is that the phase is not directly observable; the only way in principle one could observe it is by interfering one wavefunction with another, which only measures the phase difference between the two wavefunctions.
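    Just to make that last paragraph concrete, here's a little Python sketch (my own illustration, in units where ℏ = 1) that recovers k--and hence the momentum--from how fast the phase of e^(ikx) winds from point to point:

    [code]
    import numpy as np

    # Illustrative sketch: the momentum is encoded in how fast the phase winds.
    # Recover k from the local phase change of psi(x) = exp(i*k*x).
    hbar = 1.0                      # work in units where hbar = 1
    x = np.linspace(0, 10, 1001)
    dx = x[1] - x[0]

    for k in (0.5, 3.0):
        psi = np.exp(1j * k * x)
        # phase change between neighbouring points, unwrapped to avoid 2*pi jumps
        phase = np.unwrap(np.angle(psi))
        k_est = np.mean(np.diff(phase)) / dx
        print(f"true k = {k}, recovered k = {k_est:.4f}, p = hbar*k = {hbar * k_est:.4f}")
    [/code]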

Really, though, saying all this stuff in words is a little risky. I'm being kind of Michio Kaku-ish in making things sound really cool and weird, but the better way to learn this is to open a textbook and go through the math. It's not nearly as mysterious as it sounds in words.
No, I wouldn't say that. The analogy doesn't go much further than the fact that complex numbers are easier to work with than equivalent structures like 2x1 matrices. Moreover, from a classical E&M perspective, all the physical quantities describing the electronics are real numbers [since all physical quantities are measurable in classical mechanics], so you'd be led to believe the imaginary numbers used in electronics are just for convenience--whereas in QM the fundamental object describing the system (the wavefunction) is necessarily complex.


    PS: Here is an elegantly worded dictionary definition of "phase" from m-w.com:
     
    Last edited: Aug 23, 2013
  11. Aug 23, 2013 #10
Well, if you learn a little QM math, you'll find that eigenstates of any possible measurement correspond to eigenvectors/eigenfunctions of operators. Let me give a little intro.

    An operator is basically a mathematical object that "acts on" or does something to whatever is sitting to its right. We will denote that a symbol represents an operator by putting a tilde over it, for example [itex]\tilde{o}[/itex]. Let us define a simple example operator: the differential operator [itex]\tilde{D}[/itex]. The definition of the differential operator would be: [tex]\tilde{D}f(x)=\frac{\partial f}{\partial x}[/tex]
    Like the example above, we define all operators implicitly in terms of what they do to objects.

Now in quantum mechanics, performing a given measurement corresponds to some operator. Quantum mechanical operators can usually be represented as either a matrix or some sort of derivative. For simplicity, if some operator corresponds to performing the measurement of some [measurable] quantity [itex]q[/itex], then we will denote that operator by [itex]\tilde{q}[/itex]. Eigenstates of a measurement correspond to eigenvectors (if the operator is a matrix operator) or eigenfunctions (if the operator is a differential operator) of that operator. [I explained what an eigenstate is in an earlier post, but recall that it corresponds to a possible outcome of the measurement of the state.] The definition of an eigenvector or eigenfunction of an operator is: if an operator acts on a vector/function and the result is a scalar multiple of the original vector/function, then that vector/function is an eigenvector/eigenfunction of that operator. To be more explicit:
    If [itex]\tilde{q}f(x)=cf(x)[/itex] for some constant [itex]c[/itex], then [itex]f(x)[/itex] is defined as an eigenfunction of the operator [itex]\tilde{q}[/itex], and [itex]c[/itex] is defined as the eigenvalue of the eigenfunction [itex]f(x)[/itex]. You can easily generalize this definition to operators acting on vectors and other objects--and that would leave you with a general mathematical definition for what eigen[insert object]s are. They all are associated with an operator and they all come with an eigenvalue.
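    As a concrete matrix example of that definition (just my own illustration), here's a Python snippet that finds the eigenvectors and eigenvalues of a simple Hermitian 2x2 operator and checks the defining property:

    [code]
    import numpy as np

    # Illustrative sketch: eigenvectors and eigenvalues of a simple matrix operator.
    # Take the (Hermitian) Pauli-X matrix as an example measurement operator.
    sigma_x = np.array([[0, 1],
                        [1, 0]], dtype=complex)

    eigvals, eigvecs = np.linalg.eigh(sigma_x)
    for val, vec in zip(eigvals, eigvecs.T):
        # Check the defining property: operator @ vec == eigenvalue * vec
        print("eigenvalue:", val, " is eigenvector:", np.allclose(sigma_x @ vec, val * vec))
    [/code]

    Note that the eigenvalues come out real (±1 here), which fits with the next point about measured numbers being real.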

    Now in quantum mechanics, we find that if we are performing a measurement corresponding to the operator [itex]\tilde{q}[/itex], then the number we measure [itex]q[/itex] will always be the eigenvalue corresponding to whichever eigenstate the system is in. Since this is a number we're actually measuring, it must be a real number. So the eigenvalues of quantum mechanical operators are always real numbers.

    Now to give you an example which shows what a momentum eigenstate is. It turns out that the operator [itex]\tilde{p}[/itex] corresponding to momentum can be defined implicitly according to [itex]\tilde{p}f(x)=-i\hbar\frac{\partial f}{\partial x}[/itex] where the object it acts on [itex]f(x)[/itex] is a wavefunction.

    What would the eigenfunctions of that operator be? For it to be an eigenfunction we must have that [itex]cf(x)=\tilde{p}f(x)=-i\hbar\frac{\partial f}{\partial x}[/itex] for some real constant [itex]c[/itex].
    So we want to solve the differential equation
    [itex]-i\hbar\frac{\partial f}{\partial x}=cf(x)[/itex].
The solution to this equation is [itex]f(x) = e^{ikx}[/itex], where [itex]k = c/\hbar[/itex]. The [itex]i[/itex] must be in the exponential because we require [itex]c[/itex] to be real. So that's what the eigenstates of momentum are. What are the eigenvalues? [tex]\tilde{p}f(x)=-i\hbar\frac{\partial f}{\partial x}=-i\hbar \frac{\partial}{\partial x}\left[e^{ikx}\right]=-i\hbar(ik)e^{ikx}=\hbar k f(x)[/tex]
So we have the formula I said before, p = ℏk.
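    If you want to see that eigenvalue equation in action numerically, here's a short Python sketch (my own, with ℏ set to 1) that applies -iℏ d/dx to e^(ikx) with a finite-difference derivative and checks that the result is just ℏk times the original function:

    [code]
    import numpy as np

    # Illustrative sketch: apply the momentum operator -i*hbar*d/dx (via a
    # finite-difference derivative) to f(x) = exp(i*k*x) and check that the
    # result is hbar*k times f(x), i.e. that f is an eigenfunction.
    hbar, k = 1.0, 2.5
    x = np.linspace(0, 10, 100001)
    f = np.exp(1j * k * x)

    df_dx = np.gradient(f, x)          # numerical derivative df/dx
    pf = -1j * hbar * df_dx            # momentum operator applied to f

    ratio = pf[1:-1] / f[1:-1]         # should be hbar*k everywhere (skip edges)
    print("hbar*k =", hbar * k, " numerical eigenvalue ≈", np.mean(ratio).real)
    [/code]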
     
    Last edited: Aug 23, 2013
  12. Nov 3, 2013 #11
    Awesome!
     
  13. Nov 3, 2013 #12

    bhobba


    Deep question.

    First to Jolb - great answers mate - well done.

    There are a few reasons. I will give two.

It's a very strange but true fact that if one takes an equation from advanced classical mechanics called the Hamilton-Jacobi equation, and if you allow complex numbers--lo and behold, you get Schrödinger's equation. Feynman actually sorted out why this is. Advanced classical mechanics is based on what's called the principle of least action--you can read up on what that is--but for this purpose it's a way of figuring out what paths a classical particle can follow. Now it turns out that complex numbers are crucial in deriving that from QM--it simply doesn't work otherwise.

    The second reason is a very deep theorem required to make sense out of symmetries in QM - called Wigner's theorem:
    http://en.wikipedia.org/wiki/Wigner's_theorem

    That theorem is only true if you allow complex numbers.

    Thanks
    Bill
     
    Last edited: Nov 3, 2013
  14. Nov 3, 2013 #13
    Cool! I'll look into those. Thanks!
     
  15. Nov 3, 2013 #14

    bhobba


    No problem.

    Also check out the following:
    http://arxiv.org/pdf/1204.0653v1.pdf

    I mentioned the need for complex numbers due to the importance of Wigner's Theorem in QM with regard to symmetries.

    Why are symmetries important? Check out:
    http://www.pnas.org/content/93/25/14256.full

Once you understand that, you will realize it's quite likely the deepest and most dazzling result modern physics has uncovered--but I will stop there--you need to grasp it for yourself to appreciate its true significance.

    Thanks
    Bill
     
    Last edited: Nov 3, 2013
  16. Nov 4, 2013 #15
Now I'm excited. Thank you for the links; I look forward to reading them!
     