I Matrix Mechanics and non-linear least squares analogy?

  1. Mar 2, 2017 #1
I have some experience with non-linear least squares curve fitting. For instance, if I want to fit a Gaussian curve to a set of data, I would use a non-linear least squares technique. A "model" matrix (built from the partial derivatives of the model with respect to its parameters) is combined with the observed data, and the solution is found by applying well-defined methods of linear algebra.

    Here is a link to a non-linear least squares method I have implemented many times in the past.
    http://mathworld.wolfram.com/NonlinearLeastSquaresFitting.html
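To make the fitting procedure concrete, here is a minimal sketch in Python using `scipy.optimize.curve_fit`, which performs this kind of iterative nonlinear least squares (the data and parameter values below are made up purely for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Gaussian model: amplitude a, center mu, width sigma
def gaussian(x, a, mu, sigma):
    return a * np.exp(-(x - mu)**2 / (2 * sigma**2))

# Synthetic "observed" data: a noisy Gaussian
rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 100)
y = gaussian(x, 2.0, 0.5, 1.2) + 0.05 * rng.normal(size=x.size)

# At each iteration the model is linearized around the current parameter
# guess (the Jacobian plays the role of the "model" matrix) and a linear
# least squares problem is solved for the parameter update.
popt, pcov = curve_fit(gaussian, x, y, p0=[1.0, 0.0, 1.0])
print(popt)  # close to the true values [2.0, 0.5, 1.2]
```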

    I bring this up because I am trying to understand how Matrix Mechanics is used in quantum physics. The above curve-fitting example is the closest I can come, from my own experience, to understanding what might be going on.

Does matrix mechanics depend on the observed results? Is Matrix Mechanics, as used in the quantum world, fitting the observed results to a "model" matrix of some kind? (density/scatter matrix?) And the eigenvalues of the combined system (observed results + model matrix) are the solutions?

    Is this a possible analogy? I am not looking to be right, I am just trying to figure out how Matrix Mechanics is actually used. Does it operate on a "wave function"? And if so, where does that "wave function" come from? It is my understanding that Matrix Mechanics and the Schrodinger Equation are completely different, but equivalent, methods, and that only the Schrodinger Equation makes use of a wave equation.
     
    Last edited: Mar 2, 2017
  3. Mar 7, 2017 #2

    DrClaude

    Staff: Mentor

    I am not completely familiar with the history of quantum mechanics, but I don't think the analogy of "fitting" is the right one to understand the reasoning of Heisenberg. I think that Wikipedia has a pretty good description of that: https://en.wikipedia.org/wiki/Matrix_mechanics

    As for the modern usage of matrices, it can be understood simply by considering a discrete complete basis set ##\{ \phi_i \}##, such as the eigenstates of the Hamiltonian,
    $$
    \hat{H} \phi_i = E_i \phi_i
    $$
    Then the state of the system can be expressed exactly as
    $$
    \psi = \sum_i c_i \phi_i
    $$
    Taking the basis set to be orthonormal, one gets the coefficients from
    $$
    \int \phi_i^* \psi d\tau = \sum_j c_j \int \phi_i^* \phi_j d \tau = c_i
    $$
    The wave function can then be represented simply by the set of coefficients ##\{ c_i \}##.

    Likewise, one can calculate for any operator ##\hat{A}##
    $$
    A_{ij} = \int \phi_i^* \hat{A} \phi_j d \tau
    $$
    The values ##A_{ij}## can be taken as the elements of a matrix, and likewise ##\psi## can be expressed as a column vector of the coefficients ##\{ c_i \}##. It is then easy to show that the action of the operator on the state, ##\hat{A} \psi##, is equivalently described by the matrix-vector product ##\mathrm{A} \bar{\psi}##.
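    This equivalence is easy to verify numerically in a finite-dimensional sketch (Python/numpy; the Hermitian matrices below are randomly generated stand-ins, not any particular physical system). The basis ##\{\phi_i\}## is taken as the eigenvectors of a toy Hamiltonian, and the integral for ##A_{ij}## becomes a dot product:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A toy Hermitian "Hamiltonian"; its eigenvectors supply the discrete basis {phi_i}
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (M + M.conj().T) / 2
E, phi = np.linalg.eigh(H)          # columns of phi are the basis vectors

# Some other Hermitian operator A (standing in for an arbitrary observable)
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (B + B.conj().T) / 2

# A state expanded in the basis: psi = sum_i c_i phi_i
c = rng.normal(size=n) + 1j * rng.normal(size=n)
psi = phi @ c

# Matrix elements A_ij = <phi_i|A|phi_j>; the integral becomes a dot product
A_mat = phi.conj().T @ A @ phi

# Operator acting on the state vs. matrix acting on the coefficient vector
lhs = phi.conj().T @ (A @ psi)      # coefficients of (A psi) in the basis
rhs = A_mat @ c
assert np.allclose(lhs, rhs)
```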
     
  4. Mar 7, 2017 #3
    When you perform a quantum experiment, do you know the eigenstates of the Hamiltonian BEFORE you do the experiment, or do the results of the experiment determine the eigenstates? I thought the eigenstates were the unknowns that are to be determined.
     
  5. Mar 7, 2017 #4

    DrClaude

    Staff: Mentor

    The eigenstates are given by the Hamiltonian, hence by the characteristics of the physical system (and the relevant environment). The state of the system can be known or unknown, depending on how it is prepared.

    Take for instance the Stern-Gerlach experiment. The relevant eigenstates are the spin-up and spin-down states for the atom. For a given atom, its state can be known (for example, because the atoms are coming out from another Stern-Gerlach apparatus before entering the SG apparatus that we are analyzing) or unknown (thermal source of atoms). In the latter case, the density operator formalism is required.
     
  6. Mar 7, 2017 #5
    How do you get the eigenstates of the Hamiltonian?

    So what purpose does the Schrodinger Equation serve if you already know the eigenstates and, I suppose, the wave function?
     
  7. Mar 7, 2017 #6

    Nugatory

    Staff: Mentor

    1) After we prepare a system in a given state (which may or may not be an eigenstate of the Hamiltonian or any other observable) it doesn't (in general) stay in that state; Schrodinger's equation allows us to calculate what state it will be in at any given time in the future. Look at the time-dependent form of the equation to see how this works.
    2) We may be able to measure the energy, but until we've solved Schrodinger's equation for the system we won't know the wavefunction that corresponds to that energy. An example would be electron orbitals; we knew the energy levels of the hydrogen atom many years before we knew solutions to Schrodinger's equation that had those energy levels as eigenvalues.
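    Point 1 can be illustrated with a toy two-level system (Python/numpy, units with ##\hbar = 1##; the energies are made up). In the energy eigenbasis, the time-dependent Schrodinger equation just makes each coefficient rotate by a phase ##e^{-iE_i t}##, so the populations of the eigenstates stay fixed while the superposition itself evolves:

```python
import numpy as np

# Two-level Hamiltonian in its eigenbasis (hbar = 1, made-up energies)
E = np.array([1.0, 3.0])

# Initial state: equal superposition of the two energy eigenstates
c0 = np.array([1.0, 1.0]) / np.sqrt(2)

def c(t):
    # Solution of the time-dependent SE in the eigenbasis:
    # each coefficient just picks up a phase exp(-i E_i t)
    return c0 * np.exp(-1j * E * t)

# The populations |c_i|^2 of the eigenstates never change...
assert np.allclose(np.abs(c(2.7))**2, np.abs(c0)**2)

# ...but the state does: its overlap with the initial state oscillates,
# vanishing after half a beat period pi / (E_1 - E_0)
overlap = abs(np.vdot(c0, c(np.pi / (E[1] - E[0]))))**2
print(overlap)  # ≈ 0
```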
     
  8. Mar 7, 2017 #7
    I do not know why it took me so long to understand this. I think what you are saying, and what the eigenfunction nature of the problem is saying, is that the system can only exist in one of the predefined states exposed by the Hamiltonian (or in a linear combination of those predefined states). While the mathematics may belie what I am about to say, I think the eigenfunction dependency of the quantum world actually makes things easier to analyze, not harder (assuming you are well versed in the mathematics, which I am currently not!). I say this because once you find the wave function for any allowed state, it should be easy to find the wave functions of all the allowed states, because they are all just scalar multiples of the same base functions (eigenfunctions)? Or do I have this messed up as well? (I have a tendency to jump to conclusions which, in retrospect, are somewhat true, but not exactly true.)
     
  9. Mar 7, 2017 #8

    BvU

    Science Advisor
    Homework Helper
    Gold Member

    Usually (generally/always?) the solutions of the SE (eigenfunctions of the Hamiltonian) form a basis, and any state can be expressed as a linear combination of the eigenfunctions. So that alone isn't a restriction.
     
  10. Mar 7, 2017 #9
    So how do you get the eigenfunctions of the Hamiltonian operator? At some point the results of the experiment must come into play.
     
  11. Mar 7, 2017 #10

    BvU

    Science Advisor
    Homework Helper
    Gold Member

    Hehe, you solve ##H\Psi = E\Psi## :smile:

    [edit] I realize I'm repeating DrClaude... sorry.
     
  12. Mar 7, 2017 #11

    Nugatory

    Staff: Mentor

    Ah, no... That is most emphatically not how it works. The energy eigenstates (and any other eigenbasis from any other operator) provide a basis I can use to describe the state, rather as I can describe which way the wind is blowing after I've chosen a "north/south" and "east/west" basis. However, choosing that basis doesn't mean that the wind can only blow in those two directions: ##5\hat{N}-5\hat{E}## is a moderate breeze blowing towards the northwest.
    Nope. The allowed states can be written as sums of scalar multiples of the different eigenfunctions, just as the allowed wind speed vectors can be written as sums of scalar multiples of the different base vectors ##\hat{N}## and ##\hat{E}##.
     
  13. Mar 7, 2017 #12
    Thank you as always. I do understand that an allowed state can be a linear combination of the basis states.

    Where do the actual results of an experiment get incorporated into the solution process? Are they incorporated, in some way, into the Hamiltonian operator?
     
  14. Mar 7, 2017 #13
    I got it now.

    It is an eigenvalue problem, Hx = λx, or (H − λI)x = 0.

    All we have to do is find the eigenvalues of the above equation and their associated eigenvectors. The eigenvectors are the basis for the wave functions. (I do realize that finding the eigenvalues of large matrices is never an easy process, akin to finding the roots of polynomials.)
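    In a finite-dimensional case this is a routine numerical task. A sketch in Python/numpy (the 3×3 Hermitian matrix below is made up, not any particular physical Hamiltonian):

```python
import numpy as np

# A made-up 3x3 Hermitian "Hamiltonian" matrix
H = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# Solve H x = lambda x for a Hermitian matrix
eigvals, eigvecs = np.linalg.eigh(H)

# Each column of eigvecs is an eigenvector; check H x = lambda x
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(H @ x, lam * x)

print(eigvals)  # [2 - sqrt(2), 2, 2 + sqrt(2)]
```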

    The key to the entire process must be in setting up the Hamiltonian operator.

    Here is a link to a paper describing the unbelievably complex process of setting up the Hamiltonian for an electron in an electromagnetic field.

    http://folk.uio.no/helgaker/talks/Hamiltonian.pdf
     
    Last edited: Mar 7, 2017
  15. Mar 7, 2017 #14

    DrClaude

    Staff: Mentor

    The results of QM can be compared to experiments, checking that all necessary elements are included in the Hamiltonian.


    Yes. This is no different than classical analytical mechanics, where one sets up the Hamiltonian (or Lagrangian) for the desired physical system.

    That's at a higher level than the overall discussion in this thread.
     
  16. Mar 8, 2017 #15
    I still do not have this right.

    As far as I know, we cannot find the eigenvalues of the matrix H, because the Hamiltonian involves the gradient operator, ∇, which must operate on the unknown vector x. There is no way to complete the matrix H. To find the eigenvalues of H, all elements of H must be resolved into complex numbers; there can be no unresolved terms, such as ∇.
     
  17. Mar 8, 2017 #16

    PeterDonis


    Staff: Mentor

    You're mixing up the Heisenberg and Schrodinger representations. H is a matrix in the Heisenberg representation, but in that representation, there is no "gradient operator" ##\nabla##, and the matrix H does not contain it. Matrices operate on vectors, not functions, and ##\nabla## operates on functions.

    In the Schrodinger representation, H is not a matrix, it's a differential operator, and contains the operator ##\nabla##; but H operates on functions, not vectors, and so the eigenvalue problem is the problem of finding functions ##\psi(x)## that obey the eigenvalue equation ##H \psi(x) = \lambda \psi(x)##. This is just solving the appropriate differential equation.
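    The two representations can be bridged numerically: discretizing the differential operator on a grid turns it into a (large, finite) matrix. A sketch for a particle in a box on [0, 1], in units where ##\hbar^2/2m = 1## (Python/numpy; the grid size is an arbitrary choice):

```python
import numpy as np

# Particle in a box [0, 1] with hbar^2/(2m) = 1, so H = -d^2/dx^2.
# On a grid, the second derivative becomes a tridiagonal matrix,
# turning the differential eigenvalue problem into a matrix one.
n = 500
dx = 1.0 / (n + 1)

# Finite-difference Laplacian with psi = 0 at the box walls
H = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / dx**2

E, psi = np.linalg.eigh(H)

# Exact levels are (k pi)^2, k = 1, 2, 3, ...
exact = (np.arange(1, 4) * np.pi)**2
print(E[:3] / exact)  # all ratios ≈ 1
```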
     
  18. Mar 8, 2017 #17
    Thanks.
    Can you please tell me what is the equivalent matrix equation in the Heisenberg representation?

    (I do know that ∇ operates on functions, but I was under the impression that the elements of the vector X could be functions)
     
  19. Mar 8, 2017 #18

    PeterDonis


    Staff: Mentor

    It's ##H \vert \psi \rangle = \lambda \vert \psi \rangle##, where ##H## is the Hamiltonian matrix and ##\vert \psi \rangle## is a state vector.

    No, they're complex numbers. But there might be an infinite set of them for each vector, depending on the Hilbert space in question.
     
  20. Mar 8, 2017 #19
    Thank you for that. I found a Feynman lecture on the Hamiltonian Matrix. I would like to post a link to it for future reference.

    http://www.feynmanlectures.caltech.edu/III_08.html
     
  21. Mar 9, 2017 #20
    It turns out that the gradient operator is defined for kets, |Ψ>.

    https://lh3.googleusercontent.com/3eK6B3aShOfxJlGrPGyrX4WsRcI_G9NYinPhTq17ZyjS-maaoArohnlE6QrPCvxvK8qUYeEgEufbTO6KY9Lp0tMUoNDwwe7vhW-fTys1OieTRUVWHOazFVb3LGAxHPNobFWmlXHQpBlrslCl-XnwVtijrAOu6H8setGDXAhXNK_eeFyUaDIPYwN3J0uw1Qy-ARW43rFbQEDWIfqw9KEeYm7WgakxUGbN0hGJcWl2AUU2yeZDoVK0L9O03vE1b9s4Qfy-ds93mxtLK9vt_o9_7KE1IKwFBNP-8xeoyj6OCc7tRCiy3pyGHka8hWR2AnCqFn81eTQdfvC2fSsohdiHTEsTS7Myk-GFCfE9xrtTeT8uEnpQiWa8TVWyqOhAmIWHzQcevHctR2IChJPTyjXZ9osgffT7MLS83RIymsGy4afo41fDL7k3DvexzikRzGdgBaeMa6zUofmYl3pWK10dlapPSINCWa7h-J5XCnhj3bmUcIAjNISFzyv2lku8sYHWEs03lBOae0cAqmMFv2h_jwo4hxWtdwnDmg2xhGYF48iWScZcRNneXelG7KUOwWYYpqZ_bgviwItf3MXm9mwgBbxODuNoCZxnoWiknP84J4SOEoGIdkqf=w612-h767-no
     