Probabilities in Quantum Mechanics

StevieTNZ
Hi there,

For particles that go through slit 1 with probability 1/2 and slit 2 with probability 1/2, the expected average result for a large number N of particles is exactly that: half through slit 1, half through slit 2.

We send the N particles through and find that roughly half go through slit 1 and half through slit 2. But suppose we then send another N particles through.

When these sorts of predictions get verified, are a further N particles sent through, with the statistics staying roughly the same as for the first lot of N particles? Is that how the average is verified?

Because couldn't you get different observed frequencies if you send a further N particles through and they deviate from the 1/2 slit 1, 1/2 slit 2 split?
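For reference, the size of that deviation can be quantified with standard binomial statistics (a sketch, assuming the N particles are independent and each goes through slit 1 with probability 1/2; n_1 denotes the count through slit 1):

```latex
% Count through slit 1 out of N independent particles, p = 1/2:
% n_1 ~ Binomial(N, 1/2)
\langle n_1 \rangle = Np = \frac{N}{2}, \qquad
\sigma_{n_1} = \sqrt{Np(1-p)} = \frac{\sqrt{N}}{2}, \qquad
\frac{\sigma_{n_1}}{N} = \frac{1}{2\sqrt{N}}
```

So the absolute spread in the count grows like sqrt(N), but the observed fraction typically differs from 1/2 only by about 1/(2*sqrt(N)); for N = 10^6 particles that is roughly 0.0005.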
 
Is it that after a finite number N of trials the predicted probabilities are more or less met, and that by the law of large numbers any further trials will still conform to the predicted statistics?
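A quick simulation illustrates this (a minimal sketch in Python, not tied to any particular experiment; the numbers N and runs are arbitrary choices): repeating the N-particle run many times, the per-run fraction through slit 1 clusters around 1/2 with scatter of order 1/(2*sqrt(N)).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 10_000    # particles per run
runs = 1_000  # number of repeated N-particle runs

# Each particle goes through slit 1 with probability 1/2 (a Bernoulli trial);
# count the slit-1 particles in each run and convert to an observed fraction.
counts = rng.binomial(N, 0.5, size=runs)
fractions = counts / N

print("mean fraction through slit 1:", fractions.mean())
print("std of per-run fraction     :", fractions.std())
print("expected std 1/(2*sqrt(N))  :", 1 / (2 * np.sqrt(N)))
```

Individual runs do deviate from exactly 1/2, but the deviations shrink like 1/sqrt(N) as N grows; that shrinking scatter is what the law of large numbers guarantees here.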
 
