unchained1978
I'm curious whether there is a connection to be drawn between the phenomenon of wave function collapse and the idea of Bayesian inference. I began thinking about this in the context of a variant of the Monty Hall problem: if you have two kids, what's the probability that at least one is a girl, given that at least one is a boy? Before you learn anything, the probability that any given kid is a girl is just 1/2. We could treat that probability as a sort of probabilistic wavefunction (loosely speaking).

Now when we learn that at least one child is a boy, the probability that the other is a girl changes to 2/3: of the four equally likely birth orders (BB, BG, GB, GG), only GG is excluded by the new information, giving rise to a new "wavefunction" which effectively collapses the previous one. It seems strikingly similar to a particle in a box, in which observing the particle moving rightward collapses the wavefunction, and vice versa. Any thoughts?
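The Bayesian update described above can be sketched by brute-force enumeration. This is just an illustrative Python snippet (not from the original post); it also contrasts the "at least one is a boy" update, which yields 2/3, with the different update "the first is a boy", which yields 1/2:

```python
from itertools import product

# The four equally likely two-child families: ('B','B'), ('B','G'), ('G','B'), ('G','G')
families = list(product("BG", repeat=2))

# Condition on "at least one child is a boy" -- the Bayesian "collapse":
# GG is excluded, leaving BB, BG, GB equally likely.
at_least_one_boy = [f for f in families if "B" in f]
p_girl_given_boy = sum("G" in f for f in at_least_one_boy) / len(at_least_one_boy)
print(p_girl_given_boy)  # 2/3: two of the three remaining families contain a girl

# Contrast: condition on "the FIRST child is a boy" -- a different update,
# leaving only BB and BG.
first_is_boy = [f for f in families if f[0] == "B"]
p_second_girl = sum(f[1] == "G" for f in first_is_boy) / len(first_is_boy)
print(p_second_girl)  # 1/2: the second child's sex is independent of the first's
```

The contrast matters for the analogy: which "wavefunction" you collapse to depends on exactly what information the observation provides, not just that an observation occurred.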