dsaun777 said:
I still don't see what is so bewildering.
You are right, there's nothing "bewildering" about quantum mechanics as soon as you accept the world view that Nature is "irreducibly random". In contradistinction to classical physics, quantum theory is an indeterministic description of the phenomena we observe in Nature.
This means you can have complete knowledge about a system, i.e., you can (in principle) prepare it in a pure state. E.g., you can determine the values of a complete set of compatible observables precisely. Then your system must be in a uniquely defined pure state, described by a normalized common eigenvector of the corresponding self-adjoint operators representing these observables. This eigenvector ##|\psi \rangle## is determined up to a phase factor, but the state itself is unique, described by the statistical operator of the corresponding pure state, which is the projection operator ##\hat{\rho}=|\psi \rangle \langle \psi|##.
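As a short consistency check of this statement (nothing beyond the standard formalism; I write it for a discrete, non-degenerate observable ##\hat{A}## with eigenvectors ##|a \rangle## just for simplicity): the statistical operator of a pure state is a projector of unit trace, and the Born rule gives the probabilities for any other observable,
$$\hat{\rho}^2=|\psi \rangle \langle \psi|\psi \rangle \langle \psi|=\hat{\rho}, \qquad \mathrm{Tr} \, \hat{\rho}=\langle \psi|\psi \rangle=1, \qquad P(a)=\langle a|\hat{\rho}|a \rangle=|\langle a|\psi \rangle|^2.$$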
Such a pure-state preparation is the best knowledge you can get about your system, i.e., the state is completely known in this case in the sense of quantum theory. Now it's clear that this does not imply that you know precisely the values of observables which are not compatible with all the observables given determined values by this complete preparation of the system. All you can know about the outcomes of measurements of such observables are (usually) the probabilities to get one of the possible values this observable can take (an eigenvalue of the corresponding self-adjoint operator describing it in the quantum-theoretical formalism). If you then measure such an observable and filter out all systems which have a certain value, the previously exact preparation of the compatible set of observables in general gets lost, i.e., preparing the system to have a certain value of the incompatible observable destroys the preparation of the previously determined observables.
This is reflected by the uncertainty relations. The most famous is the one between a position-vector component and the momentum component in the same direction, which says that the standard deviations of these quantities obey the uncertainty relation ##\Delta x_1 \Delta p_1 \geq \hbar/2##. First of all, you can never exactly determine position or momentum, because the corresponding operators have continuous eigenvalue spectra. Nevertheless you can, say, localize the particle very precisely, i.e., make ##\Delta x_1## very small (given by the resolution of your particle detector, which is not limited in principle, i.e., you can make it as precise as you like, modulo the technical problems you have to solve and how much effort you put in), but then necessarily ##\Delta p_1 \geq \hbar/(2 \Delta x_1)## is very large, i.e., the momentum component ##p_1## is necessarily very uncertain. If you now use a "velocity filter" (like crossed electric and magnetic fields) and keep only particles with a very well determined velocity (and thus very well determined momentum), you necessarily lose the localization of the particle, because now, due to the uncertainty relation, the particle's position has a very large standard deviation.
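To make the origin of this bound explicit (this is just the standard Robertson argument, sketched here for completeness): for any two observables the standard deviations in a given state obey
$$\Delta A \, \Delta B \geq \frac{1}{2} |\langle [\hat{A},\hat{B}] \rangle|, \qquad [\hat{x}_1,\hat{p}_1]=\mathrm{i} \hbar \; \Rightarrow \; \Delta x_1 \Delta p_1 \geq \frac{\hbar}{2},$$
and the bound is saturated by Gaussian wave packets, which are therefore called minimum-uncertainty states.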
All this is not so mind-boggling, except for the fact that we cannot, according to QT, prepare particles in such a way that all observables simultaneously take precise values, and that in general we only know probabilities for the outcomes of measurements of observables, even if we know the state completely in the sense that we have prepared the system in a pure state.
Now entangled states, however, have a completely surprising property, which Einstein called inseparability. An extreme example, which nowadays can easily be prepared, is a polarization-entangled photon pair. You have two photons with quite well defined, different momenta ##\vec{k}_1## and ##\vec{k}_2## and polarizations ##h_1## and ##h_2## (##h_1,h_2## standing for helicities, for convenience), prepared in the state
$$|\Psi \rangle =N [\hat{a}^{\dagger} (\vec{k}_1,1) \hat{a}^{\dagger}(\vec{k}_2,-1)-\hat{a}^{\dagger} (\vec{k}_1,-1) \hat{a}^{\dagger}(\vec{k}_2,1)]|\Omega \rangle,$$
where ##\hat{a}^{\dagger}(\vec{k},h)## is the creation operator for a photon with momentum ##\vec{k}## and helicity ##h## (with a certain uncertainty in momentum, such that we have a normalized state rather than generalized exact momentum eigenstates, i.e., we can choose ##N## such that ##\langle \Psi|\Psi \rangle=1##).
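Just to make the normalization explicit, under the assumption that the single-photon modes involved are well separated (so that the two terms are orthogonal two-photon states of unit norm):
$$\langle \Psi|\Psi \rangle=|N|^2 (1+1)=1 \; \Rightarrow \; N=\frac{1}{\sqrt{2}}.$$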
Now the photons fly apart after being prepared in this state (e.g., by a process known as "parametric down-conversion", where a photon from a laser interacts with a certain kind of birefringent crystal such that it gets split into a photon pair in a state of the type given above). Now wait a while and put detectors at far-distant places ##A## and ##B## (standing for the corresponding observers Alice and Bob), where Alice and Bob measure the polarization state of their photon. It is clear that the polarization of either of these single photons is not determined, because it can be either ##h=1## or ##h=-1##. QT tells us (by calculating what's known as the reduced state) that indeed the polarization of each of A's and B's photons prepared in the above state is completely unknown, i.e., all Alice and Bob find when collecting a lot of such prepared photons is that the photons are precisely unpolarized, i.e., the polarization state of each of the photons is the maximally uncertain unpolarized state described by the statistical operators
$$\hat{\rho}_A=\hat{\rho}_B=\frac{1}{2} \hat{1}.$$
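The calculation behind this statement is short. Written in the two-photon helicity basis (with ##N=1/\sqrt{2}##), the state above is the singlet-like Bell state, and tracing out Bob's photon gives
$$|\Psi \rangle=\frac{1}{\sqrt{2}} \left(|+1 \rangle_A \otimes |-1 \rangle_B - |-1 \rangle_A \otimes |+1 \rangle_B \right), \qquad \hat{\rho}_A=\mathrm{Tr}_B |\Psi \rangle \langle \Psi|=\frac{1}{2} \left(|+1 \rangle \langle +1| + |-1 \rangle \langle -1| \right)=\frac{1}{2} \hat{1},$$
and the same for ##\hat{\rho}_B##.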
But now Alice and Bob can do coincidence measurements, i.e., they can use clocks to make sure that they always measure the polarizations of photons belonging to pairs prepared in this specific entangled state. Then you can immediately see that if Alice measures the polarization ##h_A=+1## (which happens with 50% probability), then Bob necessarily finds ##h_B=-1##, and vice versa.
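This 100% anticorrelation can be read off directly from the state: since it contains no ##|+1 \rangle_A \otimes |+1 \rangle_B## or ##|-1 \rangle_A \otimes |-1 \rangle_B## component, the joint probabilities in the helicity basis are
$$P(+1,-1)=P(-1,+1)=\frac{1}{2}, \qquad P(+1,+1)=P(-1,-1)=0.$$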
This is mind-boggling for some physicists who believe in certain interpretations of quantum theory. I'm a proponent of the ensemble interpretation (the interpretation Einstein preferred, saying that quantum theory only makes predictions concerning the probabilities for the outcomes of measurements, which can be checked only by preparing an ensemble of systems to be measured in a certain way described by the quantum state; it was formulated most clearly by Ballentine in the early 1970s). Thus I have no quibbles with this feature of quantum theory (today established with very high precision by experiment), because, accepting this purely probabilistic interpretation of the quantum state, the complete indetermination of the single-photon polarizations, together with the 100% correlation of the outcomes of measurements by Alice and Bob when measuring the same polarization observable (in my example, the helicities of the photons), is due to the state preparation at the very beginning.
If you however follow a certain kind of Copenhagen interpretation, including a so-called "collapse of the state", you get into trouble with relativistic causality, because within this interpretation the polarization measurement on A's photon instantaneously causes (sic!) B's photon to collapse into a new state such that its polarization gets determined with the opposite helicity to that measured by Alice for her photon, although the two photons can be arbitrarily far apart when they are registered by Alice and Bob. So an instantaneous collapse massively violates Einstein causality, because the information from A's measurement would have to arrive at B's photon faster than light, and this is impossible due to relativity (and photons are to be described relativistically, because if there's anything relativistic, it's photons!).
However, if you give up this collapse assumption and just accept the probabilistic interpretation, the two photons already carried the correlation between the outcomes of the measurements by A and B with them, due to the preparation in this entangled state, even though the individual outcomes of the single-photon polarization measurements are completely random.
While Einstein believed that QT is incomplete and that there must be "hidden variables" which predetermine the outcome of each single-photon polarization measurement, thanks to the work by Bell we can test this assumption, at least if you assume locality, i.e., that the measurements on the photons are due to local interactions of the photons with the measurement devices (like the polarizers and photon detectors at Alice's and Bob's places). Bell could show that a deterministic (often called "realistic") local hidden-variable theory implies, for the joint outcomes of certain polarization measurements on any photon pair, an inequality for certain correlation measures built from the probability distribution over the determined but unknown hidden variables. These correlation measures contradict those calculated from standard quantum theory when the photon pairs are prepared in a completely entangled state as discussed above (such states are thus called "Bell states"), i.e., quantum mechanics measurably contradicts the assumption that there is a local deterministic hidden-variable theory. Such "Bell tests", including some avoiding all the suggested possible "loopholes", have confirmed the predictions of quantum mechanics, i.e., the violation of Bell's inequality, with amazing statistical significance.
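As a small numerical sketch of such a Bell test (I use the CHSH form of the inequality rather than Bell's original one, and write the Bell state in the linear-polarization basis, which is the same state as above up to an overall phase; the angles are just the standard optimal settings), a few lines of Python/NumPy reproduce both the unpolarized reduced state and the maximal violation ##|S|=2\sqrt{2}>2##, whereas any local deterministic hidden-variable theory must obey ##|S| \leq 2##:
```python
import numpy as np

# Polarization observable for a linear polarizer at angle theta (H/V basis):
# +1 for transmission along theta, -1 for the orthogonal direction.
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])

def pol_obs(theta):
    return np.cos(2 * theta) * sz + np.sin(2 * theta) * sx

# Polarization singlet |Psi-> = (|HV> - |VH>)/sqrt(2), basis order |HH>,|HV>,|VH>,|VV>.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def E(alpha, beta):
    """Correlation <A(alpha) (x) B(beta)> in the singlet state."""
    return psi @ np.kron(pol_obs(alpha), pol_obs(beta)) @ psi

# Reduced state on Alice's side: trace out Bob -> the unpolarized state (1/2)*identity.
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)   # indices (i_A, i_B, j_A, j_B)
rho_A = np.einsum('ikjk->ij', rho)
print("rho_A =\n", rho_A)

# CHSH combination with the standard optimal angles (radians).
a, ap, b, bp = 0.0, np.pi / 4, np.pi / 8, 3 * np.pi / 8
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print("|S| =", abs(S))   # 2*sqrt(2) ~ 2.83 > 2, the local-hidden-variable bound
```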
For me it is clear that, no matter which interpretation beyond the minimal statistical (or ensemble) interpretation you might follow, at least local deterministic hidden-variable theories are no way to make quantum theory "more complete". In fact, so far no satisfactory local deterministic hidden-variable theory is known. Also, I don't think that Bohmian mechanics as a non-local deterministic theory is very convincing as far as relativistic quantum field theory is concerned (though it's consistently formulated for non-relativistic quantum mechanics). That's why I think that quantum theory is far more complete than Einstein thought.
The real problem with quantum mechanics, in my opinion, is not all these so-called "interpretational problems" which upset Einstein so much, but that we have no satisfactory quantum description of the gravitational interaction, because the standard quantization prescriptions that work so well for the other fundamental interactions in Nature (the Standard Model of elementary particles, despite all its own problems, is amazingly robust against all efforts to disprove it; just these days a new accurate determination of the fine-structure constant ##\alpha_{\text{em}}## confirmed the Standard Model once more) don't work for the gravitational field (aka General Relativity).