I definitely get your point, but I still would like to add this subtle note:
Inference makes use of inference rules, priors, and new information (input) to produce an output (a posterior), quite like a "calculation".
So even merging RAW data from a detector requires the equivalent of a "calculation", since it relies on priors and inference rules.
Demystifier said:
The point is that in this experiment p_y is NOT MEASURED but CALCULATED. Measurement and calculation are not the same. A calculation always contains an additional theoretical assumption which a true measurement does not need to use.
This leads me to question the notion of "true measurements". Are there any true measurements that do not rely on prior "assumptions" (read: prior state + current state of inference rules)? I don't think so, because even the simple act of ENCODING, PROCESSING and STORING the raw sequential information about detector hits requires processing that is on par with "calculations".
I'm arguing for a much more extreme version of what is hinted at by others... for example, in Smolin's principle of relative locality paper (http://arxiv.org/abs/1101.0931) he expresses the idea that there is no direct observation of spacetime events; they say "The idea that we live in a spacetime is constructed by inference from our measurements of momenta and energy".
I think what they hint at should be taken to more extremes. The observation then is that some "calculations" (inference rules) seem to be favoured by nature, while others aren't. The idea would then be that there is a difference between random ad hoc calculations containing wild and irrational assumptions, and the kind of "rational inference" you would expect, for example, from a "rational player" who recalculates his odds according to probability theory. Here his "calculation" is uniquely determined by the same conditions that single out probability theory as a unique system for manipulating degrees of belief in a rational way - in this generalized sense he then "measures" the odds!
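To make concrete what I mean by the rational player's uniquely determined "calculation" (this is just my own toy illustration, not something from the papers linked below): given priors over two hypotheses and the likelihoods of new data, Bayes' rule is the unique consistent way to recalculate the odds, in the spirit of Cox's consistency requirements.

```python
# Toy sketch (my own illustration): a "rational player" updating his odds.
# Two hypotheses H1, H2 with prior degrees of belief; new data D arrives
# with known likelihoods P(D|H). Bayes' rule is the unique consistent
# update rule in the sense of Cox's theorem.

def bayes_update(priors, likelihoods):
    """Return posterior probabilities from priors and likelihoods P(D|H)."""
    unnormalized = [p * l for p, l in zip(priors, likelihoods)]
    evidence = sum(unnormalized)          # P(D), the normalization constant
    return [u / evidence for u in unnormalized]

priors = [0.5, 0.5]        # initial degrees of belief in H1, H2
likelihoods = [0.8, 0.2]   # P(D|H1), P(D|H2) for the observed data D
posteriors = bayes_update(priors, likelihoods)
print(posteriors)          # the recalculated odds, approximately [0.8, 0.2]
```

The point of the sketch is only that nothing in the update step is arbitrary: once the priors and likelihoods are fixed, the "calculation" is forced.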
http://arxiv.org/abs/0908.3212
http://bayes.wustl.edu/etj/reviews/review.cox.pdf
Ariel's idea is that the laws of physics - corresponding to the "deductive machinery" we use for "calculations", and that nature ITSELF also uses for interactions - follow from unique rules of rational inference.
Thus my take on this example was: if we take the measurement of y and p_y to be a form of general inference (rather than a specific QM "measurement" on a well-defined regular STATE), what is the correct inference (calculation) to use, and what generalization of STATE do we need when we are mixing information from a complex apparatus rather than a pointlike detector?
My suggestion was that the state built by combining information and inference in the way of Ballentine's example should include the L tan(theta) as part of the uncertainty of y. If we do, the HUP still holds, provided we also add that the position of the particle hit is never resolved more accurately than roughly its de Broglie wavelength, no matter how pointlike the actual sensor is.
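Heuristically, the estimate I have in mind goes like this (my own back-of-envelope sketch, with theta the deflection angle, L the slit-screen distance, p the total momentum, delta the hit resolution, and lambda = h/p the de Broglie wavelength). Since y = L tan(theta), a hit uncertainty delta translates into an angular uncertainty delta*cos^2(theta)/L, so

```latex
\Delta y \sim L\tan\theta, \qquad
\Delta p_y \sim p\cos\theta\,\Delta\theta \sim \frac{p\,\delta\cos^{3}\theta}{L},
\qquad \delta \gtrsim \lambda = \frac{h}{p},
```

and the product becomes

```latex
\Delta y\,\Delta p_y \;\gtrsim\; p\,\lambda\,\sin\theta\cos^{2}\theta
\;=\; h\,\sin\theta\cos^{2}\theta \;\sim\; \hbar
```

for moderate angles, independently of L. This is only an order-of-magnitude sketch under the two assumptions stated above, not a formal derivation.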
So if we require that the "calculation" follows some kind of rational inference, then I think it would also qualify as a generalized measurement (not measurement in the sense of observing a detector hit, but measurement as "inferring an information state").
Then I don't think one needs to assume that the particle follows a definite path between slit and screen, because the information just tells us that we have initial and final points; all we can infer is an expectation of some "average path", and from that also the average position of y during the process.
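One possible way to formalize this "average path" (my speculation only, written in standard path-integral notation): conditioning on the initial point (y_i, t_i) at the slit and the final point (y_f, t_f) at the screen, the expected intermediate position would be a path-integral average over all paths joining the two endpoints,

```latex
\langle y(t)\rangle
= \frac{\int \mathcal{D}[y]\; y(t)\, e^{\,iS[y]/\hbar}}
       {\int \mathcal{D}[y]\; e^{\,iS[y]/\hbar}},
```

with both integrals taken over paths from (y_i, t_i) to (y_f, t_f). I'm not claiming this is the right generalized formalism, only that it shows what "infer an average path from endpoint information" could mean without assuming any single definite path.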
These aren't formal arguments, and I think that making them fully formal would require working out this generalized formalism. But intuitively I think the concept of generalized measurements makes sense, and if QM can't handle it, then I think we need to improve it.
/Fredrik