Fredrik said:
I disagree. A measuring device (an idealized one) only interacts with the system during the actual measurement, and the measurement is performed on the last state the system was in before the interaction with the measuring device began. In this case, we're clearly performing the measurement on the state that was prepared by the slit, so it can't be considered part of the momentum measuring device. The momentum measuring device consists of the wall of detectors and any computer or whatever that calculates and displays the momentum that we're going to call "the result". The coordinates and size of the slit will of course be a part of that calculation, but those are just numbers typed manually into the computer. Those numbers are part of the measuring device, but the slit isn't physically a part of it.
This is where I think we either disagree or aren't trying to do the same thing. If all we were doing was measuring position at the plate, then I would agree with you.
But I thought the whole point here was that we are trying to generalize some kind of "measurement" as an inference, from the picture outlined in Ballentine. And in THIS case, since, as you acknowledge below, we really have an "average" throughout the construct, this has to be respected by the y measurement as well; otherwise we are IMO not inferring y and p_y from the same information, and then the comparison of the uncertainties makes no sense at all.
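Just to make concrete what I mean by "the same information" (this is only my notation, sketching the Ballentine-style inference as I read it): the slit prepares the particle with a transverse position spread dy given by the slit width, and p_y is later inferred from the detector hit at transverse coordinate y_D on a screen a distance L downstream,

$$p_y = p\sin\theta = p\,\frac{y_D}{\sqrt{L^2 + y_D^2}} \approx p\,\frac{y_D}{L}.$$

The comparison of uncertainties then amounts to forming dy·dp_y from these two numbers and asking how it relates to ħ/2. My objection is that dy taken at the slit and dp_y inferred at the screen do not belong to the same information set.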
Fredrik said:
I've been talking about how to define a momentum measurement on a state with a sharply defined position, but now that I think about it again, I'm not sure that even makes sense.
Mmm, OK. Then we were trying to accomplish different things. I don't think this makes sense either. I mean, sure, we could come up with some type of calculation of dy and dp, but the way you seek it, I think it would not correspond to the same information state (see below).
Fredrik said:
Huh? What's an information state? Are you even talking about quantum mechanics?
Yes, but in a generalized sense (as you were the one seeking to define new measurements).
I just mean that the wavefunction gives it a partly classical flavour. I think more in terms of an abstract state vector (which is of course supposedly encoding the same info as the wavefunction), but interpreted differently from Ballentine's statistical interpretation.
The interpretation is that, instead of thinking of the state vectors as encoding information about a statistical ensemble, realized as an infinity of identically prepared systems etc., I'm thinking of the observer's state of information/knowledge about the system.
Technically this is not a property of the system; it's a property of the STATE of the observing system. Only at equilibrium does the state of the observer's information about the system match, at least in some sense, the system itself. The point is that this interpretation allows one to make sense of the concept of an information state even when no ensemble can be realized, or when the information that "should go into the ensemble" must be truncated simply because the observing system is NOT an infinite environment serving as an information sink, but rather a finite-mass subsystem of the universe.
But it's not news that my interpretation of QM is not at all like Ballentine's statistical view.
Fredrik said:
What we need here is a definition of a "momentum measurement" on the state the particle is in immediately before it's detected, and the only argument I can think of against Ballentine's method being the only correct one is that classically, it would measure the average momentum of the journey from the slit to the detector. However, classically, there's no difference between "momentum" and "average momentum" when the particle is free, as it is here. I don't see a reason to think this is different in the quantum world, so I no longer have a reason to think we're measuring "the wrong thing", and that means I can no longer argue for a second contribution to the total error that comes from "measuring the wrong thing". (That was the contribution I said would grow with L).
This sounds like the objection I have too.
I phrased it differently, but the objection is similar. What we do infer is the momentum "spread out" over the time span from which the information used for the inference originates. This is why that span is also the "time stamp" for any y measurement we want to "associate" with the same information. I.e., this is why I argue for the extra uncertainty in y. It's not because the error at the screen is larger than the detector cell, but because we are forced to add this error if we insist on associating it with the inferred p_y average.
So instead of saying "we infer the wrong thing", I took whatever we measured as the starting point, given Ballentine's scheme, and then suggested that to make it a coherent inference we need to adjust the y-inference as well and add the error.
When you look at what information is used for an inference, this becomes clearer. The only "time stamps" we have are parameterizations of how the information set evolves. An intrinsic comparison must work on the same information set (corresponding to the generalization of "conjugate variables at the same time").
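One rough way to make that explicit (a classical free-flight caricature in my own notation, not a full quantum treatment): if the inferred p_y is really the average transverse momentum over the flight time T = L/v_x, then the y that belongs to the same information set is the time-averaged transverse position over that same interval,

$$\bar{y} = \frac{1}{T}\int_0^T y(t)\,dt \approx \frac{y_{\mathrm{slit}} + y_D}{2},$$

and the uncertainty we attach to \bar{y} then picks up a contribution from the transverse spread accumulated during the flight, which grows with L, rather than being set by the slit width alone.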
Atyy's points have been similar, although I didn't read all the quoted papers, except that the way you put it depends on your interpretation.
/Fredrik