An ideal voltmeter draws zero current. Its infinite input resistance would prevent any current from flowing through it, so it would measure the exact voltage (potential difference) between its two leads. Because a real-world voltmeter cannot have infinite resistance, some current "leaks" through it, and the measured voltage will drop unless some other current source (e.g., a battery) is attached to the circuit.
So, if you charge a capacitor, disconnect it from the current source, and then measure its voltage with a real voltmeter, the capacitor will "slowly" lose some of its charge - the capacitor in effect becomes the only current source in the circuit. Were the voltmeter ideal, it would prevent any current flow and simply measure the potential difference between the two leads of the capacitor.
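To put a rough number on that "slow" discharge: the capacitor and the voltmeter's input resistance form an RC circuit, so the voltage decays as V(t) = V0 * exp(-t / (R*C)). Here is a minimal sketch, assuming hypothetical but typical values (a 10 V charge, a 1 µF capacitor, and a 10 MΩ meter input resistance); the function name and values are my own, not from any particular instrument.

```python
import math

def capacitor_voltage(v0, r_meter, c, t):
    """Voltage left on a capacitor of capacitance c (farads) after
    discharging for t seconds through a voltmeter whose input
    resistance is r_meter (ohms): V(t) = V0 * exp(-t / (R*C))."""
    tau = r_meter * c  # the RC time constant, in seconds
    return v0 * math.exp(-t / tau)

# Hypothetical example: 10 V on a 1 uF capacitor, read by a meter
# with a 10 Mohm input resistance -> time constant of 10 seconds.
v_after_10s = capacitor_voltage(10.0, 10e6, 1e-6, 10.0)
print(round(v_after_10s, 2))  # about 3.68 V - most of the charge is gone
```

Notice that after one time constant (here 10 seconds) the reading has already fallen to about 37% of the true value, which is why leaving a real meter connected to an isolated capacitor visibly drains it.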
Hope that helps.
[edit]
Perhaps an analogy will help. Consider measuring the gravitational potential energy of a rock on the edge of a cliff of a certain height (the analogue of electric potential energy, or voltage). The "ideal" potential energy measuring device just measures the height of the cliff, without moving the rock. A "real world" device would allow the rock to drop ever so slightly to make its measurement, and if it were left in place, the rock would continue to fall until its potential energy were 0, i.e., it hit the ground. Obviously, a real-world device for measuring gravitational potential energy doesn't have the same limitations as a voltmeter - a measuring stick tells you exactly the height of the rock without moving it. In the electrical world, though, the device has to touch the rock, providing a ramp for it to roll down. The current that "rolls down" through the voltmeter is no more special than the motion of the rock in the hypothetical gravitational measuring device. In fact, it's unwanted, and it ultimately results in all of the potential energy being expended - as kinetic energy (down the ramp) for the rock, and as current (through the voltmeter) for the capacitor.