For those of you familiar with numerical modelling of various phenomena, you will know about work like the various discretisation schemes, stability/gradient limiters for high-order schemes, and so on. The most broad-sweeping improvement to the field of numerical modelling would ultimately be computational power, which would allow either a more refined model/mesh calculated in the same time or the same one calculated in less time.

Going forward several years to decades (depending on whether Moore's Law holds), when the computational power available is such that the finest mesh spacings -- at a quantum scale -- can be modelled, how will this be done? Take a simple backward-biased first-order discretisation of the linear advection equation for a fluid in a channel. Setting up the mesh is pretty straightforward, and the process is entirely deterministic. If the mesh were so fine that quantum effects became involved, the numerical model would now be probabilistic. On a large scale, however, it has to be deterministic (talk about a computational analogue of Schrodinger's Cat).

Any ideas on how one would go about setting up the computation? Would the control volume approach be completely inadequate because the material being modelled is no longer continuous? Or would probabilities be sufficient to suitably render the CV properties as 'continuous'?

Some may point out that the current level of discretisation for the above problem is more than adequate, with no quantum-scale discretisation necessary. But the problem above is just a simple example. Materials modelling at a quantum scale or CFD of rarefied gases come to mind.
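For concreteness, here is a minimal sketch of the deterministic classical case I mean: a backward-biased (first-order upwind) discretisation of the linear advection equation du/dt + c*du/dx = 0 on a periodic channel. The grid size, wave speed, and time step are illustrative choices of mine, not anything canonical:

```python
# First-order upwind (backward-biased) scheme for linear advection,
# du/dt + c*du/dx = 0 with c > 0, on a periodic 1-D channel.
# All parameter values below are illustrative assumptions.

c = 1.0            # advection speed; c > 0 biases the stencil backward
nx = 100           # number of mesh points
dx = 1.0 / nx      # mesh spacing
dt = 0.5 * dx / c  # time step chosen so the CFL number c*dt/dx = 0.5 <= 1 (stable)

# Initial condition: a square pulse sitting in the middle of the channel.
u = [1.0 if 0.4 <= i * dx <= 0.6 else 0.0 for i in range(nx)]

for _ in range(50):  # march 50 time steps
    un = u[:]        # snapshot of the old time level
    for i in range(nx):
        # u_i^{n+1} = u_i^n - (c*dt/dx) * (u_i^n - u_{i-1}^n)
        # Python's index -1 wraps around, giving periodic boundaries for free.
        u[i] = un[i] - c * dt / dx * (un[i] - un[i - 1])
```

Run it twice and you get bit-for-bit identical output: the scheme is a fixed update rule on a fixed mesh, which is exactly the determinism that a quantum-scale mesh would break.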