# Insights Struggles With the Continuum - Part 1 - Comments

1. Sep 1, 2015

### john baez

Last edited: Sep 6, 2016
2. Sep 1, 2015

### Stephen Tashi

I'm curious whether the continuum model creates a fundamental problem in applying probability theory to physics.

Although applied mathematics often deals with probabilistic problems where events "actually" happen, there is no notion of events "actually" happening in the formal mathematical theory of probability. The closest notion to an "actual event" in that theory is the notion of "given" that is employed in the phrase defining conditional probability. However, the definition of conditional probability defines a probability; it doesn't define the "given-ness" of an event any more than the mathematical definition of limit gives a specific meaning to the word "approaches".

When we assume a sample has been taken from a normal distribution, we have assumed an event with probability zero has occurred. This brings up familiar discussions about whether events with probability zero are "impossible". The apparent paradox is often resolved by saying human beings cannot take infinitely precise sample values from a normal distribution. Thus our "actual" samples are effectively a sample from some discrete distribution. However, can Nature take infinitely precise samples from a distribution on a continuum?
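As a small numerical illustration of this point (a sketch using only the Python standard library; the sample value 0.7 is an arbitrary choice): the probability that a standard normal variable equals any exact point is zero, yet every interval around that point has positive probability, roughly proportional to its width.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of the normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

x = 0.7  # an arbitrary "sample value"

# Probability of landing exactly at x: CDF(x) - CDF(x) = 0.
p_exact = normal_cdf(x) - normal_cdf(x)

# Probability of landing in a shrinking interval around x stays positive,
# and shrinks in proportion to the interval's width (the density at x).
for eps in (0.1, 0.01, 0.001):
    p_interval = normal_cdf(x + eps) - normal_cdf(x - eps)
    print(eps, p_interval, p_interval / (2 * eps))
```

The last column converges to the density at the point, which is positive even though every exact outcome has probability zero.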

3. Sep 1, 2015

### john baez

Good questions!

Clearly events with probability zero cannot be considered "impossible" if we allow probability distributions on the real line where each individual point has measure zero, yet the outcome must be some point or other. The sense of "paradox" one might feel about this can also be raised by this question: "how can a region of nonzero volume be made of points, each of which has volume zero?"

Mathematically these two problems are the same. In measure theory we deal with them by saying that measures are countably additive, but not additive for uncountable disjoint unions: so, an uncountable union of points having zero measure can have nonzero measure.
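A one-line way to state the resolution (standard measure theory, stated here only to make the comment self-contained): for pairwise disjoint measurable sets $A_1, A_2, \dots$,

```latex
\mu\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} \mu(A_i),
\qquad\text{while}\qquad
1 = \mu([0,1]) \;\neq\; \sum_{x \in [0,1]} \mu(\{x\}) = 0 .
```

There is no contradiction, because $[0,1]$ is an *uncountable* union of points, and the additivity axiom only constrains countable unions.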

In short, there is no real paradox here, even though someone with finitist leanings may find this upsetting, and perhaps even rightfully so.

As for events "actually happening", and how this concept doesn't appear in probability theory, I'll just say that the corresponding problem in quantum theory has led to endless discussion, with some people taking the many-worlds view that "everything possible happens in some world", others claiming that's nonsense, and others saying that probabilities should be interpreted in a purely Bayesian way. I don't think this is about the continuum: it's about what we mean by probability.

4. Sep 1, 2015

### Telemachus

Wow, I didn't know this guy posted on this forum! Great insight, btw. So many things we don't know about the nature of spacetime rely on the deepest concepts in all of physics. I saw some of your videos about your favourite numbers; you are great, sir.

Regards.

5. Sep 2, 2015

### Urs Schreiber

The most subtle aspect of the question is the "coordination" step (http://ncatlab.org/nlab/show/coordination) that takes one from a mathematical formalization of some physics to the corresponding perception of conscious observers living in that physical world. It is not a priori clear that if a mathematical theory models space(time) as a continuum, this is also what observers in that world will perceive.

Take the example of phase space. Common lore has it that after quantization it turns from a continuous smooth manifold into a non-commutative space. Much of the effort towards noncommutative models of spacetime takes its motivation from this assertion. But it is not unambiguously true!

Namely, the folklore about phase space becoming a noncommutative space draws on one of the two available formalizations of quantization, algebraic deformation quantization (http://ncatlab.org/nlab/show/deformation+quantization). There is another formalization, which may be argued to be more fundamental, as it is less tied to the perturbative regime: this is geometric quantization (http://ncatlab.org/nlab/show/geometric+quantization). (John B. knows all this, of course; I am including references only to keep this comment somewhat self-contained for the sake of other readers.)

Now in geometric quantization there is no real sense in which phase space becomes a non-commutative space. It remains a perfectly continuous smooth manifold (infinite-dimensional, in general). What appears as a non-commutative deformation of the continuum in deformation quantization is, in geometric quantization, simply the action of a nonabelian group of Hamiltonian symmetries (conserved Noether currents, in field theory) on the continuous smooth phase space. And yet, where their realms of definition overlap, both quantization procedures describe the same reality, the same standard quantum world.

That, incidentally, is what the infinity-topos perspective, thankfully mentioned in the article above, makes use of: geometric quantization, as the name suggests, is intrinsically differential-geometric, and as such lends itself to being formulated in the higher differential geometry embodied by cohesive infinity-toposes.

Last edited: Sep 2, 2015
6. Sep 2, 2015

### eltodesukane

It is kind of strange that (probably) discrete spacetime is approximated by continuous mathematics (the real numbers), which is in turn usually computed using a discrete floating-point datatype.
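As a sketch of just how discrete the usual computational stand-in for the reals is (Python 3.9+ exposes the "unit in the last place" directly; the sample magnitudes are arbitrary):

```python
import math

# Distance from 1.0 to the next representable float64: 2^-52, about 2.2e-16.
gap_at_one = math.ulp(1.0)

# The gaps grow with magnitude: float64 is a discrete, unevenly spaced grid.
gap_at_large = math.ulp(1.0e16)

print(gap_at_one)    # about 2.22e-16
print(gap_at_large)  # 2.0
```

So near $10^{16}$, consecutive "real numbers" in float64 are a full 2.0 apart: the continuum is being modeled by a rather coarse lattice.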

7. Sep 2, 2015

### Staff: Mentor

That is indeed an amusing irony... But also a fairly new one. Before the digital computer, before William Kahan and the IEEE floating point standard (which none of us appreciate as much as we should), there was the slide rule.

8. Sep 3, 2015

### Garrulo

But can't all the dilemmas you propose be made to disappear with quantum mechanics?

9. Sep 3, 2015

### Stephen Tashi

Yes, if we leave out the question of "actuality" then what's left is handled by measure theory - in physical terms, it's the same as dealing with the fact that the mass of an object "at a point" is zero and yet the whole object has nonzero mass. We can define "mass density" and say the "mass density" at a point in the object is nonzero.

If we want to experimentally determine (i.e. estimate) the mass density of an object at a point, there is no theoretical problem in imagining that we take a "small" chunk of the object around the point, measure its mass, and divide the mass by the chunk's volume.

However, suppose we try to apply a similar technique to estimating the value of a probability density function at a point in a continuum. The natural theoretical approach is that we take zillions of samples from the distribution, we see what fraction of the samples lie within a small interval that contains the point, and we divide the fraction by the length of the interval. But this theoretical approach assumes we can "actually" take samples with exact values in the first place. By contrast, the estimation of mass density didn't require assuming we could directly measure values associated with single points.
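That estimation recipe can be sketched directly (a minimal Monte Carlo illustration; the sample size, interval width, and random seed are arbitrary choices, and of course the computer's "samples" are exactly the finite-precision stand-ins being discussed):

```python
import random
import math

random.seed(1)

n = 1_000_000
eps = 0.05    # half-width of the interval around the point
point = 0.0

# Draw "zillions" of samples from a standard normal distribution...
samples = (random.gauss(0.0, 1.0) for _ in range(n))

# ...count the fraction landing within the interval, then divide by its length.
hits = sum(1 for x in samples if abs(x - point) < eps)
density_estimate = (hits / n) / (2 * eps)

true_density = 1.0 / math.sqrt(2 * math.pi)  # pdf of N(0,1) at 0, ~0.3989
print(density_estimate, true_density)
```

The estimate lands close to the true density, but only because each "sample" was already a finite-precision number we could sort into bins, which is exactly the assumption at issue.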

10. Sep 4, 2015

### john baez

Thanks! I've posted before in the "Beyond the Standard Model" section, but Physics Forums came along after the days when I enjoyed discussing physics endlessly on online forums - if you look at old sci.physics.research posts, you'll see me in my heyday.

11. Sep 4, 2015

### john baez

The last sentence of my post was:

Last edited: Sep 4, 2015
12. Sep 4, 2015

### Xsnac

13. Sep 9, 2015

### Lord Crc

The lack of solutions for 5 or more particles isn't related to the Abel–Ruffini theorem? Or is it just a (temporary) coincidence?

14. Sep 9, 2015

### john baez

Probably nothing to do with the unsolvability of the quintic. Right now the situation is this:
• With 5 particles we know it's possible for Newtonian point particles interacting by gravity to shoot off to infinity in finite time without colliding.
• With 4 particles nobody knows.
• With 3 or fewer particles it's impossible.
and:
• With 3 or fewer particles, we know the motion of the particles will be well-defined for all times "with probability 1" (i.e., except on a set of measure zero).
• For 4 or more particles nobody knows.

Last edited: Sep 9, 2015
15. Sep 9, 2015

### Anama Skout

Great article again! I have a question: in Newtonian physics, if we want to model the collision of two particles, why don't we also assume that as the particles approach, another force comes into play (namely the EM force)? As the particles approach they will never touch each other; instead they will repel each other due to EM. That would - as far as I know - mean that the force won't be infinite.

(Loose analogy: imagine two rocks being attracted to each other; as they get closer and closer, the electrons on the surface of each will repel each other...)

Last edited: Sep 9, 2015
16. Sep 9, 2015

### john baez

If the particles have the same charge, they will repel. And for electrons, this repulsive force will always exceed the force of gravity by a factor of roughly $10^{42}$. You see, since both the electrostatic force and gravity obey an inverse square law, it doesn't matter how far apart the electrons are: the repulsive force always wins, in this case.

But if the particles have opposite charges, the electrostatic force will make them attract even more!
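The size of the electron-electron force ratio can be checked in a couple of lines (a sketch with hard-coded CODATA constants, added here for illustration; it is not from the original post). Since both forces scale as $1/r^2$, the separation cancels out of the ratio:

```python
# CODATA values (approximate)
k_e = 8.9875517923e9    # Coulomb constant, N m^2 / C^2
G = 6.67430e-11         # gravitational constant, N m^2 / kg^2
e = 1.602176634e-19     # elementary charge, C
m_e = 9.1093837015e-31  # electron mass, kg

# Both forces fall off as 1/r^2, so the distance cancels in the ratio.
ratio = (k_e * e**2) / (G * m_e**2)
print(f"{ratio:.3e}")  # roughly 4.2e42
```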

17. Sep 9, 2015

### Anama Skout

But even if they have a neutral charge, since there will always be a cloud of electrons around each, they will eventually repel.

Ah yes! Didn't think about that case.

18. Sep 9, 2015

### john baez

It sounds like you're talking about quantum mechanics now, maybe: a neutral atom with a bunch of electrons? We were talking about Newtonian mechanics... but the quantum situation is discussed in my next post, and quantum mechanics changes everything.

19. Sep 9, 2015

### Stephen Tashi

When we have a function of several variables and take a limit as these variables approach values, there are situations in mathematics where the order of taking the limit matters. It's also possible to define limits where some relation is enforced between two variables (e.g. let x and y approach zero but insist that y = 2x).

In the case of results for point particles, I'm curious whether any conclusions change if we first solve a dynamics problem for objects of finite size and then take the limit of the solution as the particle size approaches zero (instead of beginning with point particles at the outset). Of course, "limit of the solution" is a more complicated concept than "limit of a real-valued function". It would involve the concept of a sequence of trajectories approaching a limiting trajectory.

One possibility is that ambiguity might increase. For example, if we have two spherical particles X and Y and take the limit of the solution under the condition that the diameter of Y is always twice the diameter of X as both diameters approach zero, is it possible we could get a different solution than if we take the limit with the condition that both have the same diameter? Perhaps we would get different results using oblong objects than spherical ones. If ambiguity increases, this suggests that the concept of a point particle as a limit of a sequence of finite particles is ambiguous.
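A textbook instance of the path dependence described above (a numerical sketch with the standard example $f(x,y) = xy/(x^2+y^2)$, whose value at the origin depends on how you approach it):

```python
def f(x, y):
    """A standard example whose limit at the origin depends on the path."""
    return x * y / (x**2 + y**2)

# Approach the origin along y = x: f(t, t) = t^2 / (2 t^2) = 1/2 for all t.
along_y_eq_x = [f(t, t) for t in (0.1, 0.01, 0.001)]

# Approach along y = 2x: f(t, 2t) = 2 t^2 / (5 t^2) = 2/5 for all t.
along_y_eq_2x = [f(t, 2 * t) for t in (0.1, 0.01, 0.001)]

print(along_y_eq_x)   # each value is ~0.5
print(along_y_eq_2x)  # each value is ~0.4
```

Enforcing a relation between the variables (here, between the two approach paths; in Stephen's setup, between the two shrinking diameters) genuinely changes the limit.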

20. Sep 9, 2015

### Lord Crc

Thanks. I figured it was a long shot, but the numbers stood out, so it was worth asking. Every now and then some deep and surprising connection pops up.

21. Sep 9, 2015

### john baez

For point particles interacting by Newtonian gravity, I'm not aware of any calculations like this. For both collisions and noncollision singularities, the distance between certain pairs of particles becomes arbitrarily close to zero as $t \to t_0$, where $t_0$ is the moment of disaster. So, if you made them finite-sized, they'd bump into each other and you'd have to decide what they did then. This seems more difficult than helpful.

On the other hand, for understanding a single relativistic charged particle interacting with the electromagnetic field, a lot of famous physicists have done a lot of calculations where they assume it's a small sphere and then (sometimes) take the limit where the radius goes to zero. I'll talk about that in Part 3.

22. Sep 9, 2015

### atyy

If one describes the Gödel incompleteness theorems as dark clouds, doesn't that in some sense favour the continuum? Although the first-order theory of the natural numbers with arithmetic isn't decidable, the theory of real closed fields is decidable (according to Wikipedia): https://en.wikipedia.org/wiki/Decidability_of_first-order_theories_of_the_real_numbers

Edit: Is finite-dimensional quantum mechanics decidable? QM has a "continuum" part, which is the complex numbers, and a discrete part, which is the Hilbert space dimension.

Last edited: Sep 10, 2015
23. Sep 10, 2015

### Stephen Tashi

With the right mathematical definition of convergence, we wouldn't have to worry about the result of a collision of finite objects. Thinking of a sequence of trajectories as a sequence of functions, we don't have to insist on "uniform convergence" to a limit; pointwise convergence would do. In other words, suppose trajectory f[n] is close to the limiting trajectory for times less than a collision time tzero[n], and f[n] is arbitrarily defined for times greater than tzero[n] (and perhaps not close to the limiting trajectory at those times). As n approaches infinity we would expect tzero[n] to approach infinity, or to approach whatever time there is a collision in the point-particle model of the system.

24. Sep 10, 2015

### atyy

A different type of question, but one related to probability and the continuum: is entropy fundamental?

If entropy is fundamental, then it seems that the continuum cannot be fundamental, since the entropy is difficult to define for continuous quantities. For example, one may need "special coordinates", such as canonical coordinates in Hamiltonian mechanics where the Jacobian determinant is 1.

If things are continuous, could it be that the mutual information or relative entropy is more fundamental than the entropy?
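The contrast in this comment can be illustrated with closed-form Gaussian formulas (a sketch; the factor-of-two rescaling and the particular means are arbitrary choices): differential entropy shifts under a change of coordinates, while relative entropy does not.

```python
import math

def normal_entropy(sigma):
    """Differential entropy of N(mu, sigma^2) in nats (mean-independent)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def normal_kl(mu0, mu1, sigma):
    """KL divergence between two normals sharing the variance sigma^2."""
    return (mu0 - mu1) ** 2 / (2 * sigma**2)

a = 2.0  # rescale the coordinate: x -> a*x

# Differential entropy is NOT coordinate-invariant: it shifts by log(a).
h_before = normal_entropy(1.0)
h_after = normal_entropy(a * 1.0)
print(h_after - h_before)  # log(2), about 0.693

# Relative entropy IS invariant: rescaling both distributions changes nothing.
kl_before = normal_kl(0.0, 1.0, 1.0)
kl_after = normal_kl(a * 0.0, a * 1.0, a * 1.0)
print(kl_before, kl_after)  # both 0.5
```

This is why relative entropy (or mutual information) is often proposed as the coordinate-free notion on a continuum: it compares two densities, so the Jacobian of the coordinate change cancels.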

25. Sep 15, 2015

### bcrowell

Staff Emeritus
You might be interested in this historical paper:

Blaszczyk, Katz, and Sherry, Ten Misconceptions from the History of Analysis and Their Debunking, http://arxiv.org/abs/1202.4153

They claim that the conventional wisdom about a lot of the history was wrong. In particular, they claim that there was a rigorous construction of the reals ca. 1600.