Are we digital or analog?

  • Thread starter Naty1
  • #26
The entropy of a system of finite volume is a finite number if the energy is finite. Even if you allow for the energy to be arbitrarily high, the number of possible states in the volume is bounded via the holographic principle.
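In the usual Bekenstein-Hawking conventions (a rough statement; numerical factors vary by author), the bound is

S <= k_B A / (4 l_P^2), hence N_states <= exp(A / (4 l_P^2)),

where A is the area of a surface enclosing the region and l_P ~ 1.6 x 10^-35 m is the Planck length. For any finite region both are finite, so the number of distinguishable states is finite too.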
 
  • #27
The entropy of a system of finite volume is a finite number if the energy is finite.
This is true if and only if the phase space is finite-dimensional, and so it does not include systems with an infinite number of degrees of freedom, such as (dynamic) continuous spacetime. Therefore the argument is circular.
 
  • #28
This is true if and only if the phase space is finite-dimensional, and so it does not include systems with an infinite number of degrees of freedom, such as (dynamic) continuous spacetime. Therefore the argument is circular.

It may be that the gravitational effects that cause black holes to appear if you go to high enough energies (i.e. to short enough length scales) are a manifestation of Nature being formally describable, and hence not a real continuum.
 
  • #29
formally describable and hence not a real continuum.
Why can't nature be both formally describable and be a continuum?

After all, the equation 5x - 2x = 3x is a formal description of an infinite amount of information, and in fact most of physics consists of formal descriptions of infinite amounts of information, such as a trajectory x(t) = t^2. The point is that using mathematics we can and do give finite formal descriptions that contain an infinite amount of information, and we are especially good at doing this when the infinite amount of information is "continuously" connected because its domain is continuous. Differential equations, for example, which are certainly the heart of classical physics and continue to play a central role in quantum physics, allow compact expressions to capture a staggeringly infinite amount of information.
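To see how little ink the infinite information costs, here is a minimal sketch (Python; names are illustrative only): a finite description of a few symbols that answers infinitely many distinct questions about the trajectory exactly.

Code:
from fractions import Fraction

def x(t):
    """A finite description of the trajectory x(t) = t^2."""
    return t * t

# The same few symbols fix the value at every one of the
# infinitely many rational (or real) times you might query.
print(x(Fraction(1, 3)))   # 1/9
print(x(Fraction(22, 7)))  # 484/49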
 
  • #30
The alleged "infinite amount of information" isn't really there. With a finite number of bits you cannot possibly define a structure that needs an infinite amount of information to be fixed. I'll defer to logicians to make rigorous statements on this topic, but my understanding is that the reason why things like the continuum hypothesis are undecidable is precisely that with only a finite number of axioms there is no way you could precisely define the uncountable continuum.

This is also mentioned here:

http://www.earlham.edu/~peters/courses/logsys/low-skol.htm

The proof for LST relies on the fact that there are at most countably many descriptions of anything, viz. names, sentences, paragraphs, books... There are at most countably many strings of symbols (when the strings are finite in length). This fact is easily proved by arithmetizing the alphabet of our language of description. Each wff then becomes a natural number. Since there are only countably many natural numbers, there are at most countably many wffs to do the describing.

One countable model that is always available for inspection, if only to demystify LST a bit, is the interpretation in which the terms of the language are assigned to their own tokens, or to the typographic strings which express them. We've seen that by arithmetization there are at most countably many such strings. Hence, even if the intended interpretation of the marks on paper refers to the uncountably many real numbers, one obvious alternate interpretation refers only to the countably many marks on paper that comprise the system.

As the typographic interpretation of S shows, the interpretations with merely countable models will be (or may as well be) non-standard. The "meaning" of the marks changes at the same time as the domain. The theorem which in the intended interpretation made some assertion about uncountable reals speaks of something entirely different in the countable LST models.

Remember, a formal system "about the reals" is really a system of wffs of some formal language. The language is inherently uninterpreted. We might give its symbols some interpretation, and on that intended interpretation we may say that the system is "about" the reals. LST asserts that every consistent first-order theory can be interpreted as being "about" some set of things no more numerous than the natural numbers, even if we thought it was --indeed, even if under another interpretation it is-- "about" the uncountable reals.

LST does not deny the possibility that some system S might have uncountable models alongside the critical countable model. In that sense, S might "succeed". LST qualifies this success by giving S other models whether it wanted them or not. How does this detract from S's success? In only one way: S is thwarted if it aspired to capture its intended domain unambiguously or uniquely.
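The arithmetization step is easy to make concrete. A minimal sketch (Python, illustrative only) that enumerates every finite string over a finite alphabet, so each possible description receives its own natural number:

Code:
import itertools
import string

ALPHABET = string.ascii_lowercase  # any finite alphabet works

def all_finite_strings():
    """Yield every finite string: all of length 1, then length 2, ..."""
    for length in itertools.count(1):
        for chars in itertools.product(ALPHABET, repeat=length):
            yield ''.join(chars)

# Each description is paired with a unique natural number:
# its position in the enumeration. Hence: countably many wffs.
for n, s in enumerate(itertools.islice(all_finite_strings(), 5)):
    print(n, s)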
 
  • #31
It is actually easy to construct a counterexample.

Suppose that using a huge computer you simulate a planet with mathematicians and physicists living on it. The computer computes everything down to the atomic level, so it would be able to reproduce most of the mathematical and physical results right up until the 19th century. You would thus expect to see calculus based on the uncountable reals being developed by your digital mathematicians.

But the uncountable reals do not exist at all in their digital world! Their world is not even countable; it is finite. All possible states the computer can be in can be specified using a finite number of bits.
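To put a number on it: a simulator with N bits of state has at most 2^N distinct configurations, so even a machine with 10^15 bits has at most 2^(10^15) possible states. Unimaginably many, but finite.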

So, what is really going on is that people can represent phenomena in their world using some abstract rules that involve manipulating finite bitstrings. But an interpretation about "uncountable reals existing" does not have to be correct.

The only way you could prove that the continuum really exists is by constructing a machine that produces results that are not formally describable, e.g. the so-called "rapidly accelerating computer" I wrote about earlier in this thread.
 
  • #32
Roger Penrose has some interesting insights in THE ROAD TO REALITY (my edition is from 2004).
He does not resolve the issues here, but perhaps his insights offer some perspective:

Chapter 16:
There are some...who would prefer a universe..that is finite in extent..only finitely divisible..so that a fundamental discreteness might begin to emerge at the tiniest levels...(it's distinctly unconventional..but not inherently inconsistent). In the early days of quantum mechanics...there was a hope, not realized by future developments, that the theory was leading to a picture of discreteness at the tiniest levels. In the successful theories of our present day we take spacetime as a continuum even when quantum concepts are involved, and ideas that involve small-scale discreteness must be regarded as 'unconventional'....It appears, for the time being at least, we have to take the use of the infinite seriously.
Then follow several sections on "Puzzles in the foundations of mathematics" (sets, classes, Gödel's theorem), which seem inconclusive and lead into further analyses in later chapters.

In 31.1, regarding the Holographic conjecture/Principle, Penrose says:
A reason for hoping (Maldacena's) AdS/CFT is true appears to be that it might provide a handle on what a string theory could be like, without resorting to the usual perturbative methods, with all the severe limitations such methods have.

Section 33.1 is also interesting: "Theories where geometries have discrete elements", followed by his own quite different "twistor theory".
 
  • #33
I'll defer to logicians to make rigorous statements on this topic, but my understanding is that the reason why things like the continuum hypothesis are undecidable is precisely that with only a finite number of axioms there is no way you could precisely define the uncountable continuum.
No, there is no problem with defining the continuum with a finite number of axioms. The continuum hypothesis does not pertain to this at all.

The real numbers are the unique complete totally ordered field; students are exposed to the rigorous construction as senior undergraduates or beginning graduates.

The continuum hypothesis is the claim that the cardinality of the real numbers is equal to the cardinality of the power set (set of all subsets) of the natural numbers. The continuum hypothesis is undecidable in ZFC. There is no problem with any of these things being rigorously defined in a finite axiomatic setting.

The Löwenheim-Skolem Theorem
First of all, countable != discrete. The rational numbers Q are countable, but in fact every point has infinitely many arbitrarily close neighbors!

Furthermore, the set of analytic functions from Q to Q is countable, and so smooth physics could be done entirely over the countable rationals if desired.
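For the countability of Q there is even an explicit enumeration. A minimal sketch (Python) using the Calkin-Wilf sequence, which visits every positive rational exactly once:

Code:
import itertools
from fractions import Fraction

def rationals():
    """Calkin-Wilf walk: hits every positive rational exactly once."""
    q = Fraction(1, 1)
    while True:
        yield q
        # Successor rule: q -> 1 / (2*floor(q) - q + 1)
        q = 1 / (2 * (q.numerator // q.denominator) - q + 1)

print(list(itertools.islice(rationals(), 8)))
# 1, 1/2, 2, 1/3, 3/2, 2/3, 3, 1/4 (as Fractions)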

Suppose that using a huge computer you simulate a planet with mathematicians and physicists living on it. The computer computes everything down to the atomic level, so it would be able to reproduce most of the mathematical and physical results right up until the 19th century.
Here you go again assuming the world can be simulated by a finite digital computer. What is the basis for this assumption? If the world were a continuum, then in principle I could encode all the information on the internet into a scratch on a rod.
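The rod trick is just a binary expansion. A toy sketch (Python; exact rationals stand in for the idealized continuum measurement, which is of course the very assumption in dispute):

Code:
from fractions import Fraction

def scratch_position(bits):
    """Place the scratch at 0.b1b2b3... (binary) along a unit rod."""
    return sum(Fraction(b, 2 ** (i + 1)) for i, b in enumerate(bits))

def read_scratch(x, n):
    """Recover the first n bits from the scratch position."""
    out = []
    for _ in range(n):
        x *= 2
        bit = int(x)   # the integer part is the next bit
        out.append(bit)
        x -= bit
    return out

message = [1, 0, 1, 1, 0, 0, 1]
pos = scratch_position(message)
print(pos, read_scratch(pos, len(message)) == message)  # 89/128 True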

So, what is really going on is that people can represent phenomena in their world using some abstract rules that involve manipulating finite bitstrings.
Mathematics is much more than symbolic manipulation, and a computer which contains only the finite-bit strings of various mathematical papers, without containing their semantic meaning, has not captured all of the information.

The point is that these finite-bit strings refer to an infinite number of possibilities.

The only way you could prove that the continuum really exists is by constructing a machine that produces results that are not formally describable, e.g. the so-called "rapidly accelerating computer" I wrote about earlier in this thread.
You can't prove anything about the physical world, ever. But saying we can't prove continuum physics is worlds apart from saying that continuum physics cannot possibly be the case because of some information-theoretic perspective.
 
  • #34
No, there is no problem with defining the continuum with a finite number of axioms. The continuum hypothesis does not pertain to this at all.
According to Chaitin it does.
 
  • #35
The real numbers are the unique complete totally ordered field; students are exposed to the rigorous construction as senior undergraduates or beginning graduates.

Yes, but that doesn't mean that real numbers defined in this way make any physical sense. Almost all real numbers are uncomputable and as far as we know the physical world is computable.
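The counting argument is the same one as in the Löwenheim-Skolem quote above: a computable real requires a program, programs are finite strings, and there are only countably many finite strings. The computable reals are therefore countable, hence a measure-zero subset of R; "almost all" reals escape every possible program.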
 
  • #36
According to Chaitin it does.
His work is considered controversial, it is of a philosophical nature, and it is not widely accepted by mathematicians. Fortunately we do not need to argue about the existence of uncountable sets: the rationals are countable, the set of analytic functions from Q to Q is countable, and everything we do with the continuum in physics could be translated to smooth rational functions. "Discrete or uncountable" is a false dichotomy; the rationals are a counterexample.

and as far as we know the physical world is computable.
I disagree; you keep coming back to the same circular assumption. To the contrary, as far as we know the universe is best described by quantum field theory. Please tell me what theory captures the world more accurately than QFT and suggests that the "physical world is computable."
 
  • #37
turbo
We may have gotten to the point at which the question of whether space is discontinuous on small scales is testable. A prediction of LQG (or a variant of LQG) is that very high-energy gamma rays will interact with the fine structure of the space through which they propagate, and will be slowed more than lower-energy gamma rays. It is entirely possible that such an effect (hinted at by a few observations so far) might be produced by variables at the source of the GRB, but if it can be shown that such delays are proportional to the redshift of the source, LQG and the discretization of space will have gained a lot of traction. If delays are proportional to redshift, then the idea that all the gamma rays are emitted at the same time, and that dispersion effects cause the delays, gains credence. Fermi hasn't been observing all that long; lots more operational time and observations should shed some light on this question.
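For reference, the usual leading-order parametrization of the effect (conventions differ between papers) is an energy-dependent delay of roughly Δt ≈ ξ (E / E_Planck)(D / c), with E_Planck ≈ 1.22 x 10^19 GeV, D the distance to the burst, and ξ a model-dependent factor of order one. For a 10 GeV photon from a burst billions of light-years away that works out to delays of order a second: tiny, but within Fermi's timing reach.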
 
  • #38
apeiron
Again, either/or is the conventional bind people get themselves into. That would be the false dichotomy. If we listen to what dichotomies really tell us, we would instead be interested in understanding how opposing extremes can both be true, both be fundamental, both offer a vantage point on the production of realities.

Take for example category theory which breaks the mathematical world into the dichotomy of objects and morphisms (the discrete object/the continuous morphism). After many years floundering around with set theory and null sets, maths gave up either/or to embrace a fundamental duality (in interaction).

I could also mention the dualities that have emerged as central to string theory - Civilised may appreciate the dichotomy inherent there. And the switch in view from the local/discrete to the global/continuous.

You cannot turn in any direction in maths, science or philosophy without bumping into lurking dualism or dichotomies. Which is why it is really important to understand there are other choices of reaction apart from getting stuck in the eternal rut of either/or.

And I may as well say, from the view of modern epistemology, another way this discussion is getting bogged down is the confusion between models and simulations. Count Ibis is mostly thinking about simulations.

Simulation is about recreating what is "out there". So you have to represent both the general AND the particular. All the information contained in a system must be recreated.

But models are instead about the extraction of generals. Particulars are discarded in the creation of the models.

Take for example any physical equation standing for a natural law. The general relationship is represented by something like E=mc^2. Informationally this is so compact it can go on a t-shirt. Then to use the model, you plug in measurements: local particulars like the energy or mass relevant to the prediction at hand.
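The "plugging in" is the whole use step: for the local particular m = 1 kg, the general law gives E = 1 x (3 x 10^8)^2 = 9 x 10^16 joules. One compact general relationship plus one measured particular yields one prediction.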

Models involve a reduction of represented information. And the more that can be discarded, the better the model.

Simulation goes the other way. In principle, you would want to represent every last bit of information and so fully recreate some actual (particular) system. There then arises a practical cut-off question, because our information-representing resources are usually discrete (digital), and so we face an infinite trajectory to arrive at a representation of something even infinitesimally close to the continuous (or analog).

Following this, perhaps we ought to ask whether the number line is a model or a simulation?
 
  • #39
If you impose a cut-off at some momentum, the number of physical states of a system contained in a finite volume is finite; you go from finite to merely countable only in the infinite-volume limit.

QFT is computable, in the sense that you can simulate it on a computer to any desired degree of accuracy.
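The standard route is lattice regularization. A minimal sketch (Python) of the 0+1-dimensional toy case, i.e. a single quantum anharmonic oscillator treated by the same Euclidean Monte Carlo method that lattice field theory scales up; parameters and names are illustrative:

Code:
import math
import random

N, a = 64, 0.5        # lattice sites and spacing (spacing = UV cutoff)
m, lam = 1.0, 1.0     # mass and quartic coupling

def local_action(x, i, xi):
    """Terms of the Euclidean action that involve site i (periodic b.c.)."""
    left, right = x[(i - 1) % N], x[(i + 1) % N]
    kinetic = 0.5 * m * ((xi - left) ** 2 + (right - xi) ** 2) / a
    potential = a * (0.5 * m * xi ** 2 + lam * xi ** 4)
    return kinetic + potential

x = [0.0] * N
for sweep in range(5000):                       # Metropolis updates
    for i in range(N):
        trial = x[i] + random.uniform(-1.0, 1.0)
        dS = local_action(x, i, trial) - local_action(x, i, x[i])
        if dS < 0 or random.random() < math.exp(-dS):
            x[i] = trial                        # accept the move

print(sum(v * v for v in x) / N)                # crude <x^2> estimate

The errors are systematic and shrink as the spacing a goes to zero and the volume N*a grows, which is the sense in which "to any desired degree of accuracy" is meant.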
 
  • #40
I was checking on another subject and came across the following. Light is quantized: agree or disagree? Anyone care to suggest what it means?
Wikipedia, http://en.wikipedia.org/wiki/Photons#Early_objections, last paragraph

A few physicists persisted[39] in developing semiclassical models in which electromagnetic radiation is not quantized, but matter appears to obey the laws of quantum mechanics. Although the evidence for photons from chemical and physical experiments was overwhelming by the 1970s, this evidence could not be considered as absolutely definitive; since it relied on the interaction of light with matter, a sufficiently complicated theory of matter could in principle account for the evidence. Nevertheless, all semiclassical theories were refuted definitively in the 1970s and 1980s by photon-correlation experiments.[Notes 2] Hence, Einstein's hypothesis that quantization is a property of light itself is considered to be proven.
 
  • #41
Hurkyl
The continuum hypothesis is the claim that the cardinality of the real numbers is equal to the cardinality of the power set (set of all subsets) of the natural numbers.
No, that is known for sure. The continuum hypothesis is that there are no cardinalities between that of the naturals and that of the reals.
 
  • #42
Hurkyl
Take for example category theory which breaks the mathematical world into the dichotomy of objects and morphisms (the discrete object/the continuous morphism). After many years floundering around with set theory and null sets, maths gave up either/or to embrace a fundamental duality (in interaction).
I can't make heads nor tails of what you're trying to say here.
 
  • #43
apeiron
I can't make heads nor tails of what you're trying to say here.
Just making the point that everywhere you turn, when people are trying to make deep distinctions, you find dichotomies emerging. As in category theory. Definitions by mutual exclusion, followed by the interaction of what has been created.

You could argue that object/morphism is not exactly a discrete/continuous distinction. But it is close in spirit. And then discrete/continuous is not itself the most fundamental dichotomy. Local/global would be a "deeper" level of generalisation.

If every field depends on dichotomies - science, metaphysics, maths - why are people not more familiar with the logical principles involved here? Why this mania for either/or when the disciplines themselves rely on "both"?
 
  • #44
What happened to cellular automata models of physics?
 
  • #45
I think these questions are not necessarily opposed. We could for instance live in a "pythagorean universe", a universe in which the simplest objects are neither discrete nor continuous, but "pythagorean".

I wrote a paper in philosophy that addressed this question recently.
 
  • #46
I think that David Deutsch generally has the right approach to this question. In short, he thinks that “within each universe all observable quantities are discrete, but the multiverse as a whole is a continuum. When the equations of quantum theory describe a continuous but not-directly-observable transition between two values of a discrete quantity, what they are telling us is that the transition does not take place entirely within one universe. So perhaps the price of continuous motion is not an infinity of consecutive actions, but an infinity of concurrent actions taking place across the multiverse.”
 
