Naty1 said:
jambaugh:
Be cautious! Such firm "beliefs" may blind you! Such "beliefs" have tripped up physicists throughout history and prevented them from understanding new theories and experimental findings.
This caution runs both ways... but if you look at history, the mistakes of "too dogmatic beliefs" have been in reification of models and failure to pay attention to operational meaning. Einstein was able to revise his view of time, and thence unify space-time, by acknowledging that "time is what a clock indicates" and "distance is what a measuring rod measures". Hence the reality is in the dynamics of the clock and the measuring rod. His open-mindedness on this point allowed him to generalize the previously fixed relationships between these. However, the success of his theory led to the opposite position, with followers taking the geometric model as an ontological fact. This tends to be the nature of scientific progress.
Also recall that light was once viewed as traveling in aether, mass was "solid", space and time were "fixed and immutable", a proton was a fundamental particle, dark matter and dark energy were "impossible" (just mathematical constructs), etc., etc.
Again, the aetheric interpretation of light was a mistake of reifying a model. Acknowledging that the reality of the aether was not necessary to describe the dynamics of light is what led to relativity. Relativity doesn't assert the aether is real, and it doesn't assert the aether is unreal. It shows that the question is irrelevant, because the physics is in the empirical observations of how light behaves.
Dark matter was, and still is, simply matter which is not visible because it is not made of radiant stars. Speculations about exotic dark matter grab headlines but are still fringe speculation. Personally, I suspect the majority of it will end up being stellar-sized black holes.
Let me add that the dark matter requirement is extrapolated from weak approximations to the full GR description of the dynamics of galaxies. I've seen at least one paper suggesting that a fully general-relativistic treatment may greatly reduce or eliminate the amount of dark matter needed to bring predictions into agreement with observations. Again, the "open-mindedness" goes both ways: we shouldn't take dark matter as dogma.
Dark energy is nothing more than Einstein's cosmological constant relabeled. I believe it is just the (non-flat) boundary conditions in cosmological applications of Einstein's equations. There is also the possibility of a systematic misinterpretation of the Doppler shift of distant objects as purely due to recessional velocity. In the curved space-time cosmologies there is also an effect of time dilation depending non-linearly on distance. I'm not sure the computer models used by cosmologists take this into account. I know some papers I've read in the past make the mistake of taking the hyperbolic shape of the embedding of a de Sitter space-time within a Euclidean coordinate system as a literal Big Bang-esque expansion of space over time, when the proper spatial cross-sections of the de Sitter manifold for a given observer do not change size over time. This mistake again comes from viewing the manifold as a physical object rather than a geometric realization of the relationships between physical objects (and also from not correctly visualizing the proper embedding within a 4-space + 1-time Minkowski space).
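As a quick sketch, the embedding I have in mind is the standard textbook construction (not something specific to the papers mentioned above): de Sitter space-time realized as a hyperboloid in a 4+1-dimensional Minkowski space,

```latex
% de Sitter space-time as a hyperboloid embedded in 4+1 Minkowski
% space with signature (-,+,+,+,+) and de Sitter radius \alpha:
-X_0^2 + X_1^2 + X_2^2 + X_3^2 + X_4^2 = \alpha^2
% Plotting this surface with Euclidean intuition shows a "flaring"
% hyperboloid, but that flaring is an artifact of the embedding
% picture, not a statement about the proper size of the spatial
% cross-sections an observer actually measures.
```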
Note that "fundamental" has a contextual meaning. Protons are still "fundamental" in nuclear chemistry, just as electrons plus elemental nuclei are "fundamental" in chemistry. Note also that it was not dogma about the fundamentalness of the proton which inhibited the development of the standard model; rather, it was the absence of data. As soon as accelerators reached higher energies and began giving us data, we began seeing the huge particle spectrum, and theorists immediately began speculating about parton models of the nucleons.
Just keep an open mind. For example, if matter is both a particle and a wave, and equivalent to energy, why can't space and time be as well? (After all, they all came from the same place: nowhere ("empty" space).) Nobody knows.
My mind is quite open, but I am skeptical of many of the "kludges" (such as dark energy/matter and inflation) to get them to fit empirical data.
BTW, matter is not "both particle and wave"; it is neither. It is rather quanta, whose behavior we translate into the old classical "wave" or old classical "particle" paradigms when we wish to describe specific aspects of it in classical terms.
I am open-minded, but I don't buy every new speculation just because it generates juicy headlines in the popular media (e.g. FTL tunnelling). Neither do I take orthodox views (such as the Big Bang theory and renormalized field theory) as "T"ruth. I rather take these as tentative theories with a large body of empirical confirmation which future alternatives must also account for.
Now my position (space-time is parametric rather than physical) is not an ontological dogma; it is an acknowledgment of space-time's operational meaning. Recall Einstein's advice to look at what physicists do in the lab and ignore what they say.
We use coordinates as parameters, specifically as parameters of transformation between classes of similar modes of measurement. E.g., when comparing two spin measurements of a given quantum system, we express the relationship between these measurements in terms of space-time translations, rotations, and velocity frame boosts (of the measuring apparatus). In short, we select an element of the Poincaré group parameterized by a duration, a spatial displacement, a rotation angle, and a boost pseudo-angle (rapidity).
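To make that parameterization concrete (standard special-relativity notation, nothing beyond the textbook form):

```latex
% A Poincaré group element acts on space-time coordinates as
x'^\mu = \Lambda^\mu{}_\nu\, x^\nu + a^\mu
% where a^\mu = (duration, spatial displacement) supplies the
% translations, and \Lambda combines a rotation by angle \theta
% with a boost by rapidity (pseudo-angle) \eta; e.g. for a pure
% boost along the x-axis:
\Lambda = \begin{pmatrix}
  \cosh\eta & \sinh\eta & 0 & 0 \\
  \sinh\eta & \cosh\eta & 0 & 0 \\
  0 & 0 & 1 & 0 \\
  0 & 0 & 0 & 1
\end{pmatrix}
```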
Quantum theory then tells us how to represent this group element in the dynamics of the quantum system so that we can identify equivalent measurements, in the sense of being exactly correlated. At the same time, it gives us the transition probabilities between not-quite-correlated measurements.
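As a toy illustration of those last two sentences (my own minimal sketch, not code from any referenced work; the function names are hypothetical): for a spin-1/2 system, a rotation of the measuring apparatus by angle theta is represented by an SU(2) matrix, and the Born rule gives the probability that the rotated measurement agrees with the original one.

```python
import numpy as np

def rotation_y(theta):
    """SU(2) representation of a rotation by theta about the y-axis:
    U = exp(-i * theta * sigma_y / 2)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s],
                     [s,  c]], dtype=complex)

# "Spin up" along z: the outcome of the reference measurement.
up = np.array([1.0, 0.0], dtype=complex)

def transition_probability(theta):
    """Born-rule probability |<up| U(theta) |up>|^2 that a measurement
    rotated by theta agrees with the original measurement."""
    return abs(up.conj() @ rotation_y(theta) @ up) ** 2

# theta = 0:  identical measurements, exactly correlated (P = 1)
# theta = pi: opposite orientations, never correlated      (P = 0)
# in between: P = cos^2(theta / 2), the "not-quite correlated" case
```

At theta = 0 the two measurements are equivalent in the exactly-correlated sense described above; for intermediate angles the same group element yields the transition probability directly.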
This is all we need to describe the outcomes of quantum experiments, and thus this is all the meaning we need for space-time position, orientation, and velocity. Adding additional meaning, i.e. overlaying an ontological interpretation, imposes additional assumptions and thus is, as you phrase it, being less than "open-minded".
I assert that it is exactly this over-reification of parametric quantities like space-time which leads to the over-counting of physical degrees of freedom, resulting in the divergences we find in QFT. Renormalization is a "quick fix" which gives good answers but doesn't address the fundamental problem... and, as we have learned with canonical quantum gravity, it cannot always be applied. I think string/brane models perpetuate this problem by adding additional non-physical structure, so much so that the researchers are lost in the beauty of the mathematics and have little to say about physical nature.
Maybe I'm wrong. The proof will be in the next empirically successful class of theories. I'm working on my own pet theories based on these assertions. Give me 0.01% of the grant funding which has been poured into string theory and I might be able to make substantial progress.
P.S. Pardon the length of the reply but you struck a nerve.