String theory why?

  1. Jan 29, 2010 #1
    It seems to me that string theory is a highly complex way of dealing with the need to avoid zero distances in our mathematical description of the physical world. Wouldn't it be much easier to just say: let's use 10^-25 cm as a limit, and not allow anything to get closer than that in our mathematical theory of the physical world? The infinities go away. Yes, with 11 dimensions you have gobs of symmetry. Maybe too much?
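    To make the infinity at stake concrete (a standard textbook illustration, not anything specific to string theory): the classical self-energy of a point charge diverges as the charge's radius a goes to zero,

    U = \int_a^\infty \frac{\varepsilon_0 E^2}{2}\, 4\pi r^2 \, dr = \frac{q^2}{8\pi \varepsilon_0 a} \to \infty \quad (a \to 0),

    so declaring a minimal distance such as a = 10^{-25} cm does, by itself, leave U finite.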
     
  2. Jan 29, 2010 #2

    arivero

    Gold Member

    String theory is not motivated by the problem of infinities. From Wilson to Kreimer, a great deal of understanding of this particular problem has accumulated, starting, as you say, with the ability to do regularisations.

    The only physical motivation of string theory is that it happened to be the interpretation of a Lagrangian able to produce a certain relationship between scattering amplitudes for particles bound by the strong force. Nobody proposed string theory as a regulator, nor as a generic "let's consider more dimensions"; this latter idea (the "brane scan") came long after the initial string calculations.
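    (For reference, a sketch of that historical starting point: the Veneziano four-point amplitude, built on linear Regge trajectories \alpha(x) = \alpha(0) + \alpha' x,

    A(s,t) = \frac{\Gamma(-\alpha(s)) \, \Gamma(-\alpha(t))}{\Gamma(-\alpha(s) - \alpha(t))},

    reproduces the s-t duality observed in strongly interacting resonances; only afterwards was it recognised as the scattering amplitude of a relativistic string.)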

    Now, in my opinion, the "just let's use a limit" approach has been as unproductive as strings. It has supported the "effective field theory" approach to quantum field theory.
     
  3. Feb 1, 2010 #3

    Fra


    I think it's still possible to do this in different ways. Many approaches I've seen do, as you say, "just use a limit".

    I've always felt there is a physical significance in the limit that is too rarely acknowledged. If you take the observer perspective seriously, and acknowledge that an observer is real and not a mathematical abstraction, there should be a natural limit for each observer as to what it can resolve and encode, related to the observer's complexity. A limit of distinguishability.

    If we can establish a relation between this cutoff and the complexity of the observer (perhaps related to mass, or to communication-channel capacity in a holographic sense), then the cutoff can be given a deeper physical meaning and would be nothing like the simplistic "just let's use a limit" approach.
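    (One speculative way to quantify this, offered just as an illustration: the Bekenstein bound limits the information any observer of radius R and energy E can encode,

    S \le \frac{2 \pi k_B R E}{\hbar c},

    so a less complex, lighter observer can distinguish strictly fewer states; a cutoff tied to the observer would be a limit of this kind.)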

    Now, if anyone claims that the smallest distinguishable structures can in some sense be described as strings, then I think that should be explainable. I have already found that even if you just start with distinguishable "events" (without geometric embedding), these events can, by certain constructions, be naturally organized into ordered sequences, which in the high-complexity limit can easily look like a "string" (but without embedding), where the index along the string is the ordering index of the initial set of events.

    But there is still a huge difference between this and taking strings in a background, as continuum objects, as the starting point. I just see a string as the simplest possible distinguishable continuum structure (or high-complexity limit). But I don't see a physical reason for putting the continuum and real numbers into the starting point. I find that a big leap of reasoning with respect to my preferred starting points.

    /Fredrik
     
  4. Feb 4, 2010 #4

    vld


    There is no need to establish the "just-let's-use-a-limit" approach, because such an approach is already in use, based on the causality principle and the corresponding limit on the speed of interactions (usually called the speed of light). This limit automatically sets limits on the maximal possible energy (and hence mass, space-time curvature, distance, etc.). For example, an infinite amount of energy would result in infinite curvature, which would mean infinite acceleration and a speed exceeding the speed of light, contradicting the causality principle.
     
  5. Feb 4, 2010 #5

    Fra


    I think what I advocate is more than the usual "if we focus enough energy, all we will see when zooming in is a black hole". That is certainly one aspect. But the other aspect is that focusing energy does not only strongly deform the system under study (up to the extreme point of creating a black hole); the main point is that it also drains the observer of energy and complexity, which ultimately destroys the measurement operator itself. So in some cases the observer is destroyed, annihilated or consumed by the system under study long before the black hole is observed. This latter point is not that relevant for a macroscopic laboratory, but it can be relevant in cosmological models, and in the inside picture of a particle experiment when you consider how one particle infers information about a fellow particle of similar mass. So I think the latter point may be relevant to understanding why the microscopic action looks the way it does, i.e. why we have the weak, electromagnetic and strong forces, related as they are.
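    (For reference, the standard estimate behind the zoom-in argument: resolving a distance \Delta x costs energy E \sim \hbar c / \Delta x, and that energy forms a horizon once its Schwarzschild radius 2GE/c^4 reaches \Delta x, i.e. near the Planck length \Delta x \sim \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-35} m.)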

    So I think there is an additional point to this, beyond the standard argument that "energy ultimately creates a black hole" which you can read in almost every introduction to QG. That additional point is almost never spelled out, to my knowledge. Probably because it mixes the description of a scientific theory in general with the description of a physical theory of matter in particular. But I think they do mix. I see this as a consequence of taking the measurement ideal seriously and not only considering the communication CHANNEL (in a certain context) but also what is going on at each node. As I see it, the structure of regular QM (fixed Hilbert space etc.) fails to do this in what I think is the right way.

    /Fredrik
     
  6. Feb 4, 2010 #6
    Easier, yes; more accurate, certainly not. Too much good science has come in the past from considering such extremes... such as Einstein's "catching up to observe a light beam", which he imagined as a teenager...

    To do so would, to me, sound too much like the "Forbidden Zone" in Planet of the Apes... [which held relics that gave evidence man had once been ascendant over the apes...]

    We don't establish such arbitrary limits in general theoretical considerations because, unless there is a reason to establish a particular limit, doing so would (a) amount to acknowledging a flaw in our current theories and (b) hide from consideration insights which might enable us to expand, improve, and gain further understanding of existing theories...

    Much better to let nature set the limits, and for us to try to detect and understand them.
     
  7. Feb 4, 2010 #7

    DaveC426913

    Gold Member

    So how can particles transfer energy, or combine into other particles, if they can't ever come into contact? At some point, you're forced to point a microscope at that last 10^-25 cm gap.
     
  8. Feb 4, 2010 #8

    vld


    Yes, of course. If one neglects the structural complexity of the measuring and measured systems, then to interpret the results of observations one would require a theory containing an uncertainty constant, because those results will be affected by many factors (such as the sizes and shapes of the systems involved, their deformations as they interact with each other, etc.). Nothing can be measured accurately if the detailed structure of the measuring device is unknown.
     
  9. Feb 4, 2010 #9

    Fra


    Yes, then I guess either we take the position that

    1) the measuring device has to be certain; but then we are back at standard QM, where we need a "classical" measurement device in order to understand it. Note that the decoherence view is not a full solution here, since it violates the complexity bound of the given observer.

    or

    2) I think that in a general situation, the measuring device is not known with certainty, right? So perfect deductions are impossible. The question is whether we reject this as an unacceptable conclusion, or embrace it as a feature of reality, including its incompleteness, and try to make sense of an apparently self-organising but still partially undecidable reality, in which increasingly accurate measures emerge by evolving from uncertainty in an inductive sense.

    My impression is that most people reject undecidability (or dismiss it as a practical matter of no deep significance), because it's hard to make sense of science without deductive logic.

    /Fredrik
     
    Last edited: Feb 4, 2010