Why I am REALLY disappointed about string theory

  • #601
Let's compare the situation to QED.

In QED nobody ever proved that iron and copper do exist. Nevertheless, for various reasons, we simply believe that iron and copper exist "in QED". Their existence is taken for granted and we can calculate their properties (specific heat, conductivity, phonon spectrum, ...). We are rather successful with these calculations using effective models.

In string theory something like iron, copper, etc. has most likely been shown to exist. Starting with something like iron we can calculate certain properties (masses, coupling constants, ...) and again we are rather successful.

So the problem is not so much that we fail at the level of iron and copper, the problem is not that we cannot prove that iron and copper can exist. It seems that in that sense string theory is rather successful. Of course there are many technical details that have to be worked out, but that was the same in condensed matter physics as well.

The problem is that we seem to argue on the level of phonons, excitons etc. We are still on an effective level; we are not studying the fundamental theory. And we are not able to talk about the "environmental conditions" required for the existence of iron and copper. In QED we are (to a certain extent) able to specify the conditions required for the formation of certain substances; in nuclear physics we can even study / specify the conditions under which certain elements and isotopes are formed. In string theory we can't do that. We can specify certain selection principles (topological constraints, moduli stabilization, ...) which are necessary pre-conditions, but the true dynamical mechanism behind the scenes, the "vacuum selection", "vacuum tunneling" etc., cannot be addressed.

My conclusion is still that we lack fundamental d.o.f., background independence and an off-shell formalism - or perhaps something totally different.
 
  • #602
tom.stoer said:
My concluson is still that we lack fundamental d.o.f.

Do you really think this must exist? Why?

The conclusion I draw is that fundamental (i.e. fixed, eternal, observer-invariant) d.o.f. have no place in an intrinsic measurement theory, because such a thing isn't inferrable (measurable or observable by an inside observer).

My conclusion is that ultimate BI means that there doesn't even exist a background with fundamental d.o.f. This is why new thinking is needed. This is (IMHO) also why all there seems to be are sets of dualities, sets that moreover also seem to be unclear.

I think this is a real situation, the question is what it means and how to make sense of that.

(Of course my preferred answer is that we should see theories not as descriptions but as interaction tools; thus all theories evolve, and no theory can be described from the outside - rather, you can only describe it (from the outside) from the vantage point of another theory. This is the inferential perspective.)

/Fredrik
 
  • #603
Fra said:
(Of course my preferred answer is that we should see theories not as descriptions but as interaction tools; thus all theories evolve, and no theory can be described from the outside - rather, you can only describe it (from the outside) from the vantage point of another theory. This is the inferential perspective.)

But a way to construct a universe so that the inside people can make theories is to make the universe using some laws. I.e., if I were God, and I made a random universe with some inside observers, could I consistently enforce that the observers can learn at least locally useful laws even though there were none that I followed? If that is possible, then the outside view must, by assumption, fail. But if even God can't do that, then the outside view is justifiable from an inside view.
 
  • #604
smoit said:
The question above might as well be technical in its nature. Let's say we want to obtain 3+1 large space-time dimensions while having the rest of them compactified on a CY 3-fold.
Let us restrict to the cases where the large dimensions are either Minkowski or nearly de Sitter to avoid the cosmological solutions with a big crunch.

A priori, even if we assume such a compactification, we don't know if the compactified dimensions can actually remain compact until we find a reliable mechanism to stabilize all the moduli that parameterize the deformations of the internal metric. A canonical example is Type IIB flux compactifications, where fluxes only stabilize the complex structure moduli and the axio-dilaton while the Kahler moduli remain unfixed. Stabilizing the remaining moduli is paramount for keeping the internal manifold compact. However, this task is highly non-trivial. In order to fix the Kahler moduli one must satisfy certain topological conditions that determine the number of fermionic zero modes in the corresponding non-perturbative contributions to the superpotential, which is possible in principle but extremely hard to achieve in practice, especially when charged chiral matter is present at various intersections. In addition, there is something called the overshoot problem, which in the case of multiple Kahler moduli may become very severe. So, the bottom line is that in the vast majority of cases one cannot stabilize all the moduli by currently known mechanisms, because one cannot generate the potential due to the topological constraints.
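For readers who want the tree-level structure behind this, the textbook statement (standard Type IIB / Gukov-Vafa-Witten and KKLT-type expressions, not anything specific to this post) is that the flux superpotential simply does not depend on the Kahler moduli:

```latex
% Gukov-Vafa-Witten flux superpotential for Type IIB on a CY 3-fold X:
%   G_3 = F_3 - \tau H_3  depends on the axio-dilaton \tau,
%   \Omega  depends on the complex structure moduli,
%   and the Kahler moduli T_i do not appear at all.
W_{\rm flux} = \int_X G_3 \wedge \Omega , \qquad G_3 = F_3 - \tau H_3 .
% Hence fluxes fix \tau and the complex structure, while a potential for
% the T_i must be generated non-perturbatively, schematically
W = W_{\rm flux} + \sum_i A_i \, e^{-a_i T_i} ,
% and it is the existence of the instanton / gaugino-condensate terms
% A_i e^{-a_i T_i} that hinges on the zero-mode counting mentioned above.
```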

So, the next question would be, is it possible to stabilize all the moduli assuming a compactification down to 2+1 or 1+1 or even 0+1 dimensions?

This is not a conceptual but rather a technical question, which would require some new ideas. I personally don't know if it would be possible to have a stable compactification of, say, M-theory on a CY 5-fold or some toroidal orbifold so that all 10 spatial dimensions are compact but the vacuum energy is nearly zero. My guess is that it would be a really tough problem and it is quite possible that there are just not enough ingredients to generate a potential to stabilize all the moduli, in which case, some internal cycles will never be stabilized and will get as large as the corresponding dynamics allows them to get.

But wouldn't the conjecture of http://arxiv.org/abs/0906.0987 mean that there are some stable solutions in 6D too?
 
  • #605
atyy said:
But a way to construct a universe so that the inside people can make theories is to make the universe using some laws.

In some sense yes, that's a "possibility", but not a rational one, because the laws you refer to aren't inferrable, so it is IMHO in conflict with what I consider to be the most basic trait of an intrinsic measurement theory. The laws (constraints) are just put in there as structural-realism elements, i.e. without rational basis. Because rationality, as I see it, is ensured by counting evidence (i.e. interaction history), and for that you need an observer.

So in that view, the laws act as forcing constraints, relating the inside views. But these constraints are non-inferrable; they are assumed to exist as elements of structural realism.

This is a special case of the more general view I advocate. In the more general view, this view of yours corresponds to a special case where your global/external constraints/laws are replaced by inferrable constraints (which then of course are evolving), but then you consider an equilibrium situation where all inside observers, by being at equilibrium, are perfectly consistent with the unobservable constraint, although no inside observer can infer it.

In my view, the distinction here is extremely important conceptually. But I've learned that I am in the minority here. One argument against my view is that it implies loss of decidability. But the whole point is that if you construct deductions based on a non-inferred, assumed superstructure, this decidability is nevertheless irrational. All I try to do is acknowledge this fact, not deny it, and instead focus on rational induction, not irrational deductions just because they "look" more exact.

This again boils down to how you understand the laws of nature. Are they god-given forcing constraints, or are they rationally inferred action guides?

/Fredrik
 
  • #606
It would be good to keep the discussion in this thread centered at string theory.
 
  • #607
suprised said:
It would be good to keep the discussion in this thread centered at string theory.

I certainly didn't mean to divert anything, and there will be no expansion on those points from my side in this thread. On the contrary, I meant to put into context this issue of a theory of theories, and in what sense one can define measures on spaces of theories, which is at the heart of some of the open issues in ST.

I suppose we aren't only discussing ST from the perspective of an already committed string theorist, because then many of the interesting reflections get automatically shaved off, meaning the entire analysis of "what we think of ST" gets crippled.

So please continue; just consider what I said a parallel comment. I was not addressing technical details of strings, but rather the context in which those details live, because I see that some confusion starts there.

/Fredrik
 
  • #608
tom.stoer said:
Let's compare the situation to QED.

In QED nobody ever proved that iron and copper do exist. Nevertheless, for various reasons, we simply believe that iron and copper exist "in QED". Their existence is taken for granted and we can calculate their properties (specific heat, conductivity, phonon spectrum, ...). We are rather successful with these calculations using effective models.

In string theory something like iron, copper, etc. has most likely been shown to exist. Starting with something like iron we can calculate certain properties (masses, coupling constants, ...) and again we are rather successful.

So the problem is not so much that we fail at the level of iron and copper, the problem is not that we cannot prove that iron and copper can exist. It seems that in that sense string theory is rather successful. Of course there are many technical details that have to be worked out, but that was the same in condensed matter physics as well.

The problem is that we seem to argue on the level of phonons, excitons etc. We are still on an effective level; we are not studying the fundamental theory. And we are not able to talk about the "environmental conditions" required for the existence of iron and copper. In QED we are (to a certain extent) able to specify the conditions required for the formation of certain substances; in nuclear physics we can even study / specify the conditions under which certain elements and isotopes are formed. In string theory we can't do that. We can specify certain selection principles (topological constraints, moduli stabilization, ...) which are necessary pre-conditions, but the true dynamical mechanism behind the scenes, the "vacuum selection", "vacuum tunneling" etc., cannot be addressed.

My conclusion is still that we lack fundamental d.o.f., background independence and an off-shell formalism - or perhaps something totally different.

I am very sympathetic to this point of view, but I also want to offer a little bit of counterpoint.

Knowing the fundamental theory is often not that helpful for doing physics at energy scales well below the fundamental scale. And I think it's really important that this isn't just a technical problem; there is a lot of physics in going from high energy to low energy in a truly predictive way. "Ab initio" prediction of material properties in condensed matter physics is not something we're very good at. And importantly, it's not just a matter of needing slightly faster computers; simulating such systems is really beyond our current capabilities in a precise complexity-theory sense. There are deep questions here about how to organize information in a quantum many-body system that we still don't have a very good idea about.

I am willing to make the same claims for string theory. Of course, it would be great to have some off-shell formulation or whatever. It's bound to tell us something, for example, about transitions between different vacua. But I don't expect that such a discovery would reduce string theory to technical questions or tell us very much about the particular vacuum we happen to find ourselves in.
 
  • #609
atyy said:
But wouldn't the conjecture of http://arxiv.org/abs/0906.0987 mean that there are some stable solutions in 6D too?

Yeah, that's certainly possible because there are fewer moduli to worry about, but my point was to show that the more dimensions are compact, the more types of moduli one must stabilize and the more difficult the task of keeping the dimensions compact becomes. I was simply suggesting that there may be a bound on the number of dimensions that can remain compact, but until one sits down and starts calculating this is just speculation. The question is: suppose one starts with ALL spatial dimensions compact, what is the maximum number of dimensions that can possibly remain compact, i.e. such that the corresponding moduli can be dynamically fixed, while the rest of them have runaway directions? This is a technical question that one should be able to answer already, at least for the simple examples I suggested, without any background-independent formulation, etc. All these questions about vacuum selection ASSUME that one can obtain stable vacua, but my point is that vacuum stability, i.e. having a robust dynamical mechanism for keeping all the internal dimensions compact, may just as well be a possible selection principle in addition to some other ones.
 
  • #610
There is an interesting tension or difference in viewpoint here between Tom and Brian. Tom suggests adding a couple more weak points to Suprised's list of string-program "wrong steps". Actually these two are not "wrong steps" so much as they are "steps not taken".
Brian argues that perhaps they are not important steps to take, because how would they help us "do physics"?

tom.stoer said:
OK, thanks for the explanation. I think I have to study the relevant papers more carefully. But I think we should really add background dependence and the focus on on-shell formulations to your list ...

tom.stoer said:
Let's compare the situation to QED.
...
The problem is that we seem to argue on the level of phonons, excitons etc. We are still on an effective level, we are not studying the fundamental theory...

My conclusion is still that we lack fundamental d.o.f., background independence and an off-shell formalism - or perhaps something totally different.

Physics Monkey said:
I am very sympathetic to this point of view, but I also want to offer a little bit of counterpoint. ...
Knowing the fundamental theory is often not that helpful for doing physics at energy scales well below the fundamental scale.
... Of course, it would be great to have some off shell formulation or whatever. It's bound to tell us something, for example, about transitions between different vacua. But I don't expect that such a discovery would reduce string theory to technical questions or tell us very much about the particular vacua we happen to find ourselves in.

My comment is that there is an understandable human tendency to minimize the desirability of something one does not see how to get (as in the story of the fox and the grapes). But I think that one is missing half the fun if one does not develop a fundamental theory explaining why an effective means of calculation works.

The successful application of math techniques at low energy can then serve to increase confidence in the fundamental theory, and the fundamental theory, in turn, can lead to ideas about other areas of interest (like what happened at the start of expansion or what happens at the pit of an astrophysical collapse.)

A merely effective means-of-calculation, with no supporting foundational claim on reality, remains partly at the level of superstition ("we do it because it seems to work") and does not realize its full potential to extend our understanding. That's why I say you "miss half the fun."

That's why I'm inclined to agree with Tom that continued background dependence is a serious deficiency in the String program. I don't think we are criticizing the mathematical theories themselves, but if there is any disappointment it is with the research emphasis in the program - the prevailing direction, or lack of one.
 
  • #611
marcus said:
There is an interesting tension or difference in viewpoint here between Tom and Brian. Tom suggests adding a couple of more weakpoints to Suprised list of string program "wrong steps". Actually these two are not "wrong steps" as much as they are "steps not taken".
Brian argues that perhaps they are not important steps to take because how would they help us "do physics"?
Tension - not more - not less.

Given the Hamiltonian of QED and QCD we are neither able to predict the existence of water, nor are we able to calculate its triple point. Nevertheless we agree that having the Hamiltonians of QED and QCD (in terms of the fundamental degrees of freedom) is not completely useless :-)

There's another interesting tension in fundamental physics. One often hears that physics (even fundamental physics) is not about the "why?", not about "how nature really IS", but only about phenomenology, about experimentally falsifiable predictions. Nevertheless most of us try to understand "what nature really is and how it works"; most of us are interested in the fundamental laws. So there is a tension between "what we are really interested in" and "what we can predict". Even if we agree that all physical results are either predictions about experiments or experimental results, the driving factor in physics (science) is always the "why" and the "how does it really work". So even if we agree that we will never be able to answer these questions, we should not stop asking them!
 
  • #612
Again, aesthetic criteria may or may not be relevant. The whole point of having an off-shell formulation of string theory is to allow us to make certain calculations that we were not able to do in a simple manner in the usual framework (although not impossible in principle either).

However, it may not! When people studied string field theory, the hope was that it would do precisely what everyone has been hoping for in this thread. However, it was quickly discovered that it was in some sense less general than the on-shell formulations. It really only probed perturbative physics and missed the extended non-perturbative states! Further, it just made calculations horribly complicated. Sometimes that is just the way things happen, and it is far from obvious why that must be so.

Here is another semi-obvious aesthetic criterion! A theory ought to have an action principle. Again, it makes certain calculations and questions nice and simple, ties into other theories that we already know a lot about, and so on and so forth. However, it may not be the way the world works. Even if we didn't already know examples of theories that do not have Lagrangian descriptions, you could wave your hand and argue that it may simply not be necessary in formulating whatever final theory comes along.

I guess my point is that we should not confuse ought with is. Especially when we are dealing with a theory as rigid as String theory. You simply aren't allowed to tweak the structure.
 
  • #613
smoit said:
Yeah, that's certainly possible because there are fewer moduli to worry about, but my point was to show that the more dimensions are compact, the more types of moduli one must stabilize and the more difficult the task of keeping the dimensions compact becomes. I was simply suggesting that there may be a bound on the number of dimensions that can remain compact, but until one sits down and starts calculating this is just speculation. The question is: suppose one starts with ALL spatial dimensions compact, what is the maximum number of dimensions that can possibly remain compact, i.e. such that the corresponding moduli can be dynamically fixed, while the rest of them have runaway directions? This is a technical question that one should be able to answer already, at least for the simple examples I suggested, without any background-independent formulation, etc. All these questions about vacuum selection ASSUME that one can obtain stable vacua, but my point is that vacuum stability, i.e. having a robust dynamical mechanism for keeping all the internal dimensions compact, may just as well be a possible selection principle in addition to some other ones.

The problem I never see addressed (probably because there is no good answer), is what forces the theory to compactify some dimensions at all. I can't think of any convincing reason why the theory with maximal symmetry (in 10 or 11d) would not be a sweet spot for the theory to stay there. Somehow the opposite of a sweet spot seems to be required (no obvious susy in lower dimensions, no unbroken E8's, etc). Apart from anthropic reasoning, which bypasses this point, there is AFAIK no mechanism or principle known that would drive the theory away from its comfortable sweet spot into the ugly messy non-susy real world we observe.

I actually don't think there will ever be such a principle, at least in the framework developed so far. As said before, I toy with the idea that what we have discovered in terms of the many string vacua, just parametrizes the space of consistent theories that include gravity. By itself, this construct would not exhibit any preferred choice of vacuum etc. It may be another "wrong" prejudice that because string theory ought to be "complete", it would somehow pick the right vacuum for us.

As said before, perhaps string theory should simply be viewed as a generalization of Yang-Mills theory that includes gravity. Then in a similar sense that N=4 Yang Mills theory does not "predict" the standard model gauge theory, the 10/11d theories do not predict the standard model including gravity (although the latter can be consistently embedded via deformation or compactification).
 
  • #614
suprised said:
The problem I never see addressed (probably because there is no good answer), is what forces the theory to compactify some dimensions at all. I can't think of any convincing reason why the theory with maximal symmetry (in 10 or 11d) would not be a sweet spot for the theory to stay there. Somehow the opposite of a sweet spot seems to be required (no obvious susy in lower dimensions, no unbroken E8's, etc). Apart from anthropic reasoning, which bypasses this point, there is AFAIK no mechanism or principle known that would drive the theory away from its comfortable sweet spot into the ugly messy non-susy real world we observe.

I actually don't think there will ever be such a principle, at least in the framework developed so far. As said before, I toy with the idea that what we have discovered in terms of the many string vacua, just parametrizes the space of consistent theories that include gravity. By itself, this construct would not exhibit any preferred choice of vacuum etc. It may be another "wrong" prejudice that because string theory ought to be "complete", it would somehow pick the right vacuum for us.

As said before, perhaps string theory should simply be viewed as a generalization of Yang-Mills theory that includes gravity. Then in a similar sense that N=4 Yang Mills theory does not "predict" the standard model gauge theory, the 10/11d theories do not predict the standard model including gravity (although the latter can be consistently embedded via deformation or compactification).

What exactly is anthropic reasoning? In perhaps older views, it is that the initial conditions were what they were because they were what they were. But in the context of string theory, I've heard that the initial conditions were what they were because all initial conditions did in fact happen.
 
  • #615
suprised said:
Then in a similar sense that N=4 Yang Mills theory does not "predict" the standard model gauge theory, the 10/11d theories do not predict the standard model including gravity (although the latter can be consistently embedded via deformation or compactification).

But given some principle, or extra input, it could in principle predict the SM. And the same with string theory. I think it would be foolish to expect string theory to give all the Standard Model parameters without any extra input. Now this input can be the value of "N" in some SU(N) group, the fact that it's an SU(N) group we need, or the fact that we need group theory in the first place. There's no difference between these types of input. If you so desperately want a "unique" theory which predicts everything, you'd better get some no-go theorems on why you need a specific type of math. So in my view arguing about uniqueness will lead nowhere; the best we can and should do is just try to reduce the number of inputs required, one at a time.
 
  • #616
atyy said:
What exactly is anthropic reasoning? In perhaps older views, it is that the initial conditions were what they were because they were what they were. But in the context of string theory, I've heard that the initial conditions were what they were because all initial conditions did in fact happen.

An anthropic argument isn't really about initial conditions. One typically resorts to the anthropic principle when one is ignorant of the initial conditions or can't describe the vacuum state in terms of initial data for one reason or another.

One way to use the anthropic principle is the following. Imagine that we have a space of effective theories parameterized by some numbers p_i, which could typically involve coupling constants, but could be generalized to ranks of gauge groups, number of matter generations, etc. A theory is a point \vec{p} in the space of these coupling constants. If there is some energy function on the space of couplings, then one could refer to the space of absolute minima as the space of vacua.

Anthropic arguments place bounds on the p_i such that p^{(-)}_i < p_i < p^{(+)}_i. The reasoning would generally be that if a parameter was out of the specified range, the universe could not have the features that it does. For example, in http://prl.aps.org/abstract/PRL/v59/i22/p2607_1 Weinberg established an upper bound on the value of the cosmological constant from the requirement that gravitationally bound systems like galaxies were allowed to form. This is a relatively weak bound, while stronger bounds might be established in a general theory by requiring tight fits with the fine structure constant, electron mass, etc. Some values, like number of generations, would be fixed to specific values, rather than a range.

If the vacua depend on discrete values of the parameters (as in string theory), then it makes sense to ask what number of vacua lie within the anthropic bounds \mathcal{N}[(p^{(-)}_i ,p^{(+)}_i)]. In some sense, the success of the anthropic argument is reflected by the value of \mathcal{N}. One hopes that this number is small.

Initial conditions are only relevant if we have some way to compute the p_i in terms of the initial values p^{(0)}_i. In general this cannot be done for the vast majority of interesting string vacua.
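As a purely illustrative sketch of the counting step above (all parameter names, grids and bounds here are invented for illustration, not taken from any actual landscape survey), one can picture \mathcal{N} as the number of points of a discrete parameter grid falling inside the anthropic window:

```python
import numpy as np

# Hypothetical toy landscape: each "vacuum" is labelled by a pair of
# discrete parameters (n1, n2) -- think flux quanta -- with couplings
# p_i = n_i / 100.  The grid and the window are illustrative only.
n1, n2 = np.meshgrid(np.arange(101), np.arange(101), indexing="ij")
vacua = np.column_stack([n1.ravel(), n2.ravel()])  # 101*101 = 10201 vacua

# Anthropic window p_i^(-) < p_i < p_i^(+), stated on the integer labels
# to avoid floating-point edge cases: 0.40 < p1 < 0.45, 0.10 < p2 < 0.20.
lo = np.array([40, 10])
hi = np.array([45, 20])

# A vacuum is "anthropically allowed" if every parameter lies strictly
# inside the window; N plays the role of \mathcal{N} in the text.
inside = np.all((vacua > lo) & (vacua < hi), axis=1)
N = int(inside.sum())
frac = N / len(vacua)
print(N, frac)  # 4 allowed values of n1 times 9 of n2 -> N = 36
```

Real landscape statistics is of course vastly harder: the vacua do not form a uniform grid, and the map from flux data to low-energy couplings is not explicit.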
 
  • #617
This continues to be a remarkably deep and interesting thread. In a moment I will try to say what I think really disappoints ME. But first this quote:
suprised said:
The problem I never see addressed (probably because there is no good answer), is what forces the theory to compactify some dimensions at all. ...

I actually don't think there will ever be such a principle, at least in the framework developed so far. As said before, I toy with the idea that what we have discovered in terms of the many string vacua, just parametrizes the space of consistent theories that include gravity. By itself, this construct would not exhibit any preferred choice of vacuum etc. It may be another "wrong" prejudice that because string theory ought to be "complete", it would somehow pick the right vacuum for us.

As said before, perhaps string theory should simply be viewed as a generalization of Yang-Mills theory that includes gravity. Then in a similar sense that N=4 Yang Mills theory does not "predict" the standard model gauge theory, the 10/11d theories do not predict the standard model including gravity (although the latter can be consistently embedded via deformation or compactification).

As I've often said, I don't criticize the string math tools which the Program has developed, and I respect their potential for a variety of applications. My only criticism is of the unification Program. Perhaps research could be steered in harder, more interesting directions,...as some of Tom's and Suprised's posts have suggested... But this doesn't matter now. I'll say what I think is the main thing that disappoints me:

As preamble, I think physics is not about the world but is rather about our information about the world. Both Bohr and Einstein had thoughts along those lines.
It is about observations, predictions, measurements, events detected or not detected.

It is about the relations among these pieces of information. Notice that in any experiment the information is finite and the web of relationships is finite.

Long ago the physics community made a bold presumption that the mathematical object to base all this on was a MANIFOLD, or some closely related type of continuum.

The manifold has the property that it looks the same at all scales. As you zoom in it keeps looking like R^d.

This is fundamentally incompatible with the finiteness of information. We can never confirm that what we live in is a manifold, or any of the other structures based on manifolds which have been invented, such as fiber bundles and sheaves. (And didn't 't Hooft already in the 1990s suspect it was wrong?)

So I guess the main thing that disappoints me about the String research program is probably that it seems to depend so much on conventional mid-20th-century mathematics. All the stuff that math grad students were being told about circa 1960, plus or minus. Much of it manifold-based or familiar from the algebraic topology of that era. For one thing, that stuff has Baroque tendencies. It is not minimalist, and it is not focused on finite information/relationships.

I think it was Immanuel Kant who complained of "the accursed fertility of metaphysics", and I forget who it was who adapted that in speaking of the overwhelming Baroque fecundity of mid-20th-century diff. geometry/topology. I'd like to see some members of the community find their way into lean, minimalist mathematical surroundings...

If they ever do, then I'll make an informal prediction that within a short time we will hear testable predictions coming from them. It's just a hunch. Je geringer die Mittel, desto größer die Kunst. ("The smaller the means, the greater the art" - phonetically transcribed from something someone told me in conversation in the late 1950s.)
 
  • #618
Well, matrix theory showed that ultimately we'll get some sort of non-commutative geometry, so the usual smooth manifolds are clearly not enough. This geometry issue is equivalent to finding the "master string theory", if there is one, or at least better understanding what the fundamental degrees of freedom are in some of the theories.

And clearly the classical theory of geometry isn't enough to investigate everything about string theory. The reason is simple: classical geometry is based on "point-like sources", if you like. You compute distances between points, etc. Strings are two-dimensional, so they see things about geometry that we have missed so far. This is why strings lead to some interesting new insights about geometry, like mirror symmetry. So if you like, you can blame the lack of progress in string theory on mathematicians for being so biased and using only "point-like" tools to explore geometry.
 
  • #619
negru said:
Well matrix theory showed that ultimately we'll get some sort of non-commutative geometry, so the usual smooth manifolds are clearly not enough. This geometry issue is equivalent to finding the "master string theory" if there is one, or at least better understanding what the fundamental degrees of freedom are in some of the theories.

Does string theory have "true" noncommutative geometry, or can all noncommutative field theories in string theory be related to "normal" theories via a Seiberg-Witten map?
 
  • #620
suprised said:
The problem I never see addressed (probably because there is no good answer), is what forces the theory to compactify some dimensions at all. I can't think of any convincing reason why the theory with maximal symmetry (in 10 or 11d) would not be a sweet spot for the theory to stay there. Somehow the opposite of a sweet spot seems to be required (no obvious susy in lower dimensions, no unbroken E8's, etc). Apart from anthropic reasoning, which bypasses this point, there is AFAIK no mechanism or principle known that would drive the theory away from its comfortable sweet spot into the ugly messy non-susy real world we observe.
Well, actually one can significantly enhance the gauge symmetries by compactifying to lower dimensions. In fact, when comparing compactifications of F-theory on CY 3-folds with those on CY 4-folds, one can obtain much bigger gauge groups in the 4-fold case than in the 3-fold case, which in turn are much bigger than those for the K3 case. The same is true for the heterotic case, where one can also get many non-perturbative gauge groups in addition to the original E8×E8 in 10D.
So, the gauge symmetry can be tremendously enhanced in the process of compactification. We already know from, e.g. studying D-brane dynamics, that points of enhanced symmetry with extra light degrees of freedom are dynamically favored.
Likewise, at the string scale where some SUSY would be unbroken, the gauge symmetry enhancement might play some role in driving the theory from the non-compact and less gauge-symmetric phase to some more compact and more gauge-symmetric configuration.

The presence of these hidden sector gauge theories can be perfectly compatible with the low-energy physics we can probe. There might as well be a gazillion of such hidden sectors with large-rank non-Abelian gauge groups that are strongly coupled at some high scale due to their large beta function coefficients. These would be completely decoupled from the low-energy physics, but at high energy these light degrees of freedom would all be there. Take one of the unbroken E8s you mentioned, for example. It has a large quadratic Casimir (dual Coxeter number = 30) contributing to the beta function, which quickly drives such a theory to strong coupling unless the gauge coupling is really weak at the high scale. Assuming the corresponding gauge coupling is of the same order of magnitude as the GUT coupling, ~1/25, at low energies such an unbroken E8 would be completely invisible.
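The running behind that statement can be sketched at one loop. This is only an illustration: the pure-gauge one-loop formula is standard, but the high scale, the α ≈ 1/25 input, and the SU(3) comparison are assumptions added here, not taken from the post.

```python
import math

# One-loop pure Yang-Mills running: the coupling diverges at
#   Lambda_strong = M * exp(-8*pi^2 / (b0 * g(M)^2)),   b0 = (11/3) h_dual,
# where h_dual is the dual Coxeter number.  The inputs (alpha = 1/25 at
# M = 2e16 GeV) are illustrative assumptions, not derived values.
def strong_coupling_scale(h_dual_coxeter, alpha_M, M_gev):
    b0 = 11.0 / 3.0 * h_dual_coxeter
    g2 = 4.0 * math.pi * alpha_M          # g^2 from alpha = g^2 / 4*pi
    return M_gev * math.exp(-8.0 * math.pi**2 / (b0 * g2))

M = 2e16                                       # assumed high scale in GeV
e8 = strong_coupling_scale(30, 1.0 / 25, M)    # E8: dual Coxeter number 30
su3 = strong_coupling_scale(3, 1.0 / 25, M)    # SU(3) for comparison: 3
print(f"hidden E8 goes strong near {e8:.1e} GeV")   # just below M
print(f"SU(3) would take until     {su3:.1e} GeV")  # many decades lower
```

With these inputs the E8 confines only a factor of a few below the high scale, while an SU(3) with the same starting coupling would run for several more decades of energy; that is the sense in which the large beta function coefficient makes a hidden E8 invisible at low energies.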

The fact that SUSY as well as the GUT gauge group in the visible sector are broken at low energies does not mean that they don't get restored at the string scale and there is ample bottom-up evidence that that's what may actually happen.
 
Last edited:
  • #621
fzero said:
An anthropic argument isn't really about initial conditions. One typically resorts to the anthropic principle when one is ignorant of the initial conditions or can't describe the vacuum state in terms of initial data for one reason or another.

One way to use the anthropic principle is the following. Imagine that we have a space of effective theories parameterized by some numbers p_i, which could typically involve coupling constants, but could be generalized to ranks of gauge groups, number of matter generations, etc. A theory is a point \vec{p} in the space of these coupling constants. If there is some energy function on the space of couplings, then one could refer to the space of absolute minima as the space of vacua.

Anthropic arguments place bounds on the p_i such that p^{(-)}_i < p_i < p^{(+)}_i. The reasoning would generally be that if a parameter was out of the specified range, the universe could not have the features that it does. For example, in http://prl.aps.org/abstract/PRL/v59/i22/p2607_1 Weinberg established an upper bound on the value of the cosmological constant from the requirement that gravitationally bound systems like galaxies were allowed to form. This is a relatively weak bound, while stronger bounds might be established in a general theory by requiring tight fits with the fine structure constant, electron mass, etc. Some values, like number of generations, would be fixed to specific values, rather than a range.

If the vacua depend on discrete values of the parameters (as in string theory), then it makes sense to ask what number of vacua lie within the anthropic bounds \mathcal{N}[(p^{(-)}_i ,p^{(+)}_i)]. In some sense, the success of the anthropic argument is reflected by the value of \mathcal{N}. One hopes that this number is small.
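As a cartoon of the counting \mathcal{N}[(p^{(-)}_i, p^{(+)}_i)] above, one can scan a toy discrete landscape. Everything in the snippet (parameter names, ranges, bounds) is invented for illustration, not real string-landscape data.

```python
# Toy version of the counting: each "vacuum" is a point with discrete
# parameters.  All names, ranges and bounds are invented for illustration.
vacua = [(cc * 1e-3, ngen)
         for cc in range(-50, 51)   # a discretized "cosmological constant"
         for ngen in range(1, 7)]   # number of matter generations

# Anthropic window p_i^(-) < p_i < p_i^(+); a discrete parameter such as
# the number of generations is pinned to a single value rather than a range.
def in_window(p):
    cc, ngen = p
    return -0.002 < cc < 0.002 and ngen == 3

# N[(p^-, p^+)]: the number of vacua inside all the bounds at once.
N = sum(1 for p in vacua if in_window(p))
print(N)   # -> 3; the anthropic argument "succeeds" to the extent N is small
```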

Initial conditions are only relevant if we have some way to compute the p_i in terms of the initial values p^{(0)}_i. In general this cannot be done for the vast majority of interesting string vacua.

Let's see, I was thinking pretty much along the same lines except that I thought that each "initial condition" would pick out one vacuum. If that were the case, would that be a qualitatively similar understanding of what "anthropic" means? Also, is it technically not the case that specifying initial conditions picks out a vacuum (ie. is it too naive to reason by Newtonian analogy where initial conditions pick out the solution)?
 
  • #622
atyy said:
Let's see, I was thinking pretty much along the same lines except that I thought that each "initial condition" would pick out one vacuum. If that were the case, would that be a qualitatively similar understanding of what "anthropic" means? Also, is it technically not the case that specifying initial conditions picks out a vacuum (ie. is it too naive to reason by Newtonian analogy where initial conditions pick out the solution)?

Yes, you can expect that an initial condition will evolve to a specific vacuum, but generally many different initial conditions can lead to the same vacuum. In the case of string theory, the space of couplings is equivalent to the space of scalar field configurations. There is an effective potential on this space and the vacua are its critical points. A generic initial condition will evolve by rolling down this potential into one of the vacua; a given vacuum thus acts as a sink for whole families of initial conditions. In general we don't know how to compute this process in interesting situations.
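The "rolling down into a sink" picture can be sketched with a toy 1-d potential; the potential is invented for illustration and is not a string-theory effective potential, just the simplest one with more than one vacuum.

```python
import numpy as np

# A 1-d effective potential with two minima ("vacua"): V(x) = (x^2 - 1)^2.
# The shape is an invented illustration of the basin-of-attraction idea.
def dV(x):
    return 4.0 * x * (x**2 - 1.0)   # gradient of V

def settle(x0, steps=10000, dt=1e-3):
    """Overdamped rolling: follow -grad V from the initial condition x0."""
    x = x0
    for _ in range(steps):
        x -= dt * dV(x)
    return x

# Many different initial conditions flow to the same vacuum: every x0 > 0
# ends near x = +1 and every x0 < 0 near x = -1, so each vacuum is a
# "sink" for a whole family of initial conditions.
endpoints = {round(float(settle(x0)))
             for x0 in np.linspace(-2.0, 2.0, 41) if abs(x0) > 0.01}
print(sorted(endpoints))   # -> [-1, 1]: two vacua, 40 initial conditions
```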

In a more generic situation, we will still have RG flows from initial couplings to the vacuum configuration, which could be an RG fixed point or just an infinite flow into the IR. Ignoring tunneling effects, a given initial condition will evolve to a unique vacuum.

You can apply anthropic reasoning to the initial conditions if you have some sort of reliable way to connect initial conditions to the vacuum. I was describing how to apply the anthropic principle when you cannot.
 
  • #623
suprised said:
Somehow the opposite of a sweet spot seems to be required (no obvious susy in lower dimensions, no unbroken E8's, etc). Apart from anthropic reasoning, which bypasses this point, there is AFAIK no mechanism or principle known that would drive the theory away from its comfortable sweet spot into the ugly messy non-susy real world we observe.

How about seeking an explanation in terms of interacting strings, where you consider both the structure of the nodes (the compactified part) and the long-range communication channel (non-compactified spacetime)?

Generically (not string-specific) I expect that there should exist an equilibrium point where the interactions are more stable, with a given balance between internal structure (node structure) and communication-channel structure (large dimensions), so that the data flow from the input must support the internal complexity, or it will dissipate.

Then maybe the prediction could be a probability distribution on theory space, corresponding to the actual population of the theoretical theory space.

Has anyone tried that?

Probably then a first step would be to understand theory-theory interactions, i.e. what it means in terms of inference on strings, by strings: strings measured by other strings.

Isn't this a gigantic conceptual puzzle?

/Fredrik
 
  • #624
smoit said:
Well, actually one can significantly enhance the gauge symmetries by compactifying to lower dimensions.

Right, I remember that the largest gauge symmetries (Candelas et al.) had ranks of several hundred thousand. But apparently our world is not like that. Indeed hidden sectors would be a way around that, but since there isn't any concrete, experimental indication of these today, the idea that strings would seek a highly symmetric vacuum is not a very strong case right now; rather, all concrete evidence points in the opposite direction.

One also would still need to find some principle which would drive the theory to such a vacuum, or at least have some kind of statistical measure; but this seems to be hard in the current formulation of the theory. This is where an off-shell formulation could potentially help.
 
  • #625
negru said:
But given some principle, or extra input, it could in principle predict the SM. And the same with string theory. I think it would be foolish to expect string theory to give all the Standard Model parameters without any extra input.

Yepp. The question is what this extra input is - a frozen historical accident, some mathematical principle buried deep in string theory, dynamics of some underlying theory, some kind of evolution process in the multiverse (perhaps we landed on some kind of highly structured attractor fixed point?), or some completely new unknown principle or natural law.

All these ideas have been debated for years, so we can't expect to add anything new here. The reason for recalling this was the current topic of "wrong turns": the potential "wrong turn" or misconception was the belief that since strings are "complete", we should expect to find the standard model to pop out naturally if we just search carefully enough. I think that only a few people share this belief today.
 
  • #626
negru said:
Note that the work on ads4/cft3 is in large part motivated by applications of the high spin side to closed string field theory. The usual string theory with ever increasing masses could be coming from something else...where some symmetry is unbroken and all states are massless.
Can you say more about this?
 
  • #627
mitchell porter said:
Can you say more about this?

Ok, but note that it might just be a trivial speculation. About half the people I talked to said it might work, half said it can't.

So in ads4/cft3 you have the vector model on one side and Vasiliev's higher spin theory on the other. This is an interacting, fully consistent theory, which contains an infinite tower of massless particles with increasing spin. In particular it contains gravity so it's technically a valid theory of QG. It evades the Weinberg-Witten theorem because it has an infinite number of particles. It's pretty complicated, doesn't have a Lagrangian formulation yet etc.

Now in string theory you also have an infinite tower of particles with spin, but they get heavier and heavier. So if you could find a way to make all the higher states massless, you essentially (should) get back Vasiliev's theory. The biggest problem with this (and I guess why most people don't like the connection) is that in string theory you also have a huge proliferation of the number of states as you go up in spin, so you can't do a one-to-one matching with Vasiliev. I do think that this is just a trivial issue though, you could always possibly find some consistent truncation or something.
 
  • #628
These Vasiliev theories are pretty amazing. Among the people trying to make a connection to string theory, I found Sezgin, Sundell, Xi Yin. It's claimed that every free field theory has a Vasiliev theory as its holographic dual, so there may be a "landscape vs swampland" issue - which of these Vasiliev theories exist as limits or as truncations of string theory. Xi Yin says the relevant limit is the tensionless string limit, which makes me think of the tensionless strings living on the worldvolume of the M5-brane... I'm still trying to assimilate this development!
 
  • #629
Right, I wasn't clear, you can do the tensionless limit, but that's when you run into the state counting issue. There might be some other way to go backwards - break some symmetry and give mass to your strings. Don't know how much this was investigated though, it's just a comment I've often heard or read in various papers.

Anyway finding a connection between Vasiliev and string theory would be a huge achievement.
 
  • #630
For decades, people have been trying to unify gravity with the other forces. But how can one unify it if the thing does not exist? I suggest philosophers of science think about it.
 
  • #631
jaquecusto said:
For decades, people have been trying to unify gravity with the other forces. But how can one unify it if the thing does not exist? I suggest philosophers of science think about it.

What can philosophers possibly add to this? The few times I discussed with philosophers, I spent hours explaining to them time dilation, the geometry of curved space-time, the principles of GR, but they understood nothing. But they kept having ideas about the "very nature" of space and time, etc.
 
  • #632
jaquecusto said:
For decades, people have been trying to unify gravity with the other forces. But how can one unify it if the thing does not exist? I suggest philosophers of science think about it.

One possible viewpoint is that, if we see the laws of nature not as statements of what nature IS and WHY, but rather as statements of what we (or an observer generally) can rationally EXPECT from "nature", and WHY, then all laws of nature (seen from human observers) are already unified by the scientific method.

I.e. all our knowledge of nature, including its "laws", is the result of the inference we call the scientific method. All laws are inferred from feedback from nature (the observer's environment).

But what's lacking is a consistent inferential picture that applies to ANY observer, at any scale. For example, how is the inferential system scaled from the human perspective to the atomic scale?

We know exactly how we humans have inferred the laws of nature. The question is how a proton does the same. I.e. how does the proton "know" what laws to follow? Maybe it doesn't? What if it's just doing some constrained random walk? Could the constraint be the correspondence to the laws?

In particular one can ask: is there an observer (at some scale or complexity) that is unable to distinguish gravity from the other forces by any available inference? Then, there you have the unification.

So I'd say unification is the starting point. The question isn't how to unify the forces, the question from inference is how to separate them in a way that reproduces the broken symmetries we see. That's how I see it.

/Fredrik
 
  • #633
suprised said:
What can philosophers possibly add to this? The few times I discussed with philosophers, I spent hours explaining to them time dilation, the geometry of curved space-time, the principles of GR, but they understood nothing. But they kept having ideas about the "very nature" of space and time, etc.
:smile:
They sound charming. I'll bet the discussion was in German. Please tell us how to say "very nature" in whatever was the original language. I want to add this to my vocabulary.

Perhaps you should tell them that the very nature of space and time is nothing else than the measurements possible to make of the geometric relationships between events.

Or else you could hurl them into the river. For this to be possible there must be a river near where you live.
 
  • #634
Do you know Feynman's discussion with philosophers regarding "the true nature of an electron"? As a warm-up, and in order to understand the expectations of the philosophers, he first asked a simple question regarding the "true nature of a brick". He never got to the point of discussing the electron, as the philosophers were not able to finish the brick discussion ...
 
  • #635
tom.stoer said:
Do you know Feynman's discussion with philosophers regarding "the true nature of an electron"? As a warm-up, and in order to understand the expectations of the philosophers, he first asked a simple question regarding the "true nature of a brick". He never got to the point of discussing the electron, as the philosophers were not able to finish the brick discussion ...

Wonderful story! No, I hadn't heard it.
 
  • #636
tom.stoer said:
Do you know Feynman's discussion with philosophers regarding "the true nature of an electron"? As a warm-up, and in order to understand the expectations of the philosophers, he first asked a simple question regarding the "true nature of a brick". He never got to the point of discussing the electron, as the philosophers were not able to finish the brick discussion ...

I also know a little story about Feynman. After a conference, he angrily returned home and told his wife to remind him never to participate in discussions about gravity.:rolleyes:
 
  • #637
marcus said:
Wonderful story! No, I hadn't heard it.

Just read his book "surely you're joking, ..."
 
  • #638
Just a brief comment. I'm not sure if I was the only one making the distinction, but to me "philosophy of physics" is not really same thing as "philosophy of science".

Stuff like "true nature of electron" is not something I can imagine would be discussed within philosophy of science. An electron is not a fundamental concept there.

In PoS, one MIGHT raise questions such as the "nature of knowledge" or "information", the problem of induction (which Popper so famously wrote a whole book about and still failed to solve satisfactorily), etc.

/Fredrik
 
  • #639
negru said:
So in ads4/cft3 you have the vector model on one side and Vasiliev's higher spin theory on the other. ... It's pretty complicated, doesn't have a Lagrangian formulation yet etc.
Perhaps this is a stupid question: what does it exactly mean to have a theory w/o a Lagrangian formulation? What is the "definition" of such a theory?
 
  • #640
See John Schwarz's slides (http://www.ictp.it/media/101047/schwarzictp.pdf) regarding the M5-brane worldvolume theory.
 
Last edited by a moderator:
  • #641
mitchell porter said:
See John Schwarz's slides (http://www.ictp.it/media/101047/schwarzictp.pdf) regarding the M5-brane worldvolume theory.

Why is weak coupling the classical limit? Is that the same as hbar->0 ?

Are notions of classical limits for the SCFT and the bulk different? I thought these SCFTs were supposed to have classical bulk gravity at large N. How is it that it wasn't/isn't apparent that the SCFTs have classical limits?
 
Last edited by a moderator:
  • #642
atyy said:
Why is weak coupling the classical limit? Is that the same as hbar->0 ?

Are notions of classical limits for the SCFT and the bulk different? I thought these SCFTs were supposed to have classical bulk gravity at large N. How is it that it wasn't/isn't apparent that the SCFTs have classical limits?
There are some fundamental issues of quantum field theory here for which I have only an intuitive understanding. However:

I believe any classical limit corresponds to some version of hbar->0. But there can be inequivalent ways to do this, so a single quantum theory can have several, inequivalent classical limits. In fact, this is one way, or even the best way, to understand the meaning of the dualities in string theory - one quantum theory, several classical limits - but the phenomenon of multiple classical limits already exists in QFT.

Suppose we work with path-integral quantization. Under certain circumstances, the path integral will be dominated by the classical solutions to the equations of motion (i.e. the gradient of the amplitude in the space of histories will be flat in the vicinity of the classical solutions, so amplitudes for histories which are "almost classical" and clustered around a single classical solution will additively reinforce each other and make a large contribution to the total amplitude), so you can approximate the quantum theory as the classical theory plus fluctuations. I might have supposed that a theory which is "always strongly coupled" is one without such flat regions in the space of histories, so you can't apply perturbation theory, but then Schwarz later comments that for the 6D SCFT, people are "focus[ing] on the equations of motion", so I'm not sure. But even if I can't rigorously see why strong coupling implies no formulation of the quantum theory in terms of an action functional, it seems sensible that weak coupling does imply perturbative tractability and the existence of a classical limit.

The original example of an "always strongly coupled" non-Lagrangian QFT would be the "E6" superconformal field theory constructed by Minahan and Nemeschansky in the late 1990s. This actually shows up as the worldvolume theory of a D3-brane near an E6 singularity in F-theory; Heckman and Vafa have a paper about the phenomenological implications. The "T_n" theories mentioned by Schwarz were discovered by Gaiotto in his paper "N=2 dualities"; there's a review at http://arxiv.org/abs/0909.1327. Like the MN E6 theory, the T_n theories can also be coupled to other fields, becoming sectors in a larger theory.

I need to think for a while about the various limits in AdS/CFT before I can answer the second set of questions.
 
Last edited by a moderator:
  • #643
Fully agreed with mitchell. Some extra comments:

In string dualities one can have the phenomenon that quantum corrections and classical geometrical quantities can be exchanged between different formulations. That is, in one duality frame you may encounter quantum corrections to certain quantities at arbitrary loop order (genus of world-sheet), and in another frame these same expressions arise from the classical geometry of the compactification manifold. So there is no absolute notion as to what is quantum and what is classical.
 
  • #644
I still do not understand what "always strongly coupled" or "semiclassical limit" and "no Lagrangian formulation" have to do with each other. Perhaps this has something to do with "quantization of a Lagrangian" or "perturbative treatment" - which would be misleading.

In QCD you can write down a path integral based on quarks and gluons which is valid below Lambda-QCD and which does not require any (high energy) asymptotic freedom.
 
  • #645
It's basically as mitchell said. If the quantum theory has a classical regime, it means that the integrand of the path integral is very strongly peaked around the minimum of the action. The integrand is

e^{-\frac{S[\varphi]}{\hbar}}

The action will be proportional to the inverse of the coupling,

S \sim \frac{1}{g}

so for a classical regime to exist we need g \to 0, i.e. weak coupling. You can see that this is equivalent to taking \hbar \to 0 as well. Then

Z = \int D\varphi \, e^{-\frac{S[\varphi]}{\hbar}} \approx e^{-\frac{S[\varphi_{cl}]}{\hbar}}
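This weak-coupling argument can be checked in a zero-dimensional toy model, where the "path integral" is an ordinary integral. The action below is invented for illustration; the only property borrowed from the post is that it scales like 1/g, with its minimum ("classical solution") at φ_cl = 0.

```python
import numpy as np

# Zero-dimensional toy of Z = integral of exp(-S(phi)/hbar) d(phi).
# S = (phi^2 + phi^4/2)/g is an invented action scaling like 1/g.
def S(phi, g):
    return (phi**2 + 0.5 * phi**4) / g

def Z(g, hbar=1.0):
    phi = np.linspace(-5.0, 5.0, 200001)
    integrand = np.exp(-S(phi, g) / hbar)
    return integrand.sum() * (phi[1] - phi[0])   # simple Riemann sum

# Saddle-point (classical) estimate: Gaussian fluctuations around phi_cl = 0,
# Z ~ exp(-S[phi_cl]/hbar) * sqrt(2*pi*hbar / S''(phi_cl)), S'' = 2/g here.
def Z_saddle(g, hbar=1.0):
    return np.sqrt(2.0 * np.pi * hbar * g / 2.0)

# Weak coupling g -> 0 acts exactly like hbar -> 0: the integral becomes
# dominated by the classical solution, so the ratio tends to 1.
for g in (1.0, 0.1, 0.01):
    print(g, Z(g) / Z_saddle(g))
```

The printed ratio approaches 1 as g shrinks, which is the sense in which "weak coupling" and "classical regime" are the same statement here.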
 
  • #646
Regarding atyy's second set of questions:

For the original example of AdS/CFT (stack of D3-branes), the string theory and the gauge theory are perturbatively calculable in different parameter ranges. Quoting Argyres (http://www.springerlink.com/content/9p632240j7314480/, section 3.5):
the string theory on AdS5 x S5 is currently only really calculable in the classical supergravity limit where gs << 1 (so no string loops) and ls >> R (so no alpha' corrections). In terms of YM parameters this means that N >> lambda >> 1, which is the planar ’t Hooft limit, but at strong ’t Hooft coupling. On the other hand, the YM theory is only under perturbative control at small lambda and finite N. A great deal of the power of Maldacena’s conjecture comes not just from the fact that it is an explicit realization of the AdS/CFT conjecture, but also that weak coupling on one side of the equivalence is strong coupling on the other.
Because one is varying several parameters here - the rank of the gauge group (N), the coupling constant (g, where g_string = g_YM^2) - or even their product - lambda, the 't Hooft coupling, is g_string N - it can be hard to keep track of the relations between these limits. But maybe the important conceptual question, for the present discussion, is whether the existence of a calculable framework on one side of the duality implies the existence of a "classical limit" on the other side of the duality. I suppose the answer is "yes", but to visualize or comprehend this limit, you have to use the variables on the other side of the duality.
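The parameter bookkeeping in the Argyres quote can be put into a few lines. This is only a sketch: the numerical thresholds standing in for "<< 1" and ">> 1" are arbitrary choices, not part of the duality.

```python
# Which side of the D3-brane duality is calculable?  Following the Argyres
# quote: classical supergravity needs g_s << 1 and lambda = g_s * N >> 1;
# perturbative Yang-Mills needs lambda << 1.  Thresholds are arbitrary.
def regimes(g_s, N):
    lam = g_s * N            # 't Hooft coupling (g_s = g_YM^2)
    out = []
    if g_s < 0.1 and lam > 10:
        out.append("classical supergravity in the bulk")
    if lam < 0.1:
        out.append("perturbative gauge theory on the boundary")
    return out or ["neither side perturbative"]

print(regimes(0.01, 10000))  # large N, strong 't Hooft coupling: bulk side
print(regimes(0.01, 1))      # small lambda: boundary side
print(regimes(0.5, 10))      # in between: neither expansion is reliable
```

The point of the sketch is that the two "calculable" conditions exclude each other: no choice of (g_s, N) lands in both regimes at once, which is exactly the weak/strong nature of the duality.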

So maybe the best initial answer to atyy's challenge - how can these SCFTs not have classical limits when they are dual to classical supergravity in the bulk - is that, yes, these SCFTs do have classical limits where they "shouldn't", but the only classical characterization of those limits is precisely in terms of the dual, bulk variables (which can all be defined by the right combinations of operators from the boundary theory). You wouldn't be able to see it if you were just looking at the "original" variables.

Now, returning to John Schwarz's talk, the three primordial examples of AdS/CFT are for D3-branes, M2-branes, and M5-branes. In every case, you have a stack of coincident branes with a worldvolume theory that is decoupled from space-time far from the branes (think of the causally disconnected regions that can show up in Penrose diagrams), and the worldvolume theory is equivalent to string theory in an AdS space. For strong gravitational back-reaction, such that the branes form an event horizon, AdS is the actual near-horizon geometry. For weak gravitational back-reaction, such that the branes sit in flat space, the AdS space seems to exist as a manifestation of energy scale in the worldvolume theory, akin to Guifre Vidal's MERA construction (but this is one of the conceptual issues that is still being worked out). So whether you're at weak or strong 't Hooft coupling, AdS is there.

For D3-branes, I quoted Argyres on how perturbative limits exist in two distinct parameter ranges that lie on opposite sides of the duality. As Schwarz says, for some time it was believed that the worldvolume theories for M2- and M5-branes were non-Lagrangian, meaning that there was no perturbative formulation. Following the discussion in previous comments, this would have meant that the only classical limit for this quantum theory was in terms of bulk variables, like supergravity. But ABJM showed that the M2-brane worldvolume theory was a super-Chern-Simons theory, and the inverse Chern-Simons level 1/k was able to play the role of the Yang-Mills coupling, so there's a 't Hooft coupling for these theories, N/k.

The remaining question is whether the M5-brane worldvolume theory also has a perturbative formulation, or whether it really is non-Lagrangian. ("Always strongly coupled" must mean that the parameter, which you might have wanted to use for a perturbation expansion, remains large at every energy scale - there's nothing like asymptotic freedom.) And this is the subject of ongoing research.
 
Last edited by a moderator:
  • #647
tom.stoer said:
Perhaps this is a stupid question: what does ít exactly mean to have a theory w/o Lagrangian formulation? What is the definition" of such a theory?

Some theories are just defined in terms of their scattering amplitudes, or S-matrix, without any Lagrangian formulation. This is pretty familiar from d=2. For example, it has been claimed way back that the SU(2) WZW model with E7 modular invariant does not have any Lagrangian description. I am not sure whether this statement is still considered true or not; I didn't follow the lit.

The many dualities that have been discovered may in fact teach us to think in this direction more seriously again. They imply that Lagrangian formulations can be ambiguous and may blur the view of the essence of a theory, which is its scattering amplitudes.

A canonical example is given again by the Ising model: it can be realized in terms of free fermions, psi, or bosons, in terms of which psi = exp(i phi). The bosons are periodic, and thus can be viewed as a compactified dimension. But it can be misleading to give a deeper significance to this "extra dimension". If one were to just study the S-matrix, one would not fall into the trap of attributing a higher-than-deserved significance to a particular Lagrangian representation of the theory.
 
  • #648
Finbar said:
It's basically as mitchell said. If the quantum theory has a classical regime, it means that the integrand of the path integral is very strongly peaked around the minimum of the action. The integrand is

e^{-\frac{S[\varphi]}{\hbar}}

The action will be proportional to the inverse of the coupling,

S \sim \frac{1}{g}

so for a classical regime to exist we need g \to 0, i.e. weak coupling. You can see that this is equivalent to taking \hbar \to 0 as well. Then

Z = \int D\varphi \, e^{-\frac{S[\varphi]}{\hbar}} \approx e^{-\frac{S[\varphi_{cl}]}{\hbar}}
This seems to be wrong! Look at gravity. Perhaps GR is indeed the classical limit of some (yet to be identified) theory of quantum gravity, i.e. h=0. But of course this is NOT equivalent to G=0; GR does exist at non-zero G and we are all happy with that.
 
  • #649
tom.stoer said:
This seems to be wrong! Look at gravity. Perhaps GR is indeed the classical limit of some (yet to be identified) theory of quantum gravity, i.e. h=0. But of course this is NOT equivalent with G=0; GR does exist at non-zero G and we are all happy with that.

Perhaps he is thinking of GR as a field on flat spacetime?
 
  • #650
Hopefully not!

GR is much more than that. There are indications that QG (quantized GR) could be consistent for G>0 but inconsistent for G=0. G=0 could be interesting for toy models but totally irrelevant for nature.
 