
Insights Why Higher Category Theory in Physics? - Comments

  1. Mar 27, 2018 #21

    Urs Schreiber

    User Avatar
    Science Advisor
    Gold Member

    For readers interested in the topic of the above PF Insights:

    The British Physics Research Council and the London Mathematical Society are funding a Symposium on
    Higher Structures in M-Theory
    this August at Durham.

    Unfortunately there are no funds left to support travel or accommodation, but if you are interested and have the means to get there, you should be welcome.


    [Image: symposium poster]
     
  2. Apr 3, 2018 #22

    Fra

    User Avatar

    For someone not actively into this, coming from that angle, the Insight article is dense and refers to plenty of even denser papers, but even so I really enjoy seeing Urs's passion and red thread of argument! I am sure this is not effortlessly conveyed to physicists who do not have the strong mathematical inclination that Urs has.

    While I can't claim to have digested the Insight article in any detail, I just make associations from my perspective when trying to understand Urs's suggestion about the mathematical tools for understanding the theory of theories we end up with in physics. I don't know what Urs thinks about these associations, but they aren't mathematical statements; they are just supposed to be a conceptual bridge between the higher category abstractions and the abstractions of spontaneous information processing that I use.

    So Urs, does my association make any sense from your formal technical perspective? In order not to derail your thread here, I will keep the comments minimal.

    In my view, there is something we can call "objects" that correspond to physical structures and encode information about their own environment. We can also call these information processing agents, and we can associate them with matter.

    Then the set of these objects can morph into other objects, in the same way we recode information. In these transformations anything can change, even topology.

    Without going into details, loosely speaking this is the basis for a category, right? Then if we consider that the set of morphisms (which can be understood as a set of possible computation programs) is actually evolving, and is restricted by the structure of the objects, we here get a higher category, as the morphisms themselves result from another process. (I see things as starting from permutations of discrete states.) And the set of morphisms gets richer and richer the more complex the objects get (gaining bigger information capacity, or mass generation).

    What I envision here is that the "order" of the categories will essentially be a dynamical process, except one without external description, so it is better described as an evolutionary process.

    These are abstractions I am working with as well, but to be honest I do not yet know which mathematics will in the end turn out to be the right thing. And I see at least a possibility that this can be described as higher categories as well. But unlike you, I am not convinced that the mathematical angle itself is the right "guide". My own internal guidance is much more intuitive, and based on a vision of interacting "computer codes". But it might well be that this converges to something that is characterised by higher categories.

    /Fredrik
     
  3. Apr 3, 2018 #23

    Urs Schreiber

    User Avatar
    Science Advisor
    Gold Member

    The modern theory of computation is secretly essentially the same as category theory. This remarkable confluence has been called computational trinitarianism.

    This has more recently been reinforced by the understanding that the foundations of computation is in fact a foundation for homotopy theory; this insight is now known as homotopy type theory.
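    To give a flavour of the propositions-as-types reading behind this confluence, here is a minimal sketch in Lean 4 (my own illustration, not from the discussion): a proposition is a data type, and a proof is a program inhabiting that type.

    ```lean
    -- Computational trinitarianism in miniature:
    -- a proposition is a type, and a proof is a program of that type.

    -- Conjunction is a product type: a proof of A ∧ B is a pair of proofs,
    -- and swapping the pair proves commutativity.
    example (A B : Prop) : A ∧ B → B ∧ A :=
      fun ⟨a, b⟩ => ⟨b, a⟩

    -- Implication is a function type: a proof of A → B is a program
    -- transforming proofs of A into proofs of B; composing programs
    -- proves transitivity of implication.
    example (A B C : Prop) : (A → B) → (B → C) → (A → C) :=
      fun f g a => g (f a)
    ```

    The same terms can be read three ways: as logical proofs, as programs, and as morphisms in a category.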

    I recommend this introduction:
     
  4. Apr 3, 2018 #24

    Fra

    User Avatar

    Thanks, I will try to get around to checking out that paper.

    Just to take a huge jump ahead of things: regardless of the value of a developed branch of mathematics that can bring some order and structure into some of the crazy things going on at the foundations of theoretical physics, I mainly wonder whether, once the correspondence is established, there is a nice wealth of theorems that can immediately be "translated" into conjectures about the marriage of the Standard Model of particle physics, which IMO "lives" in and depends on a rigid classical frame of reference, with the "inside views" that are implied either by Earth-based cosmological theories, or implicit in understanding how forces are unified at high energies?

    For example, would the mathematical machinery of higher category theory provide a physicist with any brilliant shortcuts to understanding unification? I somehow would not expect things to be that nice, but perhaps you see this differently?

    /Fredrik
     
  5. Apr 3, 2018 #25

    Urs Schreiber

    User Avatar
    Science Advisor
    Gold Member

    There are theorems that indicate that this is the case. I must have mentioned this before:
     
    Last edited: Apr 3, 2018
  6. Apr 4, 2018 #26

    Urs Schreiber

    User Avatar
    Science Advisor
    Gold Member

  7. Apr 4, 2018 #27

    Fra

    User Avatar

    As I am not a fan of getting lost in details before you know you're in the right forest, may I ask you another "way ahead of things" question? (Maybe the answer is in your references, though.)

    There is no question that I see the abstraction here, where one can describe theories, relations between theories, and theories about theories in the more abstract way of higher categories. But in my view, and in terms of the computational picture of interacting "information processing agents", the computational capacity and memory capacity must put constraints on how complex the "theory of theory" can be while still being computable. After all, from the point of view of an information processing agent simply trying to survive in a black-box environment, an algorithm that is too complex to run on the limited hardware in question (and can't be scaled down) is useless; it will get the agent "killed".

    What I am trying to say is: what prevents this n-category from just inflating into a turtle tower of an infinity-category? And how do you attach such complexity to experimental contact? After all, this is what I see as the problem so far. You can ALWAYS inflate a theory and create a bigger theory. But my hunch is that there is a physical cutoff (relating to the observer's mass) that must fix the maximum complexity here, and thus for any specific case we should find some kind of maximal n?

    Now, does the n-cat machinery really provide any insight to THIS point?

    /Fredrik
     
  8. Apr 5, 2018 #28

    Urs Schreiber

    User Avatar
    Science Advisor
    Gold Member

    Remarkably, homotopy theory is a "smaller theory": it arises from classical theory by removing axioms from classical logic. This is the fantastic insight of homotopy type theory.

    The story goes like so: In the 70s logician Per Martin-Löf comes up with the modern foundation of computer science, now known as intuitionistic type theory (where "type" is short for "data type", such as "Boolean" or "Integer".) This type theory is an absolute minimum of logic, as Martin-Löf lays out very enjoyably here:

    Per Martin-Löf,
    "On the Meanings of the Logical Constants and the Justifications of the Logical Laws",
    Nordic Journal of Philosophical Logic, 1(1): 11–60, 1996,
    (pdf)

    Moreover, this type theory is fully constructive, meaning roughly that it regards everything as a computer program.

    In particular it thus regards assertions of equality as computer programs: The assertion ##x = y## is to be understood as a computer program ##\gamma_1 ## which checks that indeed ##x## equals ##y##. This brings with it the curious possibility that there can be another program ##\gamma_2## which also proves that ##x## equals ##y##.

    At first Martin-Löf distrusts his own theory, feeling that this would be weird, and imposes an ad hoc extra rule (the axiom of uniqueness of identity proofs) saying that any two such computer programs proving ##x = y## must in fact be equal. But eventually it is realized that such an ad hoc rule is awkward and breaks various nice properties of the system. Hence it is removed again, sticking with the minimal theory.

    But this minimal theory now has a maximally rich behaviour: To ask whether the two programs ##\gamma_1## and ##\gamma_2## are equal or not, one now needs to invoke yet another program ##\kappa## which checks ##\gamma_1 = \gamma_2##.

    [Image: diagram of a 2-cell ##\kappa## between the proofs ##\gamma_1## and ##\gamma_2##]

    And again there may be two different such programs, ##\kappa_1## and ##\kappa_2##, and to tell whether they are equal, we need to invoke yet another computer program

    [Image: diagram of a 3-cell between the 2-cells ##\kappa_1## and ##\kappa_2##]

    And so ever on. (Graphics taken from Higher Structures in Mathematics and Physics.)

    For decades nobody in computer science knew what to make of this. Then suddenly there was a little revolution, when it was realized that this minimal theory of computation with its curious rich inner structure is secretly the formal computer language for homotopy theory and higher category theory: it automatically regards data types as homotopy types. Ever since, Per Martin-Löf's type theory has been called homotopy type theory.
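    The tower of identity types can be written down directly. Here is a sketch in Lean 4, whose type theory descends from Martin-Löf's, with names chosen to match the discussion. (One caveat: Lean's `Prop` universe is proof-irrelevant, so uniqueness of identity proofs actually holds there; homotopy type theory works in a universe without that axiom, but the syntax of the tower is the same.)

    ```lean
    -- Iterated identity types: two proofs γ₁, γ₂ of x = y are themselves
    -- terms, and asking whether they agree is again a typing question,
    -- one level up — and so on, ad infinitum.

    -- The type of proofs κ that γ₁ equals γ₂:
    example (x y : Nat) (γ₁ γ₂ : x = y) : Prop :=
      γ₁ = γ₂

    -- And again, one level higher: the type of proofs that κ₁ equals κ₂.
    example (x y : Nat) (γ₁ γ₂ : x = y) (κ₁ κ₂ : γ₁ = γ₂) : Prop :=
      κ₁ = κ₂
    ```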

    This is a new foundation of mathematics rooted in computer science and flourishing into higher category theory. It may be argued that it also provides foundations for modern physics; see Modern Physics formalized in Modal Homotopy Type Theory.
     
    Last edited: Apr 5, 2018
  9. Apr 5, 2018 #29

    Fra

    User Avatar

    It is indeed interesting to note how what you say is in itself an abstraction that is analogous to the observer problem! It is a totally different but intuitive way to raise questions that, just by replacing labels, look similar to yours. This is cool indeed!

    For example, observer equivalence (being at the heart of physical gauge theory) may ask: how do we prove that observers' inferences (which I indeed view as special computations) are as consistent as we think they "must be"? This IMO hits at the heart of the problems of fundamental physics. Like Per's reaction, some are tempted to put this in as an axiom! Which translates to saying there must be observer-invariant eternal physical laws.

    However, I think this is a fallacy, and it is remarkably analogous to what Smolin calls the cosmological fallacy.

    The resolution is that the only way to "compare" inferences between two observers is to let them interact AND have a third observer judge.

    Thanks for enriching the forum with great things!
    /Fredrik
     
  10. Apr 7, 2018 #30
  11. Apr 7, 2018 #31

    stevendaryl

    User Avatar
    Staff Emeritus
    Science Advisor

    Well, larger versus smaller or simpler versus more complex depends on how you measure things. You can understand classical mathematics as being a simplification of constructive mathematics, in which you toss out distinctions. (such as the distinction between a proof of ##A## and a proof of ##\neg \neg A##)
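    That tossed-out distinction is easy to exhibit concretely; a quick sketch in Lean 4 (my own illustration): one direction of the double-negation equivalence is constructively provable, while the converse needs a classical axiom.

    ```lean
    -- Constructively, a proof of A yields a proof of ¬¬A:
    -- given a : A and na : ¬A, applying na to a produces False.
    example (A : Prop) : A → ¬¬A :=
      fun a na => na a

    -- The converse ¬¬A → A is NOT derivable constructively; in Lean it
    -- follows only by invoking the classical axiom byContradiction.
    example (A : Prop) : ¬¬A → A :=
      fun nna => Classical.byContradiction nna
    ```

    Classical logic identifies the two propositions; constructive mathematics keeps them apart, which is exactly the finer structure that homotopy type theory exploits.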
     