# Interview with a Mathematical Physicist: John Baez Part 1

We are proud to introduce you to Mathematical Physicist and PF member John Baez!

**Give us some background on yourself.**

I’m interested in all kinds of mathematics and physics, so I call myself a mathematical physicist. But I’m a math professor at the University of California in Riverside. I’ve taught here since 1989. My wife Lisa Raphals got a job here nine years later: among other things, she studies classical Chinese and Greek philosophy.

I got my bachelor’s degree in math at Princeton. I did my undergrad thesis on whether you can use a computer to solve Schrödinger’s equation to arbitrary accuracy. In the end, it became obvious that you can. I was really interested in mathematical logic, and I used some in my thesis—the theory of computable functions—but I decided it wasn’t very helpful in physics. When I read the magnificently poetic last chapter of Misner, Thorne and Wheeler’s *Gravitation*, I decided that quantum gravity was the problem to work on.

I went to math grad school at MIT, but I didn’t find anyone to work with on quantum gravity. So, I did my thesis on quantum field theory with Irving Segal. He was one of the founders of “constructive quantum field theory”, where you try to rigorously prove that quantum field theories make mathematical sense and obey certain axioms that they should. This was a hard subject, and I didn’t accomplish much, but I learned a lot.

I got a postdoc at Yale and switched to classical field theory, mainly because it was something I could do. On the side I was still trying to understand quantum gravity. String theory was bursting into prominence at the time, and my life would have been easier if I’d jumped onto that bandwagon. But I didn’t like it, because most of the work back then studied strings moving on a fixed “background” spacetime. Quantum gravity is supposed to be about how the geometry of spacetime is variable and quantum-mechanical, so I didn’t want a theory of quantum gravity set on a pre-existing background geometry!

I got a professorship at U.C. Riverside based on my work on classical field theory. But at a conference on that subject in Seattle, I heard Abhay Ashtekar, Chris Isham and Renate Loll give some really interesting talks on loop quantum gravity. I don’t know why they gave those talks at a conference on classical field theory. But I’m sure glad they did! I liked their work because it was background-free and mathematically rigorous. So I started work on loop quantum gravity.

Like many other theories, quantum gravity is easier in low dimensions. I became interested in how category theory lets you formulate quantum gravity in a universe with just 3 spacetime dimensions. It amounts to a radical new conception of space, where the geometry is described in a thoroughly quantum-mechanical way. Ultimately, space is a quantum superposition of “spin networks”, which are like Feynman diagrams. The idea is roughly that a spin network describes a virtual process where particles move around and interact. If we know how likely each of these processes is, we know the geometry of space.

Loop quantum gravity tries to do the same thing for full-fledged quantum gravity in 4 spacetime dimensions, but it doesn’t work as well. Then Louis Crane had an exciting idea: maybe 4-dimensional quantum gravity needs a more sophisticated structure: a “2-category”.

I had never heard of 2-categories. Category theory is about things and processes that turn one thing into another. In a 2-category we also have “meta-processes” that turn one *process* into another.

I became very excited about 2-categories. At the time I was so dumb I didn’t consider the possibility of 3-categories, and 4-categories, and so on. To be precise, I was more of a mathematical physicist than a mathematician: I wasn’t trying to develop math for its own sake. Then someone named James Dolan told me about n-categories! That was a real eye-opener. He came to U.C. Riverside to work with me. So I started thinking about n-categories in parallel with loop quantum gravity.

Dolan was technically my grad student, but I probably learned more from him than vice versa. In 1996 we wrote a paper called “Higher-dimensional algebra and topological quantum field theory”, which might be my best paper. It’s full of grandiose guesses about n-categories and their connections to other branches of math and physics. We had this vision of how everything fit together. It was so beautiful, with so much evidence supporting it, that we knew it had to be true. Unfortunately, at the time nobody had come up with a good *definition* of n-category, except for n < 4. So we called our guesses “hypotheses” instead of “conjectures”. In math a conjecture should be something utterly precise: it’s either true or not, with no room for interpretation.

By now, I think everybody more or less believes our hypotheses. Some of the easier ones have already been turned into theorems. Jacob Lurie, a young hotshot at Harvard, improved the statement of one and wrote a 111-page outline of a proof. Unfortunately he still used some concepts that hadn’t been defined. People are working to fix that, and I feel sure they’ll succeed.

Anyway, I kept trying to connect these ideas to quantum gravity. In 1997, I introduced “spin foams”. These are structures like spin networks, but with an extra dimension. Spin networks have vertices and edges. Spin foams also have 2-dimensional faces: imagine a foam of soap bubbles.

The idea was to use spin foams to give a purely quantum-mechanical description of the geometry of spacetime, just as spin networks describe the geometry of space. But mathematically, what we’re doing here is going from a category to a 2-category.

By now, there are a number of different theories of quantum gravity based on spin foams. Unfortunately, it’s not clear that any of them really work. In 2002, Dan Christensen, Greg Egan and I did a bunch of supercomputer calculations to study this question. We showed that the most popular spin foam theory at the time gave dramatically different answers than people had hoped for. I think we more or less killed that theory.

That left me rather depressed. I don’t enjoy programming: indeed, Christensen and Egan did all the hard work of that sort on our paper. I didn’t want to spend years sifting through spin foam theories to find one that works. And most of all, I didn’t want to end up as an old man still not knowing if my work had been worthwhile! To me n-category theory was clearly the math of the future—and it was easy for me to come up with cool new ideas in that subject. So, I quit quantum gravity and switched to n-categories.

But this was very painful. Quantum gravity is a kind of “holy grail” in physics. When you work on that subject, you wind up talking to lots of people who believe that unifying quantum mechanics and general relativity is the most important thing in the world, and that nothing else could possibly be as interesting. You wind up believing it. It took me years to get out of that mindset.

Ironically, when I quit quantum gravity, I felt free to explore string theory. As a branch of math, it’s really wonderful. I started looking at how n-categories apply to string theory. It turns out there’s a wonderful story here: briefly, particles are to categories as strings are to 2-categories, and all the math of particles can be generalized to strings using this idea! I worked out a bit of this story with Urs Schreiber and John Huerta.

Around 2010, I felt I had to switch to working on environmental issues and math related to engineering and biology, for the sake of the planet. That was another painful renunciation. But luckily, Urs Schreiber and others are continuing to work on n-categories and string theory, and doing it better than I ever could. So I don’t feel the need to work on those things anymore—indeed, it would be hard to keep up. I just follow along quietly from the sidelines.

It’s quite possible that we need a dozen more good ideas before we really get anywhere on quantum gravity. But I feel confident that n-categories will have a role to play. So, I’m happy to have helped push that subject forward.

**Your uncle, Albert Baez, was a physicist. How did he help develop your interests?**

He had a huge effect on me. He’s mainly famous for being the father of the folk singer Joan Baez. But he started out in optics and helped invent the first X-ray microscope. Later he became very involved in physics education, especially in what were then called third-world countries. For example, in 1951 he helped set up a physics department at the University of Baghdad.

When I was a kid he worked for UNESCO, so he’d come to Washington D.C. and stay with my parents, who lived nearby. Whenever he showed up, he would open his suitcase and pull out amazing gadgets: diffraction gratings, holograms, and things like that. And he would explain how they work! I decided physics was the coolest thing there is.

When I was eight, he gave me a copy of his book *The New College Physics: A Spiral Approach*. I immediately started trying to read it. The “spiral approach” is a great pedagogical principle: instead of explaining each topic just once, you should start off easy and then keep spiraling from topic to topic, examining them in greater depth each time. So he not only taught me physics, he taught me about how to learn and how to teach.

Later, when I was fifteen, I spent a couple weeks at his apartment in Berkeley. He showed me the Lawrence Hall of Science, which is where I got my first taste of programming—in BASIC, with programs stored on paper tape. This was in 1976. He also gave me a copy of *The Feynman Lectures on Physics*. And so, the following summer, when I was working at a state park building trails and such, I was also trying to learn quantum mechanics from the third volume of *The Feynman Lectures*. The other kids must have thought I was a complete geek—which of course I was.

**Give us some insight on what your average work day is like.**

During the school year I teach two or three days a week. On days when I teach, that’s the focus of my day: I try to prepare my classes starting at breakfast. Teaching is lots of fun for me. Right now I’m teaching two courses: an undergraduate course on game theory and a graduate course on category theory. I’m also running a seminar on category theory. In addition, I meet with my graduate students for a four-hour session once a week: they show me what they’ve done, and we try to push our research projects forward.

On days when I don’t teach, I spend a lot of time writing. I love blogging, so I could easily do that all day, but I try to spend a lot of time writing actual papers. Any given paper starts out being tough to write, but near the end it practically writes itself. At the end, I have to tear myself away from it: I keep wanting to add more. At that stage, I feel an energetic glow at the end of a good day spent writing. Few things are so satisfying.

During the summer I don’t teach, so I can get a lot of writing done. I spent two years doing research at the Centre for Quantum Technologies, which is in Singapore, and since 2012 I’ve been working there during summers. Sometimes I bring my grad students, but mostly I just write.

I also spend plenty of time doing things with my wife, like talking, cooking, shopping, working out at the gym. We like to watch TV shows in the evening, mainly mysteries and science fiction.

We also do a lot of gardening. When I was younger that seemed boring—but as you get older, subjective time speeds up, so you pay more attention to things like plants growing. There’s something tremendously satisfying about planting a small seedling, watching it grow into an orange tree, and eating its fruit for breakfast.

I love playing the piano and recording electronic music, but doing it well requires big blocks of time, which I don’t always have. Music is pure delight, and if I’m not listening to it I’m usually composing it in my mind.

If I gave in to my darkest urges and became a decadent wastrel I might spend all day blogging, listening to music, recording music and working on pure math. But I need other things to stay sane.

**What research are you working on at the moment?**

Lately I’ve been trying to finish a paper called “Struggles with the Continuum”. It’s about the problems physics has with infinities, due to the assumption that spacetime is a continuum. At certain junctures this paper became psychologically difficult to write, since it’s supposed to include a summary of quantum field theory, which is a complicated and sprawling subject. So, I’ve resorted to breaking this paper into blog articles and posting them on Physics Forums, just to motivate myself.

Purely for fun, I’ve been working with Greg Egan on some projects involving the octonions. The octonions are a number system where you can add, subtract, multiply and divide. Such number systems only exist in 1, 2, 4, and 8 dimensions: you’ve got the real numbers, which form a line, the complex numbers, which form a plane, the quaternions, which are 4-dimensional, and the octonions, which are 8-dimensional. The octonions are the biggest, but also the weirdest. For example, multiplication of octonions violates the associative law: ##(xy)z## is not equal to ##x(yz)##. So the octonions sound completely crazy at first, but they turn out to have fascinating connections to string theory and other things. They’re pretty addictive, and if I became a decadent wastrel I would spend a lot more time on them.
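Since ##(xy)z \ne x(yz)## for octonions, here is a small self-contained sketch (my own illustration, not anything from Baez and Egan's work) that builds the octonions by the standard Cayley-Dickson doubling construction and checks the failure of associativity numerically:

```python
# Cayley-Dickson construction: numbers are nested pairs, built up as
# reals -> complexes -> quaternions -> octonions.

def conj(x):
    return (conj(x[0]), neg(x[1])) if isinstance(x, tuple) else x

def neg(x):
    return (neg(x[0]), neg(x[1])) if isinstance(x, tuple) else -x

def add(x, y):
    return (add(x[0], y[0]), add(x[1], y[1])) if isinstance(x, tuple) else x + y

def mul(x, y):
    if not isinstance(x, tuple):
        return x * y
    a, b = x
    c, d = y
    # doubling rule: (a,b)(c,d) = (ac - d*b, da + bc*)
    return (add(mul(a, c), neg(mul(conj(d), b))),
            add(mul(d, a), mul(b, conj(c))))

def nest(v):
    # pack a flat list of 2^k reals into nested pairs
    return v[0] if len(v) == 1 else (nest(v[:len(v) // 2]), nest(v[len(v) // 2:]))

def basis(t, n):
    v = [0] * n
    v[t] = 1
    return nest(v)

# Quaternions (4 dimensions) are still associative...
i, j, k = (basis(t, 4) for t in (1, 2, 3))
assert mul(mul(i, j), k) == mul(i, mul(j, k))

# ...but octonions (8 dimensions) are not: (e1 e2) e4 differs from e1 (e2 e4).
e1, e2, e4 = (basis(t, 8) for t in (1, 2, 4))
lhs, rhs = mul(mul(e1, e2), e4), mul(e1, mul(e2, e4))
print(lhs == rhs)        # False
print(lhs == neg(rhs))   # True: the two products differ by a sign
```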

There’s a concept of “integer” for the octonions, and the integral octonions form a lattice, a repeating pattern of points, in 8 dimensions. This is called the ##E_8## lattice. There’s another lattice that lives in 24 dimensions, called the “Leech lattice”. Both are connected to string theory. Notice that 8+2 equals 10, the dimension superstrings like to live in, and 24+2 equals 26, the dimension bosonic strings like to live in. That’s not a coincidence! The 2 here comes from the 2-dimensional world-sheet of the string.
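As a quick sanity check (my own addition, not from the interview): the ##E_8## lattice has exactly 240 shortest nonzero vectors, and in the usual coordinates they can be enumerated directly as the vectors of norm-squared 2 with either all-integer or all-half-integer coordinates:

```python
from itertools import combinations, product

# Integer-coordinate roots: +-1 in two positions, 0 elsewhere.
int_roots = []
for i, j in combinations(range(8), 2):
    for si, sj in product((1, -1), repeat=2):
        v = [0] * 8
        v[i], v[j] = si, sj
        int_roots.append(tuple(v))

# Half-integer roots: every coordinate +-1/2, with an even number of minus signs.
half_roots = [v for v in product((0.5, -0.5), repeat=8)
              if sum(1 for x in v if x < 0) % 2 == 0]

print(len(int_roots), len(half_roots), len(int_roots) + len(half_roots))
# 112 128 240
```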

Since 3×8 is 24, Egan and I became interested in how you could build the Leech lattice from 3 copies of the ##E_8## lattice. People already knew a trick for doing it, but it took us a while to understand how it worked—and then Egan showed you could do this trick in exactly 17,280 ways! I want to write up the proof. There’s a lot of beautiful geometry here.

There’s something really exhilarating about struggling to reach the point where you have some insight into these structures and how they’re connected to physics.

My main work, though, involves using category theory to study networks. I’m interested in networks of all kinds, from electrical circuits to neural networks to “chemical reaction networks” and many more. Different branches of science and engineering focus on different kinds of networks. But there’s not enough communication between researchers in different subjects, so it’s up to mathematicians to set up a unified theory of networks.

I’ve got seven grad students working on this project—or actually eight, if you count Brendan Fong: I’ve been helping him on his dissertation, but he’s actually a student at Oxford.

Brendan was the first to join the project. I wanted him to work on electrical circuits, which are a nice familiar kind of network, a good starting point. But he went much deeper: he developed a general category-theoretic framework for studying networks. We then applied it to electrical circuits, and other things as well.

Blake Pollard is a student of mine in the physics department here at U. C. Riverside. Together with Brendan and me, he developed a category-theoretic approach to Markov processes: random processes where a system hops around between different states. We used Brendan’s general formalism to reduce Markov processes to electrical circuits. Now Blake is going further and using these ideas to tackle chemical reaction networks.
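As a toy illustration of the kind of process meant here (a hypothetical sketch of a plain Markov chain, not the category-theoretic formalism from the Baez–Fong–Pollard work): a Markov process hops randomly between states, and iterating the transition matrix drives any starting distribution toward a stationary one.

```python
# Toy discrete-time Markov process on three states. Rows give the
# probabilities of hopping from the current state to each next state.
P = [[0.9, 0.05, 0.05],
     [0.1, 0.8,  0.1],
     [0.2, 0.2,  0.6]]

def step(dist, P):
    """Push a probability distribution one step forward in time."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]          # start surely in state 0
for _ in range(200):
    dist = step(dist, P)

# After many steps the distribution is (approximately) stationary:
# one more step barely changes it.
print([round(x, 3) for x in dist])
```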

My other students are in the math department at U. C. Riverside. Jason Erbele is working on “control theory”, a branch of engineering where you try to design feedback loops to make sure processes run in a stable way. Control theory uses networks called “signal flow diagrams”, and Jason has worked out how to understand these using category theory.
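To make "feedback loops that keep a process stable" concrete, here is a hypothetical toy (my own example, unrelated to Jason's signal-flow-diagram work; all names and numbers are made up): an unstable scalar system stabilized by proportional feedback.

```python
# Plant: x' = a*x + u with a > 0, so without control x blows up.
# Controller: u = k*(setpoint - x), i.e. push against the error.
a, k, setpoint, dt = 0.5, 2.0, 1.0, 0.01

x = 0.0
for _ in range(2000):           # simulate 20 time units with Euler steps
    u = k * (setpoint - x)      # feedback signal computed from the error
    x += (a * x + u) * dt       # closed loop: x' = (a - k)*x + k*setpoint

# Pure proportional control stabilizes the system but leaves a steady-state
# error: x converges to k*setpoint/(k - a) = 4/3, not to the setpoint 1.
print(round(x, 4))  # 1.3333
```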

Jason isn’t using Brendan’s framework: he’s using a different one, called PROPs, which were developed a long time ago for use in algebraic topology. My student Franciscus Rebro has been developing it further, for use in our project. It gives a nice way to describe networks in terms of their basic building blocks. It also illuminates the similarity between signal flow diagrams and Feynman diagrams! They’re very similar, but there’s a big difference: in signal flow diagrams the signals are classical, while Feynman diagrams are quantum-mechanical.

My student Brandon Coya has been working on electrical circuits. He’s sort of continuing what Brendan started, and unifying Brendan’s formalism with PROPs.

My student Adam Yassine is starting to work on networks in classical mechanics. In classical mechanics you usually consider a single system: you write down the Hamiltonian, you get the equations of motion, and you try to solve them. He’s working on a setup where you can take lots of systems and hook them up into a network.

My students Kenny Courser and Daniel Cicala are digging deeper into another aspect of network theory. As I hinted earlier, a category is about things and processes that turn one thing into another. In a 2-category we also have “meta-processes” that turn one process into another. We’re starting to bring 2-categories into network theory.

For example, you can use categories to describe an electrical circuit as a process that turns some inputs into some outputs. You put some currents in one end and some currents come out the other end. But you can also use 2-categories to describe “meta-processes” that turn one electrical circuit into another. An example of a meta-process would be a way of simplifying an electrical circuit, like replacing two resistors in series by a single resistor.
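The series-resistor simplification mentioned above is easy to state as code (a trivial sketch of the rewrite itself, not of the 2-categorical machinery; the parallel rule is my own added example):

```python
# Two resistors in series behave like one resistor of summed resistance;
# in parallel, the conductances (reciprocals) add instead.
def series(r1, r2):
    return r1 + r2

def parallel(r1, r2):
    return (r1 * r2) / (r1 + r2)

print(series(100.0, 220.0))    # 320.0 ohms
print(parallel(100.0, 100.0))  # 50.0 ohms
```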

Ultimately I want to push these ideas in the direction of biochemistry. Biology seems complicated and “messy” to physicists and mathematicians, but I think there must be a beautiful logic to it. It’s full of networks, and these networks change with time. So, 2-categories seem like a natural language for biology.

It won’t be easy to convince people of this, but that’s okay.

**Continue to Part 2 of this interview**

I’m a mathematical physicist. I work at the math department at U. C. Riverside in California, and also at the Centre for Quantum Technologies in Singapore. I used to do quantum gravity and n-categories, but now I mainly work on network theory and the Azimuth Project, which is a way for scientists, engineers and mathematicians to do something about the global ecological crisis.

Brilliant insight into your life, John! Are you planning to return to the Centre for Quantum Technologies this year? Also, I imagine you have the most interesting dinner discussions with your wife. Philosophy and Physics :)

Thanks for giving me the chance to think about where I've been and where I'm going! I'll go back to the Centre for Quantum Technologies again at the end of June. My wife has been working at the Philosophy Department at NUS while I've been working there. We think this will be our last summer working in Singapore: we've had a good run of it, but we're ready to try something new. Frankly I'd be happy to stay home and tend to our garden—I always get nervous about the plants when I'm gone during the hot summers in Riverside! But I suspect we'll probably go to Europe next summer.

This article uses an E8 projection which I introduced to Wikipedia in Feb of 2010 here. Technically, it is E8 projected to the E6 Coxeter plane. The projection uses X and Y basis vectors of:

X = {-Sqrt[3] + 1, 0, 1, 1, 0, 0, 0, 0}
Y = {0, Sqrt[3] - 1, -1, 1, 0, 0, 0, 0}

This results in vertex overlaps of:

24 Red, each with 1 overlap
24 Orange, each with 8 overlaps (192 vertices)
1 Yellow with 24 overlaps (24 vertices)

This was subsequently recreated in the current article and the 4_21 E8 WP page by Tom Ruen, source for the article. After doing this for a few example symmetries, Tom took my idea of projecting higher-dimensional objects to the 2D (and 3D) symmetries of lower-dimensional subgroups and ran with it in 2D, producing a ton of visualizations across WP. :-)

http://theoryofeverything.org/theToE/2016/03/16/e8-in-e6-petrie-projection/

I'm not sure, but I think Shyan's question was an attempt at humor (I laughed anyway). But if he was asking seriously, the reason is that John was describing the conditions for "normed division algebras". The 1, 2, 4 and 8 dimensions are due to the 2^n nature of the Cayley-Dickson construction of the algebras. Add to this that, given the reference to "division" algebras, the sequence ends at dimension 8 with the octonions, because the next level up (the 16-dimensional sedenions) contains "zero divisors", which prevent division in those cases. For my list of those zeros, see: https://en.wikipedia.org/wiki/Sedenion

Hmm, biochemistry. How about neurobiology? There seem to be networks there too.

I think it's awesome that we have some prominent scientists and mathematicians here on PF.

Do you by any chance know Helmer Aslaksen? I would think so.

No, I don’t know him. He [URL=’http://www.math.nus.edu.sg/aslaksen/’]seems like a cool guy[/URL], but I haven’t been hanging out in the math department at NUS.

Cool! Maybe you mean E6 projection? I deliberately chose an image that was [URL=’https://en.wikipedia.org/wiki/File:4_21_t0_E6.svg’]in the public domain[/URL] so I wouldn’t have to give attributions, but thanks for helping come up with it!

The 72 vertex and 720 edge E6 (as a subgroup of E8) in the E6 Coxeter plane (a Petrie projection) produces vertex overlaps of:

24 Yellow with 1 overlap

24 Orange each with 2 overlaps (48 vertices)

with none at the origin.

(which I have now included here:

[URL]http://theoryofeverything.org/theToE/2016/03/16/e8-in-e6-petrie-projection/[/URL])

See this work by Stembridge on this as well: [URL]http://www.math.lsa.umich.edu/~jrs/coxplane.html[/URL]

Taking all 240 E8 vertices and 6720 edges produces a similar but different result (as you used in the article).

BTW – great article!

oh, and a minor point… it is my understanding that citation (author name and link if online) is still required for WP/Wikimedia Commons content. (Of course, in this case the work was Tom Ruen’s: he simply took my basis vectors and produced the image using his own source code, rather than using the Mathematica tool I use and offer in the public domain.)

When someone puts something in the public domain, that means anyone can do anything they want with it. If you read [URL=’https://en.wikipedia.org/wiki/File:4_21_t0_E6.svg’]the page for this image on Wikicommons[/URL] you’ll see it says:

One reason I like Tom Ruen’s work so much is that he puts it into the public domain, thus freeing it up for worldwide use without any need for attribution. It’s a gift to the universe.

But of course, if Mr. X tries to put Mr. Y’s copyrighted work into the public domain, Mr. Y can argue with that.

I stand corrected – it seems Tom does (at least for some images) change the default file upload CC Share-Alike Commons to “Public Domain”.

In checking though, he does have others that have the default (as, AFAIK, mine do), which does require attribution. So I guess we need to double-check each image.

Don’t get me wrong, I am ok w/putting my stuff in public domain w/o attribution requirements. In using the defaults I assumed proper etiquette was to cite WP sources as a matter of course.

e.g. [URL]https://commons.wikimedia.org/wiki/File:Flower_of_life_triangular_11547-arccircle.svg[/URL]

[SIZE=5][B]Licensing[/B][/SIZE]

I, the copyright holder of this work, hereby publish it under the following license: This file is licensed under the [URL=’https://en.wikipedia.org/wiki/en:Creative_Commons’]Creative Commons[/URL] [URL=’https://creativecommons.org/licenses/by-sa/4.0/deed.en’]Attribution-Share Alike 4.0 International[/URL] license.

You are free:

[LIST]
[*]to share – to copy, distribute and transmit the work
[*]to remix – to adapt the work
[/LIST]

Under the following conditions:

[LIST]
[*]attribution – You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).
[*]share alike – If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one.
[/LIST]

By the way, thanks for explaining how many vertices overlap in this 2d projection of the E8 root polytope!

I remember reading somewhere that this is somehow because of the fact that [URL=’https://en.wikipedia.org/wiki/Hairy_ball_theorem’]”every cow must have at least one cowlick”[/URL], but there wasn’t much explanation! Can anybody give a clue?

Also, interesting interview, thanks!

It has nothing to do with that. The “cow” theorem is about poles in odd-dimensional spaces.

Here’s the relation. A sphere in n-dimensional space can have at least one continuous nowhere-vanishing vector field if and only if n = 2, 4, 6, 8, … A sphere in n-dimensional space can have (n-1) linearly independent continuous vector fields if n = 1, 2, 4, or 8.
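For the easiest even case, n = 2: on the circle in the plane, rotating each point's position vector by 90 degrees gives a continuous, nowhere-vanishing tangent vector field. A quick numerical check (my own illustration):

```python
import math

# Sample points p on the unit circle S^1 in R^2; the field v(p) = (-y, x)
# is the position vector rotated by 90 degrees.
for step in range(8):
    t = 2 * math.pi * step / 8
    p = (math.cos(t), math.sin(t))
    v = (-p[1], p[0])
    dot = p[0] * v[0] + p[1] * v[1]
    assert abs(dot) < 1e-12            # orthogonal to p, hence tangent
    assert v[0] ** 2 + v[1] ** 2 > 0.5 # nowhere vanishing (unit length)

print("S^1 carries a nowhere-vanishing tangent field")
```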

People know, for a sphere of any dimension, the maximum number of linearly independent continuous vector fields on it. See:

[LIST]

[*][URL=’https://en.wikipedia.org/wiki/Vector_fields_on_spheres’]Vector fields on spheres[/URL], Wikipedia.

[/LIST]

These results subsume everything I just said, and more.

You’re welcome!

My concept of biology is so broad that it includes biochemistry, neurobiology, ecology and more. The network formalisms I’m developing are so general that they should have some relevance to all of these topics… though of course it’d take expertise to develop any one particular application to the point of doing something useful!

I hope I’m not beating a dead cow here, but….


The “relation” is as John pointed out in the WP reference, yet the “cause and effect” seem to be reversed in Shyan’s comment. That is: the 1,2,4 and 8 limit on the dimensionality of numbers systems where you can add, subtract, multiply and divide is not “because of” the number of linearly independent continuous vector fields (or their number of hairy ball cowlicks).

It is the octonion limit on the number of dimensions (n=8) for normed division algebras (due to the sedenions having zero divisors in the Cayley-Dickson construction) that constrains n<9. The number of linearly independent continuous vector fields, tied to the modulo-8 periodicity of Clifford algebras, is related yet dependent, rather than being a cause of any lack of "normed division" capability.
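To see those zero divisors concretely, here is a self-contained sketch (my own illustration) that builds the 16-dimensional sedenions by Cayley-Dickson doubling and exhibits two nonzero sedenions whose product is zero. The particular pair below, (e1 + e10)(e4 - e15), depends on the doubling convention used in the code.

```python
# Cayley-Dickson doubling: reals -> complexes -> quaternions -> octonions
# -> sedenions. At 16 dimensions, zero divisors appear.

def conj(x):
    return (conj(x[0]), neg(x[1])) if isinstance(x, tuple) else x

def neg(x):
    return (neg(x[0]), neg(x[1])) if isinstance(x, tuple) else -x

def add(x, y):
    return (add(x[0], y[0]), add(x[1], y[1])) if isinstance(x, tuple) else x + y

def mul(x, y):
    if not isinstance(x, tuple):
        return x * y
    a, b = x
    c, d = y
    # doubling rule: (a,b)(c,d) = (ac - d*b, da + bc*)
    return (add(mul(a, c), neg(mul(conj(d), b))),
            add(mul(d, a), mul(b, conj(c))))

def nest(v):
    # pack a flat list of 2^k reals into nested pairs
    return v[0] if len(v) == 1 else (nest(v[:len(v) // 2]), nest(v[len(v) // 2:]))

def basis(t, n=16):
    v = [0] * n
    v[t] = 1
    return nest(v)

e1, e4, e10, e15 = (basis(t) for t in (1, 4, 10, 15))
x = add(e1, e10)         # nonzero sedenion
y = add(e4, neg(e15))    # nonzero sedenion

print(mul(x, y))  # nested zeros: the product vanishes although x, y are nonzero
```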

Please! Tell me your opinion about it: Hypercomplex_numbers_and_their_application

https://drive.google.com/open?id=0B6SDyydw3n_oZnNhZG00WkJlNXc