This Week's Finds in Mathematical Physics (Week 236)

In summary, This Week's Finds in Mathematical Physics (Week 236) by John Baez discusses some papers about categorification and quantum mechanics. It also delves into ordinals and the levels of infinity they represent: well-ordered sets and their corresponding ordinals are explained, leading up to the first infinite ordinal, omega, and then to higher ordinals such as omega squared and omega to the power of omega. The discussion ends with a comparison to driving across South Dakota and the seemingly never-ending ordinals.
  • #1
John Baez
[SOLVED] This Week's Finds in Mathematical Physics (Week 236)

Also available at http://math.ucr.edu/home/baez/week236.html

July 26, 2006
This Week's Finds in Mathematical Physics (Week 236)
John Baez

This week I'd like to catch you up on some papers about
categorification and quantum mechanics.

But first, since it's summer vacation, I'd like to take you on
a little road trip - to infinity. And then, for fun, a little
detective story about the history of the icosahedron.

Cantor invented two kinds of infinities: cardinals and ordinals.
Cardinals are more familiar. They say how big sets are. Two sets
can be put into 1-1 correspondence iff they have the same number of
elements - where this kind of "number" is a cardinal.

But today I want to talk about ordinals. Ordinals say how big
"well-ordered" sets are. A set is well-ordered if it's linearly
ordered and every nonempty subset has a smallest element.

For example, the empty set

{}

is well-ordered in a trivial sort of way, and the corresponding
ordinal is called

0.

Similarly, any set with just one element, like this:

{0}

is well-ordered in a trivial sort of way, and the corresponding
ordinal is called

1.

Similarly, any set with two elements, like this:

{0,1}

becomes well-ordered as soon as we decree which element is bigger;
the obvious choice is to say 0 < 1. The corresponding ordinal is
called

2.

Similarly, any set with three elements, like this:

{0,1,2}

becomes well-ordered as soon as we linearly order it; the obvious
choice here is to say 0 < 1 < 2. The corresponding ordinal is called

3.

Perhaps you're getting the pattern - you've probably seen these
particular ordinals before, maybe sometime in grade school.
They're called finite ordinals, or "natural numbers".

But there's a cute trick they probably didn't teach you then:
we can *define* each ordinal to *be* the set of all ordinals
less than it:

0 = {} (since no ordinal is less than 0)
1 = {0} (since only 0 is less than 1)
2 = {0,1} (since 0 and 1 are less than 2)
3 = {0,1,2} (since 0, 1 and 2 are less than 3)

and so on. It's nice because now each ordinal *is* a
well-ordered set of the size that ordinal stands for.
And, we can define one ordinal to be "less than or equal" to
another precisely when it's a subset of the other.
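
If you like seeing this concretely, here's a quick sketch in Python -
just my own illustration, nothing official - of the trick for finite
ordinals: each one is literally the set of all smaller ones, and
"less than or equal" is literally "subset".

def ordinal(n):
    # the finite von Neumann ordinal n: the set of all smaller ordinals
    return frozenset(ordinal(k) for k in range(n))

three, five = ordinal(3), ordinal(5)
print(len(three), len(five))   # 3 5 - each ordinal has that many elements
print(three <= five)           # True - "3 <= 5" becomes "subset of"
print(five <= three)           # False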

Now, what comes after all the finite ordinals? Well,
the set of all finite ordinals is itself well-ordered:

{0,1,2,3,...}

So, there's an ordinal corresponding to this - and it's the first
*infinite* ordinal. It's usually called omega. Using the cute
trick I mentioned, we can actually define

omega = {0,1,2,3,...}

Now, what comes after this? Well, it turns out there's a
well-ordered set

{0,1,2,3,...,omega}

containing the finite ordinals together with omega, with the
obvious notion of "less than": omega is bigger than the rest.
Corresponding to this set there's an ordinal called

omega+1

As usual, we can simply define

omega+1 = {0,1,2,3,...,omega}

(At this point you could be confused if you know about cardinals,
so let me throw in a word of reassurance. The sets omega and
omega+1 have the same "cardinality", but they're different as
ordinals, since you can't find a 1-1 and onto function between
them that *preserves the ordering*. This is easy to see, since
omega+1 has a biggest element while omega does not.)

Now, what comes next? Well, not surprisingly, it's

omega+2 = {0,1,2,3,...,omega,omega+1}

Then comes

omega+3, omega+4, omega+5,...

and so on. You get the idea.

What next?

Well, the ordinal after all these is called omega+omega.
People often call it "omega times 2" or "omega 2" for short. So,

omega 2 = {0,1,2,3,...,omega,omega+1,omega+2,omega+3,...}

What next? Well, then comes

omega 2 + 1, omega 2 + 2,...

and so on. But you probably have the hang of this already, so
we can skip right ahead to omega 3.

In fact, you're probably ready to skip right ahead to omega 4,
and omega 5, and so on.

In fact, I bet now you're ready to skip all the way to
"omega times omega", or "omega squared" for short:

omega^2 =

{0,1,2,...,omega,omega+1,omega+2,...,omega 2,omega 2+1,omega 2+2,...}

It would be fun to have a book with omega pages, each page half
as thick as the previous page. You can tell a nice long story
with an omega-sized book. But it would be even more fun to have
an encyclopedia with omega volumes, each being an omega-sized book,
each half as thick as the previous volume. Then you have omega^2
pages - and it can still fit in one bookshelf!

What comes next? Well, we have

omega^2+1, omega^2+2, ...

and so on, and after all these come

omega^2+omega, omega^2+omega+1, omega^2+omega+2, ...

and so on - and eventually

omega^2 + omega^2 = omega^2 2

and then a bunch more, and then

omega^2 3

and then a bunch more, and then

omega^2 4

and then a bunch more, and more, and eventually

omega^2 omega = omega^3.

You can probably imagine a bookcase containing omega encyclopedias,
each with omega volumes, each with omega pages, for a total of
omega^3 pages.

I'm skipping more and more steps to keep you from getting bored.
I know you have plenty to do and can't spend an *infinite* amount
of time reading This Week's Finds, even if the subject is infinity.

So, if you don't mind me just mentioning some of the high points,
there are guys like omega^4 and omega^5 and so on, and after all
these comes

omega^omega.

And then what?

Well, then comes omega^omega + 1, and so on, but I'm sure
that's boring by now. And then come ordinals like

omega^omega 2,..., omega^omega 3, ..., omega^omega 4, ...

leading up to

omega^omega omega = omega^{omega + 1}

Then eventually come ordinals like

omega^omega omega^2, ..., omega^omega omega^3, ...

and so on, leading up to:

omega^omega omega^omega = omega^{omega + omega} = omega^{omega 2}

This actually reminds me of something that happened driving across
South Dakota one summer with a friend of mine. We were in college,
so we had the summer off, so we drove across the country. We drove
across South Dakota all the way from the eastern border to the west
on Interstate 90.

This state is huge - about 600 kilometers across, and most of it is
really flat, so the drive was really boring. We kept seeing signs
for a bunch of tourist attractions on the western edge of the state,
like the Badlands and Mt. Rushmore - a mountain that they carved
to look like faces of presidents, just to give people some reason to keep
driving.

Anyway, I'll tell you the rest of the story later - I see some more
ordinals coming up:

omega^{omega 3},... omega^{omega 4},... omega^{omega 5},...

We're really whizzing along now just to keep from getting bored - just
like my friend and I did in South Dakota. You might fondly imagine
that we had fun trading stories and jokes, like they do in road movies.
But we were driving all the way from Princeton to my friend Chip's
cabin in California. By the time we got to South Dakota, we were all
out of stories and jokes.

Hey, look! It's

omega^{omega omega} = omega^{omega^2}

That was cool. Then comes

omega^{omega^3}, ... omega^{omega^4}, ... omega^{omega^5}, ...

and so on.

Anyway, back to my story. For the first half of our
trip across the state, we kept seeing signs for something called
the South Dakota Tractor Museum.

Oh, wait, here's an interesting ordinal - let's slow down and
take a look:

omega^{omega^omega}

I like that! Okay, let's keep driving:

omega^{omega^omega} + 1, omega^{omega^omega} + 2, ...

and then

omega^{omega^omega} + omega, ..., omega^{omega^omega} + omega 2, ...

and then

omega^{omega^omega} + omega^2, ..., omega^{omega^omega} + omega^3, ...

and eventually

omega^{omega^omega} + omega^omega

and eventually

omega^{omega^omega} + omega^{omega^omega} = omega^{omega^omega} 2

and then

omega^{omega^omega} 3, ..., omega^{omega^omega} 4, ...

and eventually

omega^{omega^omega} omega = omega^{omega^omega + 1}

and then

omega^{omega^omega + 2}, ..., omega^{omega^omega + 3}, ...

This is pretty boring; we're already going infinitely fast,
but we're still just picking up speed, and it'll take a while
before we reach something interesting.

Anyway, we started getting really curious about this South Dakota
Tractor Museum - it sounded sort of funny. It took 250 kilometers
of driving before we passed it. We wouldn't normally care about
a tractor museum, but there was really nothing else to think about
while we were driving. The only thing to see were fields of grain,
and these signs, which kept building up the suspense, saying things
like "ONLY 100 MILES TO THE SOUTH DAKOTA TRACTOR MUSEUM!"

We're zipping along really fast now:

omega^{omega^{omega^omega}}, ... omega^{omega^{omega^{omega^omega}}},...

What comes after all these?

At this point we need to stop for gas. Our notation for ordinals
runs out at this point!

The ordinals don't stop; it's just our notation that gives out.
The set of all ordinals listed up to now - including all the ones
we zipped past - is a well-ordered set called

epsilon_0

or "epsilon-nought". This has the amazing property that

epsilon_0 = omega^{epsilon_0}

And, it's the smallest ordinal with this property.

In fact, all the ordinals smaller than epsilon_0 can be drawn as
trees. You write them in "Cantor normal form" like this:

omega^{omega^omega + omega} + omega^omega + omega + omega + 1 + 1 + 1

using just + and exponentials and 1 and omega, and then you turn
this notation into a picture of a tree. I'll leave it as a puzzle
to figure out how.

So, the set of (finite, rooted) trees becomes a well-ordered set
whose ordinal is epsilon_0. Trees are important in combinatorics
and computer science, so epsilon_0 is not really so weird after all.

Another cool thing is that Gentzen proved the consistency of the
usual axioms for arithmetic - "Peano arithmetic" - with the help
of epsilon_0. He did this by drawing proofs as trees, and using
this to give an inductive argument that there's no proof in Peano
arithmetic that 0 = 1. But, this inductive argument goes beyond
the simple kind you use to prove facts about all natural numbers.
It uses induction up to epsilon_0.

You can't formalize Gentzen's argument in Peano arithmetic: thanks
to Goedel, this system can't prove itself consistent unless it's *not*.
I used to think this made Gentzen's proof pointless, especially since
"induction up to epsilon_0" sounded like some sort of insane logician's
extrapolation of ordinary mathematical induction.

But now I see that induction up to epsilon_0 can be thought of as
induction on trees, and it seems like an obviously correct principle.
Of course Peano's axioms also seem obviously correct, so I don't know
that Gentzen's proof makes me *more sure* Peano arithmetic is
consistent. But, it's interesting.

Induction up to epsilon_0 also lets you prove other stuff you
can't prove with just Peano arithmetic. For example, it lets you
prove that every Goodstein sequence eventually reaches zero!

Huh?

To write down a Goodstein sequence, you start with any natural
number and write it in "recursive base 2", like this:

2^{2^2 + 1} + 2^1

Then you replace all the 2's by 3's:

3^{3^3 + 1} + 3^1

Then you subtract 1 and write the answer in "recursive base 3":

3^{3^3 + 1} + 1 + 1

Then you replace all the 3's by 4's, subtract 1 and write the
answer in recursive base 4. Then you replace all the 4's by
5's, subtract 1 and write the answer in recursive base 5. And so on.

At first these numbers seem to keep getting bigger! So, it seems
shocking at first that they eventually reach zero. For example,
if you start with the number 4, you get this Goodstein sequence:

4, 26, 41, 60, 41, 60, 83, 109, 139, 173, 211, 253, 299, 348, ...

and apparently it takes about 3 x 10^{60605351} steps to reach zero!
You can try examples yourself on this applet:

1) National Curve Bank, Goodstein's theorem,
http://curvebank.calstatela.edu/goodstein/goodstein.htm
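
If you'd rather compute a few terms yourself, here's a little Python
sketch of the recipe - my own quick illustration, not connected to
that applet: write the number in recursive base n, bump the base,
subtract 1, and repeat.

def to_hereditary(n, base):
    # write n in "recursive base `base`": a list of (exponent, digit)
    # pairs, where each exponent is itself in recursive base `base`
    terms, power = [], 0
    while n > 0:
        n, digit = divmod(n, base)
        if digit:
            terms.append((to_hereditary(power, base), digit))
        power += 1
    return terms

def from_hereditary(terms, base):
    # evaluate such a representation in a (possibly new) base
    return sum(d * base ** from_hereditary(e, base) for e, d in terms)

def goodstein(n, how_many):
    # the first `how_many` terms of the Goodstein sequence starting at n
    base, out = 2, []
    for _ in range(how_many):
        out.append(n)
        if n == 0:
            break
        n = from_hereditary(to_hereditary(n, base), base + 1) - 1
        base += 1
    return out

print(goodstein(4, 12))
# [4, 26, 41, 60, 83, 109, 139, 173, 211, 253, 299, 348]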

But if you think about it the right way, it's obvious that every Goodstein
sequence *does* reach zero.

The point is that these numbers in "recursive base n" look a lot
like ordinals in Cantor normal form. If we translate them into
ordinals by replacing n by omega, the ordinals keep getting smaller
at each step, even when the numbers get bigger!

For example, when we do the translation

2^{2^2 + 1} + 2 |-> omega^{omega^omega + 1} + omega^1

3^{3^3 + 1} + 1 + 1 |-> omega^{omega^omega + 1} + 1 + 1

we see the ordinal got smaller even though the number got bigger.
Since epsilon_0 is well-ordered, the ordinals must bottom out at zero
after a finite number of steps - that's what "induction up to epsilon_0"
tells us. So, the numbers must too!
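
Here's the same point in code - again just my own sketch. Write an
ordinal below epsilon_0 in Cantor normal form as a list of
(exponent, coefficient) pairs, the exponents (themselves ordinals)
in decreasing order; then comparison is essentially lexicographic,
and you can check that the ordinal attached to the second step above
really is smaller than the first:

zero = []               # the ordinal 0
one  = [(zero, 1)]      # omega^0 * 1
w    = [(one, 1)]       # omega
w_w  = [(w, 1)]         # omega^omega

def less(a, b):
    # compare two ordinals below epsilon_0 given in Cantor normal form
    for (ea, ca), (eb, cb) in zip(a, b):
        if less(ea, eb): return True
        if less(eb, ea): return False
        if ca != cb: return ca < cb
    return len(a) < len(b)

# list concatenation works as "+" here, since exponents stay decreasing:
first  = [(w_w + one, 1), (one, 1)]    # omega^{omega^omega + 1} + omega
second = [(w_w + one, 1), (zero, 2)]   # omega^{omega^omega + 1} + 1 + 1
print(less(second, first))             # True: the ordinal got smaller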

In fact, Kirby and Paris showed that you *need* induction up to
epsilon_0 to prove Goodstein sequences always converge to zero.
Since you can't do induction up to epsilon_0 in Peano arithmetic,
thanks to Goedel and Gentzen, it follows that Peano arithmetic is
unable to prove the Goodstein sequences go to zero (unless Peano
arithmetic is inconsistent).

So, this is a nice example of a fact about arithmetic that's obvious
if you think about it for a while, but not provable in Peano arithmetic.

I don't know any results in mathematical physics that use induction
up to epsilon_0, but there could be some - after all, trees show up
in the theory of Feynman diagrams. That would be pretty interesting.

There's a lot more to say about this, but I hear what you're asking:
what comes after epsilon_0?

Well, duh! It's

epsilon_0 + 1

Then comes

epsilon_0 + 2

and then eventually we get to

epsilon_0 + omega

and then

epsilon_0 + omega^2,..., epsilon_0 + omega^3,... , epsilon_0 + omega^4,...

and after a long time

epsilon_0 + epsilon_0 = epsilon_0 2

and then eventually

epsilon_0^2

and then eventually...

Oh, I see! You want to know the first *really interesting* ordinal
after epsilon_0.

Well, this is a matter of taste, but you might be interested in
epsilon_1. This is the first ordinal after epsilon_0 that satisfies
this equation:

x = omega^x

How do we actually reach this ordinal? Well, just as epsilon_0
was the limit of this sequence:

omega, omega^omega, omega^{omega^omega}, omega^{omega^{omega^omega}},...

epsilon_1 is the limit of this:

epsilon_0 + 1, omega^{epsilon_0 + 1}, omega^{omega^{epsilon_0 + 1}},...

In other words, it's the *union* of all these well-ordered sets.

In what sense is epsilon_1 the "first really interesting ordinal" after
epsilon_0? I'm not sure! Maybe it's the first one that can't be
built out of 1, omega and epsilon_0 using finitely many additions,
multiplications and exponentiations. Does anyone out there know?

Anyway, the next really interesting ordinal I know after epsilon_1 is
epsilon_2. It's the next solution of

x = omega^x

and it's defined to be the limit of this sequence:

epsilon_1 + 1, omega^{epsilon_1 + 1}, omega^{omega^{epsilon_1 + 1}},...

Maybe now you get the pattern. In general, epsilon_alpha is the
alpha-th solution of

x = omega^x

and we can define this, if we're smart, for any ordinal alpha.

So, we can keep driving on through fields of ever larger ordinals:

epsilon_2,..., epsilon_3,..., epsilon_4, ...

and eventually

epsilon_omega,..., epsilon_{omega+1},..., epsilon_{omega+2},...

and eventually

epsilon_{omega^2},..., epsilon_{omega^3},..., epsilon_{omega^4},...

and eventually

epsilon_{omega^omega},..., epsilon_{omega^{omega^omega}},...

As you can see, this gets boring after a while - it's suspiciously
similar to the beginning of our trip through the ordinals, with
them now showing up as subscripts under this "epsilon" notation.
But this is misleading: we're moving much faster now. I'm skipping
over much bigger gaps, not bothering to mention all sorts of ordinals
like

epsilon_{omega^omega} + epsilon_{omega 248} + omega^{omega^{omega + 17}}

Anyway... so finally we *got* to this South Dakota Tractor Museum,
driving pretty darn fast at this point, about 85 miles an hour...
and guess what?

Oh - wait a minute - it's sort of interesting here:

epsilon_{epsilon_0},..., epsilon_{epsilon_1},..., epsilon_{epsilon_2}, ...

and now we reach

epsilon_{epsilon_omega}

and then

epsilon_{epsilon_{omega^omega}},...,

epsilon_{epsilon_{omega^{omega^omega}}},...

and then as we keep speeding up, we see:

epsilon_{epsilon_{epsilon_0}},...

epsilon_{epsilon_{epsilon_{epsilon_0}}},...

epsilon_{epsilon_{epsilon_{epsilon_{epsilon_0}}}},...

So, by the time we got to that tractor museum, we were driving really fast.
And, all we saw as we whizzed by was a bunch of rusty tractors out in
a field! It was over in a split second! It was a real anticlimax -
just like this little anecdote, in fact.

But that's the way it is when you're driving through these ordinals.
Every ordinal, no matter how large, looks pretty pathetic and small
compared to the ones ahead - so you keep speeding up, looking for a
really big one... and when you find one, you see it's part of a new
pattern, and that gets boring too...

Anyway, when we reach the limit of this sequence

epsilon_0,

epsilon_{epsilon_0},

epsilon_{epsilon_{epsilon_0}},

epsilon_{epsilon_{epsilon_{epsilon_0}}},

epsilon_{epsilon_{epsilon_{epsilon_{epsilon_0}}}},...

our notation breaks down, since this is the first solution of

x = epsilon_x

We could make up a new name for this ordinal, like eta_0.

Then we could play the whole game again, defining eta_{alpha} to be
the alpha-th solution of

x = epsilon_x

sort of like how we defined the epsilons. This kind of equation, where
something equals some function of itself, is called a "fixed point"
equation.

But since we'll have to play this game infinitely often, we might
as well be more systematic about it!

As you can see, we keep running into new, qualitatively different types
of ordinals. First we ran into the powers of omega, then we ran into
the epsilons, and now these etas. It's going to keep happening! For
each type of ordinal, our notation runs out when we reach the first
"fixed point" - when the xth ordinal of this type is actually equal to
x.

So, instead of making up infinitely many Greek letters, let's use
phi_gamma for the gamma-th type of ordinal, and phi_gamma(alpha) for
the alpha-th ordinal of type gamma.

We can use the fixed point equation to define phi_{gamma+1} in terms
of phi_gamma. In other words, we start off by defining

phi_0(alpha) = omega^alpha

and then define

phi_{gamma+1}(alpha)

to be the alpha-th solution of

x = phi_{gamma}(x)

We can even define this stuff when gamma itself is infinite.
For a more precise definition see the Wikipedia article cited below...
but I hope you get the rough idea.

This defines a lot of really big ordinals, called the "Veblen hierarchy".

There's a souped-up version of Cantor normal form that can handle
every ordinal that's a finite sum of guys in the Veblen hierarchy:
you can write them *uniquely* as finite sums of the form

phi_{gamma_1}(alpha_1) + ... + phi_{gamma_k}(alpha_k)

where each term is less than or equal to the previous one, and each
alpha_i is not a fixed point of phi_{gamma_i}.

But as you might have suspected, not *all* ordinals can be written
in this way. For one thing, every ordinal we've reached so far is
*countable*: as a set you can put it in one-to-one correspondence
with the integers. There are much bigger *uncountable* ordinals -
at least if you believe you can well-order uncountable sets.

But even in the realm of the countable, we're nowhere near done!

As I hope you see, the power of the human mind to see a pattern
and formalize it gives the quest for large countable ordinals a
strange quality. As soon as we see a systematic way to generate
a sequence of larger and larger ordinals, we know this sequence
has a limit that's larger than all of those! And this opens the
door to even larger ones...

So, this whole journey feels a bit like trying to run away from
your own shadow: the faster you run, the faster it chases after you.
But, it's interesting to hear what happens next. At this point we
reach something a bit like the Badlands on the western edge of South
Dakota - something a bit spooky!

It's called the Feferman-Schuette ordinal, Gamma_0. This is just
the limit, or union if you prefer, of all the ordinals mentioned
so far: all the ones you can get from the Veblen hierarchy. You
can also define Gamma_0 by a fixed point property: it's the smallest
ordinal x with

phi_x(0) = x

Now, we've already seen that induction up to different ordinals
gives us different amounts of mathematical power: induction up
to omega is just ordinary mathematical induction as formalized by
Peano arithmetic, but induction up to epsilon_0 buys us more -
it lets us prove the consistency of Peano arithmetic!

Logicians including Feferman and Schuette have carried out a detailed
analysis of this subject. They know a lot about how much induction
up to different ordinals buys you. And apparently, induction up to
Gamma_0 lets us prove the consistency of a system called "predicative
analysis". I don't understand this, nor do I understand the claim
I've seen that Gamma_0 is the first ordinal that cannot be defined
predicatively - i.e., can't be defined without reference to itself.
Sure, saying Gamma_0 is the first solution of

phi_x(0) = x

is non-predicative. But what about saying that Gamma_0 is the union
of all ordinals in the Veblen hierarchy? What's non-predicative
about that?

If anyone could explain this in simple terms, I'd be much obliged.

As you can see, I'm getting out of my depth here. That's pretty typical
in This Week's Finds, but this time - just to shock the world -
I'll take it as a cue to shut up. So, I won't try to explain the
outrageously large Bachmann-Howard ordinal, or the even more
outrageously large Church-Turing ordinal - the first one that can't
be written down using *any* computable system of notation. You'll
just have to read the references.

I urge you to start by reading the Wikipedia article on ordinal
numbers, then the article on ordinal arithmetic, and then the one
on large countable ordinals - they're really well-written:

2) Wikipedia, Ordinal numbers,
http://en.wikipedia.org/wiki/Ordinal_number

Ordinal arithmetic,
http://en.wikipedia.org/wiki/Ordinal_arithmetic

Large countable ordinals,
http://en.wikipedia.org/wiki/Large_countable_ordinals

The last one has a tempting bibliography, but warns us that most
books on this subject are hard to read and out of print. Apparently
nobody can agree on notation for ordinals beyond the Veblen hierarchy,
either.

Gentzen proved the consistency of Peano arithmetic in 1936:

3) Gerhard Gentzen, Die Widerspruchsfreiheit der reinen Zahlentheorie,
Mathematische Annalen 112 (1936), 493-565. Translated as "The
consistency of arithmetic" in M. E. Szabo ed., The Collected Works
of Gerhard Gentzen, North-Holland, Amsterdam, 1969.

Goodstein's theorem came shortly afterwards:

4) R. Goodstein, On the restricted ordinal theorem, Journal of
Symbolic Logic, 9 (1944), 33-41.

but Kirby and Paris proved it independent of Peano arithmetic
only in 1982:

5) L. Kirby and J. Paris, Accessible independence results for Peano
arithmetic, Bull. London. Math. Soc. 14 (1982), 285-93.

That marvelous guy Alan Turing wrote his PhD thesis at Princeton
under the logician Alonzo Church. It was about ordinals and their
relation to logic:

6) Alan M. Turing, Systems of logic based on ordinals, Proc.
London Math. Soc., Series 2, 45 (1939), 161-228.

This is regarded as his most difficult paper. The idea is to
take a system of logic like Peano arithmetic and throw in an
extra axiom saying that system is consistent, and then another
axiom saying *that* system is consistent, and so on ad infinitum -
getting a new system for each ordinal. These systems are recursively
axiomatizable up to (but not including) the Church-Turing ordinal.

These ideas were later developed much further...

But, reading original articles is not so easy, especially if you're
in Shanghai without access to a library. So, what about online stuff -
especially stuff for the amateur, like me?

Well, this article is great fun if you're looking for a readable
overview of the grand early days of proof theory, when Hilbert was
battling Brouwer, and then Goedel came and blew everyone away:

7) Jeremy Avigad and Erich H. Reck, "Clarifying the nature of the
infinite": the development of metamathematics and proof theory,
Carnegie-Mellon Technical Report CMU-PHIL-120, 2001. Also
available as http://www.andrew.cmu.edu/user/avigad/Papers/infinite.pdf

But, it doesn't say much about the newer stuff, like the idea that
induction up to a given ordinal can prove the consistency of a logical
system - the bigger the ordinal, the stronger the system. For work
up to 1960, this is a good overview:

8) Solomon Feferman, Highlights in proof theory, in Proof Theory,
eds. V. F. Hendricks et al, Kluwer, Dordrecht (2000), pp. 11-31.
Also available at http://math.stanford.edu/~feferman/papers.html

For newer stuff, try this:

9) Solomon Feferman, Proof theory since 1960, prepared for the
Encyclopedia of Philosophy Supplement, Macmillan Publishing Co.,
New York. Also available at
http://math.stanford.edu/~feferman/papers.html

Also try the stuff on proof theory, trees and categories mentioned
in "week227", and the book by Girard, Lafont and Taylor mentioned
in "week94".

Finally, sometime I want to get ahold of this book by someone who
always enlivened logic discussions on the internet until his death in
April this year:

10) Torkel Franzen, Inexhaustibility: A Non-Exhaustive Treatment,
Lecture Notes in Logic 16, A. K. Peters, Ltd., 2004.

The blurb sounds nice: "The inexhaustibility of mathematical
knowledge is treated based on the concept of transfinite
progressions of theories as conceived by Turing and Feferman."

Okay, now for a bit about the icosahedron - my favorite Platonic solid.

I've been thinking about the "geometric McKay correspondence" lately,
and among other things this sets up a nice relationship between the
symmetry group of the icosahedron and an amazing entity called E8.
E8 is the largest of the exceptional Lie groups - it's 248-dimensional.
It's related to the octonions (the number "8" is no coincidence) and
it shows up in string theory. It's very beautiful how this complicated
sounding stuff can be seen in distilled form in the icosahedron.

I have a lot to say about this, but you're probably worn out by our
road trip through the land of big ordinals. So for now, try "week164"
and "week230" if you're curious. Let's talk about something less
stressful - the early history of the icosahedron.

I spoke about the early history of the dodecahedron in "week63".
It's conjectured that the Greeks got interested in this shape
from looking at crystals of iron pyrite. These aren't regular
dodecahedra, since normal crystals can't have 5-fold symmetry -
though "quasicrystals" can. Instead, they're "pyritohedra".
The Greeks' love of mathematical perfection led them to the
regular dodecahedron...

... and it also led them to invent the icosahedron:

11) Benno Artmann, About the cover: the mathematical conquest of
the third dimension, Bulletin of the AMS, 43 (2006), 231-235.
Also available at
http://www.ams.org/bull/2006-43-02/S0273-0979-06-01111-6/

According to Artmann, an ancient note written in the margins of a copy
of Euclid's Elements says the regular icosahedron and octahedron
were discovered by Theaetetus!

If you're a cultured sort, you may know Theaetetus through Plato's
dialog of the same name, where he's described as a mathematical
genius. He's also mentioned in Plato's "The Sophist". He probably
discovered the icosahedron between 380 and 370 BC, and died at an
early age in 369. Euclid later wrote up the construction of the
icosahedron that we find in his Elements:

12) Euclid, Elements, Book XIII, Proposition 16, online version
due to David Joyce at
http://aleph0.clarku.edu/~djoyce/java/elements/bookXIII/propXIII16.html

Artmann says this was the first time a geometrical entity appeared
in pure thought before it was seen! An interesting thought.

Book XIII also contains a complete classification of the Platonic
solids - perhaps the first really interesting classification
theorem in mathematics, and certainly the first "ADE classification":

13) Euclid, Elements, Book XIII, Proposition 18, online version
due to David Joyce at
http://aleph0.clarku.edu/~djoyce/java/elements/bookXIII/propXIII18.html

If you don't know about ADE classifications, see "week62".

I got curious about this "ancient note written in the margins of a
copy of Euclid" that Artmann mentions. It seemed too good to be true.
Just for fun, I tried to track down the facts about this, using only
my web browser here in Shanghai.

First of all, if you're imagining an old book in a library somewhere
with marginal notes scribbled by a pal of Theaetetus, dream on.
It ain't that simple! Our knowledge of Euclid's original Elements
relies on copies of copies of copies... and centuries of detective
work, with each detective having to root through obscure journals
and dim-lit library basements to learn what the previous detectives
did.

The oldest traces of Euclid's Elements are pathetic fragments of
papyrus. People found some in a library roasted by the eruption
of Mount Vesuvius in 79 AD, some more in a garbage dump in the
Egyptian town of Oxyrhynchus (see "week221"), and a couple more in
the Fayum region near the Nile. All these were written centuries
after Euclid died. For a look at one, try this:

14) Bill Casselman, One of the oldest extant diagrams from Euclid,
http://www.math.ubc.ca/~cass/Euclid/papyrus/

The oldest nearly complete copy of the Elements lurks in a library
called the Bodleian at Oxford. It dates back to 888 AD, about a
millennium after Euclid.

More copies date back to the 10th century; you can find their stories
here:

15) Thomas L. Heath, editor, Euclid's Elements, chap. V: the text,
Cambridge U. Press, Cambridge, 1925. Also available at
http://www.perseus.tufts.edu/cgi-bin/ptext?lookup=Euc.+5

16) Menso Folkerts, Euclid's Elements in Medieval Europe,
http://www.math.ubc.ca/~cass/Euclid/folkerts/folkerts.html

All these copies are somewhat different. So, getting at Euclid's
original Elements is as hard as sequencing the genome of Neanderthal
man, seeing a quark, or peering back to the Big Bang!

A lot of these copies contain "scholia": comments inserted by
various usually unnamed copyists. These were collected and
classified by a scholar named Heiberg in the late 1800s:

17) Thomas L. Heath, editor, Euclid's Elements, chap. VI: the scholia,
Cambridge U. Press, Cambridge, 1925. Also available at
http://www.perseus.tufts.edu/cgi-bin/ptext?lookup=Euc.+6

One or more copies contains a scholium about Platonic solids in
book XIII. Which copies? Ah, for that I'll have to read Heiberg's
book when I get back to UC Riverside - our library has it, I'm
proud to say.

And, it turns out that another scholar named Hultsch argued
that this scholium was written by Geminus of Rhodes.

Geminus of Rhodes was an astronomer and mathematician who may have
lived between 130 and 60 BC. He seems like a cool dude. In his
Introduction to Astronomy, he broke open the "celestial sphere",
writing:

... we must not suppose that all the stars lie on one surface,
but rather that some of them are higher and some are lower.

And in his Theory of Mathematics, he proved a classification theorem
stating that the helix, the circle and the straight line are the only
curves for which any portion is the same shape as any other portion
with the same length.

Anyway, the first scholium in book XIII of Euclid's Elements, which
Hultsch attributes to Geminus, mentions

... the five so-called Platonic figures which, however, do not
belong to Plato, three of the five being due to the Pythagoreans,
namely the cube, the pyramid, and the dodecahedron, while the
octahedron and the icosahedron are due to Theaetetus.

So, that's what I know about the origin of the icosahedron!
Someday I'll read more, so let me make a note to myself:

18) Benno Artmann, Antike Darstellungen des Ikosaeders, Mitt.
DMV 13 (2005), 45-50. (Here the drawing of the icosahedron in
Euclid's elements is analysed in detail.)

19) A. E. Taylor, Plato: the Man and His Work, Dover Books, New
York, 2001, page 322. (This discusses traditions concerning
Theaetetus and Platonic solids.)

20) Euclid, Elementa: Libri XI-XIII cum appendicibus, postscript
by Johan Ludvig Heiberg, edited by Euangelos S. Stamatis,
Teubner BSB, Leipzig, 1969. (Apparently this contains information
on the scholium in book XIII of the Elements.)

Now for something a bit newer: categorification and quantum mechanics.
I've said so much about this already that I'm pretty much talked out:

21) John Baez and James Dolan, From finite sets to Feynman diagrams,
in Mathematics Unlimited - 2001 and Beyond, vol. 1, eds. Bjoern
Engquist and Wilfried Schmid, Springer, Berlin, 2001, pp. 29-50.

22) John Baez and Derek Wise, Quantization and Categorification,
Quantum Gravity Seminar lecture notes, available at:
http://math.ucr.edu/home/baez/qg-fall2003/
http://math.ucr.edu/home/baez/qg-winter2004/
http://math.ucr.edu/home/baez/qg-spring2004/

As I explained in "week185", many basic facts about harmonic
oscillators, Fock space and Feynman diagrams have combinatorial
interpretations. For example, the commutation relation between
the annihilation operator a and the creation operator a*:

aa* - a*a = 1

comes from the fact that if you have some balls in a box, there's one
more way to put a ball in and then take one out than to take one out
and then put one in! This way of thinking amounts to using finite
sets as a substitute for the usual eigenstates of the number operator,
so we're really "categorifying" the harmonic oscillator: giving it a
category of states instead of a set of states.
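
If you want that "one more way" spelled out, here's a toy check in
Python - my own sketch, not from the papers above - counting the two
kinds of histories for a box of n labelled balls:

def put_then_take(box):
    # put in one new ball (one way), then take out any of the n+1 balls
    new = len(box)                       # a label for the fresh ball
    return [(new, out) for out in box | {new}]

def take_then_put(box):
    # take out any of the n balls, then put in one new ball (one way)
    new = len(box)
    return [(out, new) for out in box]

for n in range(5):
    box = set(range(n))
    print(n, len(put_then_take(box)) - len(take_then_put(box)))   # always 1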

Working out the detailed consequences takes us through Joyal's
theory of "structure types" or "species" - see "week202" - and
on to more general "stuff types". Some nice category and
2-category theory is needed to make the ideas precise. For a
careful treatment, see this thesis by a student of Ross Street:

23) Simon Byrne, On Groupoids and Stuff, honors thesis,
Macquarie University, 2005, available at
http://www.maths.mq.edu.au/~street/ByrneHons.pdf and
http://math.ucr.edu/home/baez/qg-spring2004/ByrneHons.pdf

However, none of this work dealt with the all-important *phases*
in quantum mechanics! For that, we'd need a generalization of
finite sets whose cardinality can be complex. And that's what
my student Jeffrey Morton introduces here:

24) Jeffrey Morton, Categorified algebra and quantum mechanics,
to appear in Theory and Applications of Categories. Also available
as math.QA/0601458.

He starts from the beginning, explains how and why one would
try to categorify the harmonic oscillator, introduces the
"U(1)-sets" and "U(1)-stuff types" needed to do this, and shows
how the usual theorem expressing time evolution of a perturbed
oscillator as a sum over Feynman diagrams can be categorified.
His paper is now *the* place to read about this subject. Take
a look!

-----------------------------------------------------------------------
Previous issues of "This Week's Finds" and other expository articles on
mathematics and physics, as well as some of my research papers, can be
obtained at

http://math.ucr.edu/home/baez/

For a table of contents of all the issues of This Week's Finds, try

http://math.ucr.edu/home/baez/twfcontents.html

A simple jumping-off point to the old issues is available at

http://math.ucr.edu/home/baez/twfshort.html

If you just want the latest issue, go to

http://math.ucr.edu/home/baez/this.week.html
 
  • #2
In article <ea7eq3$jk4$1@glue.ucr.edu>,
baez@math.removethis.ucr.andthis.edu (John Baez) writes:

> Cantor invented two kinds of infinities: cardinals and ordinals.
> Cardinals are more familiar. They say how big sets are. Two sets
> can be put into 1-1 correspondence iff they have the same number of
> elements - where this kind of "number" is a cardinal.
>
> But today I want to talk about ordinals. Ordinals say how big
> "well-ordered" sets are. A set is well-ordered if it's linearly
> ordered and every nonempty subset has a smallest element.


Anyone who enjoyed John's post on this should read INFINITY AND THE MIND
by Rudy Rucker. This is one of the few books I have read cover to cover
more than once (along with ALICE IN WONDERLAND, THROUGH THE LOOKING
GLASS (both in the editions annotated by Martin Gardner), ZEN AND THE
ART OF MOTORCYCLE MAINTENANCE and LILA).

Author's web page:

http://www.mathcs.sjsu.edu/faculty/rucker/

Book's web page:

http://pup.princeton.edu/titles/5656.html

He also has a tongue-in-cheek science-fiction novel called WHITE LIGHT
which deals with infinity (in a mathematical sense). Apart from these
two books, I have also read and recommend his novel SOFTWARE. I would
be interested (via email, not via the newsgroup) on comments on his
other books.
 
  • #3
Omegad!

Rgds

Ian Macmillan
 
  • #4
In article <ea8l01$27t$1@online.de>,
Phillip Helbig wrote:

>Anyone who enjoyed John's post on this should read INFINITY AND THE MIND
>by Rudy Rucker.


I enjoyed my post, so maybe I should read this book!

These large ordinals are a bit off topic from physics, but I need
to make some corrections:

In article <ea83ig$qmq$1@news.ks.uiuc.edu>,
John Baez <baez@math.removethis.ucr.andthis.edu> wrote:

>At first these numbers seem to keep getting bigger! So, it seems
>shocking at first that they eventually reach zero. For example,
>if you start with the number 4, you get this Goodstein sequence:
>
>4, 26, 41, 60, 41, 60, 83, 109, 139, 173, 211, 253, 299, 348, ...
>
>and apparently it takes about 3 x 10^{60605351} steps to reach zero!


Kevin Buzzard pointed out a typo here. The sequence is:

4, 26, 41, 60, 83, 109, 139, 173, 211, 253, 299, 348, ...

Also, while I got the huge number above from this website:

http://curvebank.calstatela.edu/goodstein/goodstein.htm

he pointed out they actually say the sequence "can increase for
approximately 2.6 * 10^{60605351} steps", not that it reaches
zero at this point.

I'm not sure what it means to say it "can increase" for this long -
it either does or doesn't, right?

Anyway, Kevin worked out the details himself, and I checked his
calculations. We now seem to agree that the sequence increases
until the ith term, where

i = (1/4) * 24 * 2^24 * 2^{24 * 2^{24}} - 2 ~ 1.72 x 10^{121210694}

Then it levels off, and eventually it decreases, reaching zero at
the ith term for

i = 24 * 2^24 * 2^{24 * 2^{24}} - 2 ~ 6.9 * 10^{121210694}

I wish some people would check and see if we've done the calculation
correctly. It's basically just algebra, somewhat intimidating at
first - but it gave me quite a sense of power when I got into it.
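
For what it's worth, here's a quick order-of-magnitude sanity check in
Python - it only checks the size of the number, not the algebra:

from math import log10

e = 24 * 2**24                                  # = 402653184
log_i = log10(24) + 24*log10(2) + e*log10(2)    # log10 of 24 * 2^24 * 2^{24 * 2^24}
print(log_i)                                    # about 121210694.8
# so i ~ 6.9 x 10^{121210694}, and a quarter of that is ~ 1.7 x 10^{121210694}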

Here is Kevin's email, prettied up by me, but perhaps with some
mistakes added:

> apparently it takes about 3 x 10^{60605351} steps to reach zero!


You write this as if it were some kind of mystery. I remember working
out this number explicitly when I was a graduate student! There is
some nice form for it, as I recall. Let's see if I can reconstruct
what I did.

If I've understood the sequence correctly, it should be (where "n)"
at the beginning of a line denotes we're working in base n on this
line, so strictly speaking it's probably the n-1st term in the sequence)

2) 2^2 = 4
3) 3^3 - 1 = 2*3^2 + 2*3 + 2 = 26 [note: base 3, ends in 2, and 3+2=5]
4) 2*4^2 + 2*4 + 1 = 41 [note: base 4, ends in 1, and 4+1=5]
5) 2*5^2 + 2*5 = 60 [we're at a limit ordinal here, note 3+2=4+1=5]
6) 2*6^2 + 2*6 - 1 = 2*6^2 + 6 + 5 = 83 [note: base 6, ends in 5]
7) 2*7^2 + 7 + 4 [note: base 7, ends in 4]
8) 2*8^2 + 8 + 3 [note: base 8, ends in 3, so we next get a limit ordinal at...]
 
  • #5
In article <eahpuk$bli$1@glue.ucr.edu>,
baez@math.removethis.ucr.andthis.edu (John Baez) writes:

> These large ordinals are a bit off topic from physics


To bring things back on topic, what branches of physics, if any, use
large ordinals, various kinds of infinity etc? Of course, not ALL
mathematics is used in physics. On the other hand, some branches were
once thought to be completely irrelevant to physics---group theory, for
example. (I don't recall if it was Rutherford or Lord Kelvin who
claimed this.)
 
  • #6
In article <eamp71$t2f$1@online.de>, Phillip Helbig wrote:

>In article <eahpuk$bli$1@glue.ucr.edu>,
>baez@math.removethis.ucr.andthis.edu (John Baez) writes:


>> These large ordinals are a bit off topic from physics


>To bring things back on topic, what branches of physics, if any, use
>large ordinals, various kinds of infinity etc?


As for the large countable ordinals I was mentioning, I don't know
any work in physics that uses them beyond omega or maybe omega^n
for finite n. Certainly no physicist with any brains would object
to the proof that Goodstein sequences converge to zero, and you can
only prove this by induction up to epsilon_0. But, I've never
seen Goodstein sequences used in physics.

However, it's worth emphasizing that these ordinals are not exotic
entities. The ordinals I mentioned are all countable ordered sets,
and you can describe them all as subsets of the rational numbers.

More precisely: any countable ordered set is isomorphic, by an
order-preserving map, to a subset of the rational numbers.

Moreover, I believe that for any countable ordinal up to (but not
including) the Church-Turing ordinal, you can write a computer
program that will decide whether or not any given fraction is in
this subset. As a consequence, you can also write a computer program
that lists the fractions in this set.

It's pretty obvious how to do this for omega^2:

http://math.ucr.edu/home/baez/omega_squared.png

But, I believe that you can do it for all the ordinals I mentioned,
except for the Church-Turing ordinal. David Madore has drawn a picture
of epsilon_0, for example - see sci.math.research for more on that.
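
To give the flavor in code - my own sketch, in the same spirit as that
picture - here's a computable, order-preserving way to sit omega^2
inside the rationals: send the ordinal omega m + n to the rational
m + n/(n+1), so each copy of omega piles up just below the next integer.

from fractions import Fraction

def embed(m, n):
    # rational image of the ordinal omega*m + n, with m and n finite
    return m + Fraction(n, n + 1)

# order is preserved: omega + 5 < omega 2, and likewise for the images
assert embed(1, 5) < embed(2, 0)
# everything below omega lands below 1, the image of omega itself
assert embed(0, 10**6) < embed(1, 0)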

In short: countable ordinals below the Church-Turing ordinal are
nothing scary. But, I haven't seen them used in physics, and I
don't really expect it.

What about cardinals?

Well, physicists routinely use real numbers, which have a cardinality
much larger than anything mentioned so far. Indeed, unless you add
extra axioms to ordinary ZFC set theory, there's no telling how large
the cardinality of the real numbers is!

But, I'm also pretty sure that any calculation that predicts a concrete
physical result can be done using only finite sets. And, the set
of *computable* real numbers is countable.

So, for the purposes of physics, cardinals above aleph_0 are more a
mathematical convenience than a necessity.

However, every sufficiently convenient convenience eventually becomes
a necessity.

For example, flush toilets. In the West it might almost seem to be a
*necessity* that public places have flush toilets - as opposed to, say,
holes in the floor that you squat over. But here in Shanghai, many don't.

In short, infinite sets resemble flush toilets. This is not to say that
they're full of... oh, never mind, I think I've taken this analogy far
enough.

>Of course, not ALL
>mathematics is used in physics. On the other hand, some branches were
>once thought to be completely irrelevant to physics---group theory, for
>example. (I don't recall if it was Rutherford or Lord Kelvin who
>claimed this.)


Lord Kelvin is mainly noted for having dismissed *vectors* as
unnecessary to physics. He wrote:

Quaternions came from Hamilton after his really good work had been
done; and though beautifully ingenious, have been an unmixed evil
to those who have touched them in any way, including Maxwell.
Vector is a useless survival, or offshoot from quaternions, and has
never been of the slightest use to any creature.

To understand this, remember that J. Willard Gibbs, the first person
to get a math PhD in the USA, introduced the modern approach to vectors
around 1881, long after Hamilton's quaternions first became popular. He
took the quaternion and chopped it into its "scalar" and "vector" parts.

Vectors are another great example of a convenience that's so convenient
that they're now seen as a necessity.

It's mainly the American physicist John Slater, inventor of the "Slater
determinant", who is famous for having dismissed groups as unnecessary
to physics. He wrote:

It was at this point that Wigner, Hund, Heitler, and Weyl entered the
picture with their "Gruppenpest": the pest of the group theory [actually,
the correct translation is "the group plague"] ... The authors of the
"Gruppenpest" wrote papers which were incomprehensible to those like
me who had not studied group theory... The practical consequences
appeared to be negligible, but everyone felt that to be in the mainstream
one had to learn about it. I had what I can only describe as a feeling
of outrage at the turn which the subject had taken ... it was obvious
that a great many other physicists were as disgusted as I had been with the
group-theoretical approach to the problem. As I heard later, there were
remarks made such as "Slater has slain the 'Gruppenpest'". I believe
that no other piece of work I have done was so universally popular.

And now, of course, it's categories that some physicists dismiss, just
as they're catching on.

So, judging by the history, you can be almost sure that if a bunch of
physicists angrily dismiss a branch of mathematics as useless to physics,
it's useful for physics. The branches of math that don't yet have
applications to physics don't arouse such controversy!
 
  • #7
John Baez wrote:
>
> Well, physicists routinely use real numbers, which have a cardinality
> much larger than anything mentioned so far. Indeed, unless you add
> extra axioms to ordinary ZFC set theory, there's no telling how large
> the cardinality of the real numbers is!
>
> But, I'm also pretty sure that any calculation that predicts a concrete
> physical result can be done using only finite sets. And, the set
> of *computable* real numbers is countable.
>
> So, for the purposes of physics, cardinals above aleph_0 are more a
> mathematical convenience than a necessity.
>
> However, every sufficiently convenient convenience eventually becomes
> a necessity.
>
> For example, flush toilets. In the West it might almost seem to be a
> *necessity* that public places have flush toilets - as opposed to, say,
> holes in the floor that you squat over. But here in Shanghai, many don't.
>
> In short, infinite sets resemble flush toilets. This is not to say that
> they're full of... oh, never mind, I think I've taken this analogy far
> enough.
>
>
> Lord Kelvin is mainly noted for having dismissed *vectors* as
> unnecessary to physics. He wrote:
>
> Quaternions came from Hamilton after his really good work had been
> done; and though beautifully ingenious, have been an unmixed evil
> to those who have touched them in any way, including Maxwell.
> Vector is a useless survival, or offshoot from quaternions, and has
> never been of the slightest use to any creature.
>
> To understand this, remember that J. Willard Gibbs, the first person
> to get a math PhD in the USA, introduced the modern approach to vectors
> around 1881, long after Hamilton's quaternions first became popular. He
> took the quaternion and chopped it into its "scalar" and "vector" parts.
>
> Vectors are another great example of a convenience that's so convenient
> that they're now seen as a necessity.
>
> It's mainly the American physicist John Slater, inventor of the "Slater
> determinant", who is famous for having dismissed groups as unnecessary
> to physics. He wrote:
>
> It was at this point that Wigner, Hund, Heitler, and Weyl entered the
> picture with their "Gruppenpest": the pest of the group theory [actually,
> the correct translation is "the group plague"] ... The authors of the
> "Gruppenpest" wrote papers which were incomprehensible to those like
> me who had not studied group theory... The practical consequences
> appeared to be negligible, but everyone felt that to be in the mainstream
> one had to learn about it. I had what I can only describe as a feeling
> of outrage at the turn which the subject had taken ... it was obvious
> that a great many other physicists were as disgusted as I had been with the
> group-theoretical approach to the problem. As I heard later, there were
> remarks made such as "Slater has slain the 'Gruppenpest'". I believe
> that no other piece of work I have done was so universally popular.
>
> And now, of course, it's categories that some physicists dismiss, just
> as they're catching on.
>
> So, judging by the history, you can be almost sure that if a bunch of
> physicists angrily dismiss a branch of mathematics as useless to physics,
> it's useful for physics.


No. In the examples above, there _were_ already significant uses
in physics, and only those unfamiliar with the techniques called
them superfluous.

Arnold Neumaier
 
  • #8
In article <44D5C38D.3010105@univie.ac.at>,
Arnold Neumaier <Arnold.Neumaier@univie.ac.at> wrote:

>John Baez wrote:


>> Lord Kelvin is mainly noted for having dismissed *vectors* as
>> unnecessary to physics.


>> It's mainly the American physicist John Slater, inventor of the "Slater
>> determinant", who is famous for having dismissed groups as unnecessary
>> to physics.


>> And now, of course, it's categories that some physicists dismiss, just
>> as they're catching on.


>> So, judging by the history, you can be almost sure that if a bunch of
>> physicists angrily dismiss a branch of mathematics as useless to physics,
>> it's useful for physics.


>No. In the examples above, there _were_ already significant uses
>in physics, and only those unfamiliar with the techniques called
>them superfluous.


Exactly my point. If there _weren't_ significant uses of those
mathematical techniques in physics, there wouldn't be physicists
running around using that math, so there wouldn't be famous physicists
getting upset and claiming those techniques were superfluous.

For example, you don't have famous physicists claiming that large
cardinals are superfluous to physics, precisely because these mostly
*are* superfluous to physics - so nobody uses them, so nobody in his right
mind complains about someone else using them. If tomorrow I read in
sci.physics.research that Hawking wrote a polemic decrying the use
of large cardinal hypotheses in physics, I'd immediately guess someone
had found a use for them.

.............

Puzzle 28: Why should you be careful if you meet someone whose passport
is from the British West Indies?

If you get stuck, try:

http://math.ucr.edu/home/baez/puzzles/
 

1. What is "This Week's Finds in Mathematical Physics (Week 236)"?

"This Week's Finds in Mathematical Physics" is a series of online articles written by John Baez, a mathematician and physicist, where he discusses interesting and current topics in the intersection of mathematics and physics. Week 236 is the 236th article in this series.

2. Who is John Baez?

John Baez is a professor of mathematics at the University of California, Riverside. He is known for his work in mathematical physics, particularly in the areas of gauge theory and n-categories. He is also a science writer and runs the popular blog "Azimuth".

3. What kind of topics are covered in "This Week's Finds in Mathematical Physics (Week 236)"?

Week 236 covers a variety of topics, including but not limited to quantum mechanics, topology, and category theory. Baez often discusses recent research papers and their implications in these fields.

4. Is "This Week's Finds in Mathematical Physics" suitable for non-experts?

While the topics covered in "This Week's Finds in Mathematical Physics" can be quite technical, Baez does a great job of explaining them in an accessible and engaging way. Non-experts with a basic understanding of mathematics and physics can still enjoy and learn from these articles.

5. How often is "This Week's Finds in Mathematical Physics" published?

As the name suggests, "This Week's Finds in Mathematical Physics" is typically published once a week. However, there may be occasional breaks or delays in the publishing schedule. Interested readers can subscribe to Baez's blog "Azimuth" to receive updates on new articles.
