I actually agree with humanino, but out of the two given I'd have to go with Newton.
Einstein wasn't particularly good at mathematics; he had to pick Hilbert's brain for a lot of GR-related stuff.
And you're trying to pit him against the guy who invented calculus.
Who divided by zero numerous times to invent it?
Both produced some 'tricks' which had tremendous use. But Einstein's work wasn't that mathematically demanding, and Newton's work was simply inconsistent: he divided by zero, come on.
How big of a coincidence is it that he and Leibniz invented calculus at approximately the same time?
Maybe it's not. Maybe calculus was just a natural progression from Descartes's analytic geometry.
In any event, I still think Newton made more fundamental contributions than Einstein.
My vote goes to Forrest, Forrest Gump.
He invented the happy face. He was so renowned, 24 US presidents shook his hand and LBJ even examined his buttocks.
Einstein didn't make any contribution to mathematics as far as I know, and calculus can hardly be called fundamental. Both made some very fundamental contributions to physics, though.
Err, what? Calculus is a fundamental part of almost all math above high school algebra.
Really now? Where does one encounter calculus (more properly called infinitesimal calculus) in:
- linear algebra
- proof theory
- set theory
- propositional logic
- first order logic
- functional analysis
- number theory
Calculus is probably the least fundamental part of maths out there. Calculus is founded on analysis, analysis is founded on topology, topology is founded on set theory, set theory is founded on first order logic, itself founded on proof theory. Nothing I know of in mathematics is founded on infinitesimal calculus, but maybe you can teach me some new things here.
Infinitesimal calculus is simply a useful tool that can be used to calculate some magnitudes, there is no fundamental research going on in it.
I do not think it is very wise to display such strong opinions about what is "more fundamental". The interplay between analysis and algebra remains at all stages of mathematical sophistication. Consider linear algebra: a lot of Hilbert space constructions were motivated by harmonic analysis. Numerous theorems in number theory are obtained using complex analysis. In fact, I'll just cite the Riemann Hypothesis: from the statement of the hypothesis to the latest bright idea for trying to prove it, the entire history of the problem keeps going back and forth between analysis and algebra.
Fundamental is easy to define. You can express/formulate calculus in set theory, but not the reverse; thus set theory is more fundamental.
Algebra isn't exactly fundamental either. Algebra uses numbers and the operations on them and takes their existence as axiomatic; fundamental mathematics is more interested in first defining what a number is in a given context, and what a certain operation on numbers is.
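As an illustration of "first defining what a number is": in the standard set-theoretic construction (von Neumann's, not mentioned in the thread), each natural number is literally the set of all smaller numbers:

```latex
0 = \varnothing, \qquad
1 = \{0\} = \{\varnothing\}, \qquad
2 = \{0, 1\}, \qquad
n + 1 = n \cup \{n\},
```

with addition and multiplication then defined by recursion on this structure, which is the sense in which arithmetic (and, further up, calculus) can be expressed in set theory.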
The Riemann Hypothesis isn't so much fundamental as far-reaching. An example of a fundamental hypothesis would be the Church-Turing thesis.
Well, more fundamental or not: in the context of the OP's question, which is about mathematical brilliance, Newton was pretty mathematically brilliant to invent calculus.
Brilliant maybe, but mathematically brilliant hardly.
Calculus is mathematically dubious, it just had profoundly wide application and use, but a work of mathematics it's not. It's basically just a trick, the larger trick is to disguise the fact that you divide by zero.
Why do you minimize that as a "trick"? We can't divide by zero because it's forbidden, but the success of calculus comes from the fact that it is very often very useful to do so.
Are you going to split hairs and suggest that it's not enough to invent something spectacularly useful and successful?
Why is it forbidden you might ask yourself?
Because zero has no multiplicative inverse, after all, division by x is defined as multiplying by the multiplicative inverse of x.
The multiplicative inverse of a real number x is a number y such that x multiplied by y results in 1.
It is provable that each and every real number has exactly one such multiplicative inverse, except 0, and that no real number has 0 as its multiplicative inverse, because of course the inverse of the inverse is the number itself, a thing that's also provable.
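The claim that 0 has no multiplicative inverse follows in two lines from the field axioms (a standard argument, spelled out here for completeness):

```latex
0 \cdot y = (0 + 0) \cdot y = 0 \cdot y + 0 \cdot y
\;\Longrightarrow\; 0 \cdot y = 0 \quad \text{for every real } y,
```

so no real y can satisfy 0 · y = 1, since 0 ≠ 1.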
As you said, it is very useful. It's also very useful to treat pi as 3.14 in most circumstances, because the result, though only an approximation, is close enough to what we need; but doing so is where you stop performing mathematics.
No, I'm just saying that calculus as Newton invented it is not mathematics.
The invention of the mirror was also highly useful, does that make it mathematically brilliant? Of course not, though one could argue that it was brilliant on its own.
I am not using the terms "analysis" and "algebra" as specific branches which for instance could be taught in school. I am referring to a more general split of all mathematical concepts.
No, I did not ask myself that. The rest of what you said is irrelevant, but thanks for sharing.
How is calculus not mathematics? It's like saying the invention of the mirror is not about optics.
I believe it to be quite relevant.
Well, I doubt the person that invented the mirror knew anything about optics; in fact, I think it was for the most part just dumb luck, to be honest.
Newton's calculus is not mathematics because one divides by zero. Or at least, he offered little explanation for the existence of an object dx which we can add to a number r to produce r again (therefore it must be zero, or we must define addition on some larger set), and then divides by it as if it's not zero.
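The objection can be made concrete with the textbook computation of the derivative of x² in the early Newton/Leibniz style (an illustrative example, not one from the thread): dx is treated as nonzero when we divide by it, then as zero when we discard it:

```latex
\frac{(x + dx)^2 - x^2}{dx}
= \frac{2x\,dx + dx^2}{dx}
= 2x + dx
\;\overset{dx \,=\, 0}{=}\; 2x .
```

Modern analysis resolves this by replacing the final step with a limit as dx approaches 0, never actually dividing by zero.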
It's not mathematics for the same reason that 'proving' the Riemann Hypothesis by saying 'Okay, we found a thousand cases where it holds and no counterexample, so it must be true' is not mathematics. It may be useful, and this is how most empirical sciences work, but it's not how mathematics works.
What split is that? You mean there is some binary (or higher) split between all branches of mathematics? I fail to understand what you mean.
What I mean is that calculus is not mathematics, analysis is mathematics, but not fundamental mathematics.
You had to put words in my mouth to justify explaining it. Everyone knows why dividing by zero is forbidden. It doesn't say anything about calculus.
Ah, that's the answer.
You're not saying it's not mathematics, you're saying is not rigorous.
You could have been a little more forthright.
Analysis and algebra. All mathematicians are familiar with this split, as it corresponds to real preferences that occur among professionals. I know of no mathematician who would claim their preference to be "superior" or more fundamental than the other one. One can take the list of Fields Medalists and classify the work accordingly. I'm pretty sure Perelman's work, for which he was awarded the Fields Medal, would fall in the "analysis" category for instance, although I do not know him so I do not know his personal preference. So would Tao's Fields Medal work, or Wendelin Werner's, or René Thom's.
The reason I am quoting Fields Medal work falling in the category of analysis is that my own preference is algebra.
Are you a published researcher in mathematics ?
What? This is new to me.
So ehh, what does proof theory fall into? Algebra? What does linear algebra fall into?
I take it that if topology falls into analysis by this schism that linear algebra also falls in analysis?
No. Do you need one to back up my claim that, in mathematics, finding a thousand positive examples of the Riemann Hypothesis and no negative example is not enough substantiation for the claim?
Hmm, I don't know about you, but when I still attended university, saying 'it's not mathematical' was essentially the same thing as saying 'it's not rigorous'. I would call it 'a useful trick' as opposed to mathematics if it lacks rigour.
What do you mean by "divide by zero"? Is it the quotient dy/dx, where dx is approaching zero? It is the limit of dy/dx as dx approaches zero that is meant. If, for instance, y = kx + l is a line, then the quotient dy/dx = k however small you make dx. That must be easy to realize.
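For a line this is easy to check numerically; a minimal sketch (the slope k = 3, intercept l = 5, and sample point are arbitrary illustrative values):

```python
# Difference quotient (f(x + dx) - f(x)) / dx for the line y = k*x + l.
# For a line it equals the slope k exactly, however small dx is made;
# no limit, and no division by a vanishing quantity, is needed.
def difference_quotient(f, x, dx):
    return (f(x + dx) - f(x)) / dx

k, l = 3.0, 5.0
line = lambda x: k * x + l

for dx in (1.0, 0.25, 2.0 ** -20):  # powers of two, so floating point stays exact
    print(dx, difference_quotient(line, 2.0, dx))
```

For curves the quotient does vary with dx, which is exactly why the limit is needed there.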