Nobody complains about physicists' math?

Summary
The discussion centers on the perception that physicists' use of mathematics often lacks the rigor found in pure mathematics, yet it still yields correct results in practical applications. Participants note that while mathematicians may find humor in physicists' approaches, this is rarely discussed openly. The conversation highlights how physicists sometimes rely on informal methods, such as using differentials without rigorous justification, to simplify complex concepts. Despite these perceived shortcomings, many physicists successfully derive accurate results, suggesting a balance between intuition and mathematical formality. Ultimately, the thread raises questions about the validity and acceptance of physicists' mathematical practices within the broader scientific community.
  • #31
jostpuur said:
This trickery is irrational, because the equation

$$df = \frac{\partial f}{\partial x_1}\, dx_1 + \frac{\partial f}{\partial x_2}\, dx_2$$

comes out of nowhere. If you want to use mathematics as a tool, why not just take the chain rule as it is, and then use it? The physicists could also merely write

$$\frac{df}{du} = \frac{\partial f}{\partial x_1} \frac{dx_1}{du} + \frac{\partial f}{\partial x_2} \frac{dx_2}{du}$$

and say "this is a known result, and we can use it".

Why start with something else, and then do some kind of pseudo-proof for the chain rule? And why insist that this pseudo-proof was the easier way?
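For what it's worth, the chain-rule identity quoted above is easy to check numerically; here is a minimal stdlib sketch, where the particular choices of f, x1, and x2 are purely illustrative:

```python
import math

def f(x1, x2):
    return x1**2 * x2

def x1(u): return math.cos(u)
def x2(u): return math.sin(u)

def d(g, t, h=1e-6):
    """Central finite-difference approximation to g'(t)."""
    return (g(t + h) - g(t - h)) / (2 * h)

u = 0.7
# Left side: differentiate the composition f(x1(u), x2(u)) directly
lhs = d(lambda t: f(x1(t), x2(t)), u)
# Right side: chain rule, exact partials times dx_i/du
f_x1 = 2 * x1(u) * x2(u)   # ∂f/∂x1 at (x1(u), x2(u))
f_x2 = x1(u)**2            # ∂f/∂x2 at (x1(u), x2(u))
rhs = f_x1 * d(x1, u) + f_x2 * d(x2, u)
assert abs(lhs - rhs) < 1e-6
```

Both sides agree to finite-difference accuracy; the disagreement in the thread is over presentation, not over the identity itself.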

But you still haven't told me what is WRONG with it. All you have done is argue against it as a matter of TASTE.

That's like arguing that one shouldn't use a screwdriver to open the lid of a paint can, because the screwdriver was designed to be used in a certain way. You seem to forget the important point here: it works!

That phrase "it works" has always been severely underrated. Yet it is THE most powerful argument there is. As long as the usage of the "tool" does not break any "laws" (I didn't use the screwdriver to murder someone to get to that paint can to paint my house), then the claim that it works validates its usage. That is why I asked you what is mathematically wrong with it. I'm not talking about "canceling" 0/0, which would be breaking mathematical "laws" and in fact results in something that doesn't work (look at the Fraunhofer diffraction pattern). I'm talking about the shortcuts in notation that you are highlighting here.

Most physicists use mathematics as a tool, whether you like it or not. We need to know what the tools are, and how to use them correctly within their limits of validity. Once we know that, HOW we use them really shouldn't be a sore point for mathematicians. Physicists really do not have the patience nor the inclination to focus on the "tools". If we did, what would be the use of mathematicians?

In the experimental facility that I work at, we have a stainless steel plate that's mounted on a low ceiling by 4 bolts. The ends of the bolts stick down from the plate. While it is high enough for most of us not to hit it, someone around 6' tall or taller could hit his/her head on it. The safety regulations require us to do something about it, and this could include shaving the bolts to a shorter length, putting a cover over the whole contraption, etc... But we came up with something easier. Puncture 4 tennis balls, and stick the protruding ends of the bolts into the tennis balls. The bright, fluorescent color provides advance warning to anyone approaching that area, and even if someone hits his/her head on it, it would not hurt. It was a quick, easy, and CHEAP solution to a problem. Yet we had used something for what it wasn't meant to be used for. Even our safety inspector was impressed. Why? Because IT WORKS!

The ability of physicists to adapt the mathematics to suit their needs, without really violating any mathematical laws, shows their creativity and imagination to solve the problem at hand. I'm sure this is done in many other fields as well, especially in engineering. I really don't understand why this would be a subject of ridicule or to be laughed at. In fact, I would think that the ability to take something and use it in a different manner while still maintaining its validity, is something that should be admired.

Zz.
 
  • #32
Thrice said:
Ah physics, wherein every function equals the first term of its Taylor series.
The sum of the first and second terms, no?
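For what it's worth, the joke has a kernel of truth: near the expansion point the first term or two really are excellent. A quick stdlib check for sin(x), where the point x = 0.3 is an illustrative choice:

```python
import math

def taylor_sin(x, terms):
    """Partial sum of the Taylor series of sin about 0."""
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(terms))

x = 0.3
one = taylor_sin(x, 1)   # just x
two = taylor_sin(x, 2)   # x - x^3/6
assert abs(one - math.sin(x)) < 5e-3   # already good near 0
assert abs(two - math.sin(x)) < 1e-4   # much better with the next term
```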
 
  • #33
jimmysnyder said:
It's defined that way for convenience, not as the result of any proof. Surely a mathematician would be justified in laughing at someone who tried to prove definitions.

Are you sure that this is not derived via the Gamma function?

Zz.
 
  • #34
My favorite story about such laughings comes from Feynman (who else?). I'm mangling his story, but here goes:

A small group of math majors was looking at a french curve and pondering its shape. A french curve is a piece of plastic (or other material) with an edge that looks somewhat spiral. It's used by draftsmen in drawing curved lines. Because of the spiral edge, it approximates arcs of any radius in a range. The mathematicians knew this, but pondered why it had the particular shape that it had. Was it optimal in some way? Feynman told them that this particular shape had the following property: no matter what orientation you hold it in, the tangent at the bottom is parallel to the ground. The math majors satisfied themselves that this was true and thanked him for his help.

What he told them would be true of any smooth curve. That's elementary calculus: the slope of a curve at an extremum is zero.
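The property is easy to confirm numerically for an arbitrary smooth curve; the particular curve below is an illustrative choice:

```python
import math

def slope(g, t, h=1e-6):
    """Central finite-difference approximation to g'(t)."""
    return (g(t + h) - g(t - h)) / (2 * h)

# Any smooth curve will do: at its lowest point the tangent is horizontal.
curve = lambda t: (t - 1.3)**2 + 0.5 * math.sin(t)

# Crude grid search for the minimum on [-2, 4]
ts = [i / 1000 for i in range(-2000, 4000)]
t_min = min(ts, key=curve)

# The slope at the grid minimum is (nearly) zero
assert abs(slope(curve, t_min)) < 1e-2
```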
 
  • #35
ZapperZ said:
Are you sure that this is not derived via the Gamma function?

Zz.
Yes. 0! is defined to be 1. The Gamma function is a latecomer.
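Both facts are easy to confirm with the standard library: 0! = 1 by definition (the empty product), and the Gamma function, the later generalization, satisfies Γ(n + 1) = n!:

```python
import math

# 0! is defined as 1 (the empty product); the Gamma function,
# a later generalization, agrees with the factorial on integers.
assert math.factorial(0) == 1
for n in range(6):
    assert math.isclose(math.gamma(n + 1), math.factorial(n))
```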
 
  • #36
The factorial function is what it is. What is a "definition", "derivation", "calculation", "demonstration", "theorem", or whatever is merely an artifact of the way it's being presented.

Axiomatizations of theories are like spanning sets for vector spaces -- they are useful for presentation and calculation, but otherwise irrelevant. (In fact, both are examples of the more general notion of a "set of generators".)

(The above is for a developed theory. Research is a different story)
 
  • #37
Do physicists often get their answers correct? Funny, I'd expect a remotely tangible and unified theory of reality if that were the case.
There is little unification in physics, which on the other hand is a strong point of mathematics.

Indeed, physicists and engineers can be amusing in their use of mathematics, for the trivialities if nothing else. However, around the 1900s any pure mathematician would have been appalled by the manner in which physicists used mathematics, just as rich new ideas were being elaborated upon. Hilbert sought to change this, and it is through his part as much as any other's (Poincaré, von Neumann, etc.) that we have such stable (if not absolutely descriptive) theories of quantum phenomena.
In physics, particularly elementary physics, rigour takes second place to something they call intuitive obviousness. This leads to indecent representation of approximation (where such simple things as the constant use of the equality sign in place of approximation make me want to throw up) and lax (to the point of being incorrect) definitions.
The importance of mathematics in the proverbial 'equation' is that when we finally understand a significant part of a theory, we can give a rigorous mathematical representation of it.
Mathematics is a judge of our understanding of something, regardless of what an intuitive grasp may mean.
 
  • #38
Dragonfall said:
I don't laugh at the way they do math. I'm terrified at it. Think about it the next time you're crossing a bridge or ride an elevator.

Things don't fall down all that often :smile:
Most of all that is improper maintenance.

Aren't you glad that physicists have nothing to do with the operation of the universe :smile:
 
  • #39
yasiru89 said:
Do physicists often get their answers correct? Funny, I'd expect a remotely tangible and unified theory of reality if that were the case.
There is little unification in physics, which on the other hand is a strong point of mathematics.

Indeed, physicists and engineers can be amusing in their use of mathematics, for the trivialities if nothing else. However, around the 1900s any pure mathematician would have been appalled by the manner in which physicists used mathematics, just as rich new ideas were being elaborated upon. Hilbert sought to change this, and it is through his part as much as any other's (Poincaré, von Neumann, etc.) that we have such stable (if not absolutely descriptive) theories of quantum phenomena.
In physics, particularly elementary physics, rigour takes second place to something they call intuitive obviousness. This leads to indecent representation of approximation (where such simple things as the constant use of the equality sign in place of approximation make me want to throw up) and lax (to the point of being incorrect) definitions.
The importance of mathematics in the proverbial 'equation' is that when we finally understand a significant part of a theory, we can give a rigorous mathematical representation of it.
Mathematics is a judge of our understanding of something, regardless of what an intuitive grasp may mean.

Then maybe you'd like to tackle the mathematics of many-body physics without using any kind of intuition whatsoever and no amount of approximation. When you can do that and derive something similar to Fermi liquid theory, or superconductivity, then maybe what you said here has some validity.

Zz.
 
  • #40
ZapperZ said:
Then maybe you'd like to tackle the mathematics of many-body physics without using any kind of intuition whatsoever and no amount of approximation. When you can do that and derive something similar to Fermi liquid theory, or superconductivity, then maybe what you said here has some validity.

Zz.
Maybe I'm just tired, but this doesn't sound like it relates at all to what yasiru89 said. He didn't say that one shouldn't approximate -- he's complaining about the practice of not acknowledging that an approximation was used. Nor did he say that one shouldn't use one's intuition.
 
  • #41
Hurkyl said:
Maybe I'm just tired, but this doesn't sound like it relates at all to what yasiru89 said. He didn't say that one shouldn't approximate -- he's complaining about the practice of not acknowledging that an approximation was used. Nor did he say that one shouldn't use one's intuition.

Er.. but that doesn't make any sense either. Where exactly are such approximations not mentioned? Just because of some sloppiness in substituting an approximation into an equality? That's it?

I think we need to give these physicists A BIT of credit in terms of intelligence to know when such approximations are being made. When I read about mean-field theory, I know perfectly well what kind of approximation is being made, even when the potential is written as an equality.

If this is what is making someone cringe to the point of throwing up, I think the problem here is elsewhere and not with the physics/physicists.

Zz.
 
  • #42
It seems necessary to spell out 'definition' to most in this forum - it is key to basis and hence understanding.
Zz, Hurkyl - what jostpuur is trying to say is: 'did the physicists pull the infinitesimals out of their a&$#?'
A combinatorial proof of 0! = 1 is elementary and even obvious (think of how many ways you can pick 0 objects: exactly one). Therein it justifies itself as more than a convenience. Time to hit the books again. The fact that this coincides perfectly with the generalised form, the Gamma function, merely establishes a connexion that allows Gamma to encompass the factorial.
When Hurkyl says 'the factorial is what it is', what the hell does he mean? Or is that unimportant?
When validities are overstretched, your 'tools' need attention, and lack of it is by no means 'creative'; it is a blunder of extreme proportions.
Mathematicians really have tried, rectifying certain results by reducing them to the rigorous theory of limits (most of those that failed have been rejected) and bringing a representation-theoretic basis, via regularisations, to renormalisation, but while these may be exceptional cases (the uses in number theory and groups), most others aren't really our problems.
 
  • #43
I find finesse and subtlety lost. If this is the manner in which physicists treat things aren't ALL your efforts wasted if the universe(or multiverse) were a subtle creature?
 
  • #44
yasiru89 said:
It seems necessary to spell out 'definition' to most in this forum - it is key to basis and hence understanding.
Zz, Hurkyl - what jostpuur is trying to say is: 'did the physicists pull the infinitesimals out of their a&$#?'
A combinatorial proof of 0! = 1 is elementary and even obvious (think of how many ways you can pick 0 objects: exactly one). Therein it justifies itself as more than a convenience. Time to hit the books again. The fact that this coincides perfectly with the generalised form, the Gamma function, merely establishes a connexion that allows Gamma to encompass the factorial.
When Hurkyl says 'the factorial is what it is', what the hell does he mean? Or is that unimportant?
When validities are overstretched, your 'tools' need attention, and lack of it is by no means 'creative'; it is a blunder of extreme proportions.
Mathematicians really have tried, rectifying certain results by reducing them to the rigorous theory of limits (most of those that failed have been rejected) and bringing a representation-theoretic basis, via regularisations, to renormalisation, but while these may be exceptional cases (the uses in number theory and groups), most others aren't really our problems.

If it is a blunder of extreme proportions, and it is so common that it is making you sick, then you should by now have a ton of papers and rebuttals to all those physics papers that actually are based on such mathematical errors. Am I correct?

Or are these, as I've said before, not really mathematical errors, but rather simply a matter of TASTE? Again, all I've seen in this thread is not actual mathematical errors, which even we as physicists don't want to make, but rather sloppiness in usage or notation. This is what is making you want to throw up? Really?!

If these are mathematical errors, and I were a mathematician, I would not have waited a second to write a rebuttal to such things. Look at the Laughlin wavefunction in his Nobel Prize-winning PRL paper on the fractional quantum Hall effect, and I can point out to you several "approximations" that he made based on nothing more than intuition. And yes, he wrote those as "equalities", thank you. I'd like to see you write a rebuttal arguing the "extreme blunder" in that paper.

Again, as I've suspected, my earlier argument that IT WORKS, is completely being ignored, as if that means absolutely nothing.

Zz.
 
  • #45
The "blunder of extreme proportions" is the general disdain for any sort of mathematical sophistication. Frankly, I'm dumbfounded that you would actually praise someone who gave a lecture like the one jostpuur described. This isn't "adapting mathematics to suit their needs" -- this is "those weirdos down the hall have known all about this stuff for a long time, but by golly I'm going to make you guess at what's going on..." and quite probably "...because I don't know what's going on".


Again, as I've suspected, my earlier argument that IT WORKS, is completely being ignored, as if that means absolutely nothing.
It doesn't mean as much as you seem to think it means. Just because something works doesn't mean it's a good way of doing things. Eschewing any sort of mathematical sophistication impairs students because they have to learn many ideas through osmosis, impairs experts because they can't adequately convey their intuition to others or to themselves, and impairs the entire subject because it deters experts in other fields from pursuing an interest.
 
  • #46
Sloppiness in notation, and more importantly in definition, simply cannot be overlooked. I'm not telling you that you should stop approximating; you should, for most realistic purposes - that's why Poincaré bothered to introduce a rigorous theory of asymptotics, and why the big-O notation is widely used. It is not at fault. However, while

f(x) = a + bx + O(x^2)

is true,

f(x) = a + bx is simply incorrect, and
f(x) ~ a + bx is more proper. If, say, a + bx = g(x), then obviously
f(x) ~ a + bx = g(x), and one may continue working on things from there.

Sloppiness is indeed a serious thing, and the simple observation that 'it works' does not lend these methods any basis.

We underestimate the importance of language and notation. While a notion is the most important thing in itself and should not find notation a hurdle (as per Gauss), if communication is our precept (or post) then it is of utmost importance.

And I am very sorry; while I trust the paper was most interesting and probably not exactly 'faulty' in its computations, it just isn't my problem. I am simply saying, as some others are, that when we have something we need to say where it came from, with 'definition', and lack of that coupled with sloppy use of notation can really mess things up - for you, not me.
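The distinction drawn above can be checked numerically: for a concrete f, the error of the truncation a + bx really does scale as x², which is exactly what the O(x²) remainder asserts. A quick sketch with f(x) = e^x (an illustrative choice, where a = b = 1):

```python
import math

# f(x) = e^x, truncated to a + bx with a = b = 1.
# The claim f(x) = 1 + x + O(x^2) says the error is bounded by C*x^2
# for small x; so halving x should roughly quarter the error.
def err(x):
    return abs(math.exp(x) - (1 + x))

e1, e2 = err(0.1), err(0.05)
assert 3.5 < e1 / e2 < 4.5   # ratio ~ 4, consistent with O(x^2)
```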
 
  • #47
Maybe the best way to think about this discussion is that
the physicist is trying to tell a story about the real world [as best as he or she can] with the aid of a mathematical language.

It may be the case that the physicist's story isn't told as precisely as it could be... but it impatiently gets the key points and the main conclusions [arguably, "the good stuff"] across. (The imprecision also provides an opportunity for others so inclined to fill in the gaps.) On those [rare?] occasions when the key points and the conclusions are led astray by sloppy misapplication of mathematics, certainly someone will come along and correct it... leading to a revised story with a cautionary tale.
 
  • #48
The physicist does not have the desire or the patience to rigorously establish that the math used is correct. If the math turns out wrong, an experiment will eventually come along to demonstrate this. And if it turns out to be correct, a mathematician will eventually come along to demonstrate this. :biggrin: [1]

Much more often than not, math that has been used based on an intuition for its soundness has eventually been shown to be sound. I think this way of dealing with the math has been helpful in keeping the theory progressing at the pace that it has. If physicists stopped short before differentiating every step function, things would move along a lot more slowly.
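The step-function example is a good illustration: the physicist's "derivative of the step is a delta function" was later made honest by distribution theory, where the derivative of a smoothed step is a narrow bump of unit area. A minimal numerical sketch (the logistic smoothing, its width, and the grid are all illustrative choices):

```python
import math

# Smooth the step with a logistic of width eps; its derivative is a
# bump concentrated near 0 whose total integral is 1 regardless of eps --
# the "delta function" picture, recovered as a limit.
def step(x, eps):
    return 1 / (1 + math.exp(-x / eps))

def dstep(x, eps=0.01, h=1e-7):
    """Central finite-difference derivative of the smoothed step."""
    return (step(x + h, eps) - step(x - h, eps)) / (2 * h)

xs = [i * 1e-4 for i in range(-10000, 10001)]   # grid on [-1, 1]
integral = sum(dstep(x) for x in xs) * 1e-4     # Riemann sum
assert abs(integral - 1) < 1e-3                 # unit area, as a delta should have
```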

[1] recalling the words of my many-body theory prof
 
  • #49
yasiru89 said:
Sloppiness in notation, and more importantly in definition, simply cannot be overlooked. I'm not telling you that you should stop approximating; you should, for most realistic purposes - that's why Poincaré bothered to introduce a rigorous theory of asymptotics, and why the big-O notation is widely used. It is not at fault. However, while

f(x) = a + bx + O(x^2)

is true,

f(x) = a + bx is simply incorrect, and
f(x) ~ a + bx is more proper. If, say, a + bx = g(x), then obviously
f(x) ~ a + bx = g(x), and one may continue working on things from there.

Sloppiness is indeed a serious thing, and the simple observation that 'it works' does not lend these methods any basis.

We underestimate the importance of language and notation. While a notion is the most important thing in itself and should not find notation a hurdle (as per Gauss), if communication is our precept (or post) then it is of utmost importance.

And I am very sorry; while I trust the paper was most interesting and probably not exactly 'faulty' in its computations, it just isn't my problem. I am simply saying, as some others are, that when we have something we need to say where it came from, with 'definition', and lack of that coupled with sloppy use of notation can really mess things up - for you, not me.

That is why I asked you to look at many-body physics and derive Fermi liquid theory or superconductivity. How exactly does one deal with a gazillion particles, all interacting?

When we write the potential for such a many-body problem, even though we write an equality, no one in their right mind would ever, EVER, consider it exact. Maybe mathematicians who have no clue what "many-body physics" is may get all knotted up when we write such a thing, but no physicist would be fooled into thinking that anything in there is "exact". That's like asking someone to keep all the significant figures that one gets out of a calculator!

When one deals with QFT and all those Feynman diagrams, one HAS to decide what LEVEL of interactions to include. Thus the self-energy term, for example, has to be truncated using some criterion. Often, this criterion is based on the degree of accuracy to which we can measure. It is useless to include all the high-order interactions when there's no way those can be measured, or when such interactions produce no significant effect on anything. Thus, we can easily be justified in simply writing the self-energy as an equality without even having to acknowledge that we ignore the higher-order terms. Simply defining how one truncates the interaction is more than sufficient. That's why we show Feynman diagrams in the first place! This is not sloppiness. This is physics!
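The truncation-by-resolution idea can be put in a toy sketch: here a geometric series stands in, very loosely, for a perturbative expansion in a small coupling, and terms are kept only until they fall below a chosen "experimental" resolution. All names and numbers are illustrative:

```python
# Toy model of truncating a perturbative expansion: the "exact" answer
# 1/(1 - g) has the series 1 + g + g^2 + ...; we keep terms only until
# the next one falls below a chosen experimental resolution.
def truncated(g, resolution):
    total, term, order = 0.0, 1.0, 0
    while term >= resolution:
        total += term
        term *= g
        order += 1
    return total, order

g = 0.1                      # small coupling
exact = 1 / (1 - g)
approx, order = truncated(g, resolution=1e-6)
assert order == 7                    # terms g^0 through g^6 kept
assert abs(approx - exact) < 1e-5    # accurate to better than the resolution
```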

Again, if this is such a major problem with the mathematics, write a rebuttal. All I have seen so far is nothing more than a matter of taste. If these are errors in mathematics, then the results that are obtained should not be correct, because a serious logical error has been made. I haven't seen such a thing being brought up in journal rebuttals. Either the mathematicians who have problems with these are not speaking up, or they simply like to make fun of physicists but cannot put their money where their mouths are. For something that is purportedly very prevalent, there seems to be nothing in the journals that points to such glaring errors. Why is that?

Zz.
 
  • #50
Math and physics shouldn't really be compared like this.

They're two different things. To physics, math is a tool, not a holy grail. We use mathematical language because we can express things to each other that way. We have other tools though, like experiment and qualitative analysis.

Satyendra Nath Bose only stumbled upon Bose-Einstein statistics because of a mathematical mistake he made during a lecture that ended up matching experimental results.
 
  • #51
Pythagorean said:
They're two different things. To physics, math is a tool, not a holy grail. We use mathematical language because we can express things to each other that way. We have other tools though, like experiment and qualitative analysis.
At the risk of stressing the analogy to its limits -- the problem is that there is a trend to reject the shiny new nailgun sitting on the next shelf in preference to using an ancient, rusty hammer that's barely held together with chewing gum and duct tape. They both do an adequate job of putting a nail into wood, but the hammer is more cumbersome and could fall apart unexpectedly.


ZapperZ said:
You seem to forget the important point here: it works!
No it doesn't; every time a prominent physicist says "nobody understands quantum mechanics", that's a strike against your thesis. Quantum mechanics has been around for over a century, and has very simple toys requiring only very elementary mathematics; there is no excuse for its simplest and most elementary notions to still be considered mysterious and unintuitive by its experts!

(of course -- I hope that my knowledge of prevailing opinion is behind the times, and that things have improved)
 
  • #52
Hurkyl said:
No it doesn't; every time a prominent physicist says "nobody understands quantum mechanics", that's a strike against your thesis.

I won't repeat my argument against this. You can read it at http://physicsandphysicists.blogspot.com/2007/04/no-one-understands-quantum-mechanics.html .

This attack against physicists now has taken a very strange turn far from the original post.

Zz.
 
  • #53
zoobyshoe said:
"What do you mean, funny? Let me understand this cause, I don't know maybe it's me, I'm a little ****ed up maybe, but I'm funny how? I mean, funny like I'm a clown, I amuse you? I make you laugh... I'm here to ****in' amuse you? What do you mean funny, funny how? How am I funny?"

I would not mess with the Newton.

animalcroc said:
I think that's way overrated.
"rigor" in math came after the giants who developed modern math in the first place, and they're "giants".

There's no way of knowing what the giants would say in this thread.

As one can guess from my OP, IMO the physicists' way could stand a little more criticism than it is receiving now. Even though it is true that the details of rigorous proofs can be complicated, the ideas are usually simple. I have noticed that when physicists explain mathematical things with their "physical arguments", the ideas can get quite incomprehensible. I'm not criticizing the use of approximations. I'm criticizing the drawing of illogical conclusions and getting people used to them. IMO the physicists' way is "not working as well as it looks from the inside". Well, I know that's a student's opinion against a huge number of scientists, but here's one related quote:

http://aleph0.clarku.edu/~djoyce/hilbert/problems.html

Besides it is an error to believe that rigor in the proof is the enemy of simplicity. On the contrary we find it confirmed by numerous examples that the rigorous method is at the same time the simpler and the more easily comprehended. The very effort for rigor forces us to find out simpler methods of proof. It also frequently leads the way to methods which are more capable of development than the old methods of less rigor.
 
  • #54
I can't really put it any better than Hilbert. And Zz - I don't aim to stereotype the entire landscape of physics, merely to remark on the general trend hovering over much of today's physics (as far as arXiv goes, anyway - my readings are also very occasional). This is all the more evident when someone can say (and I quote) "..physics...where every function equals the first few terms of its Taylor expansion.." and people aren't appalled by it.

The 'giants', as you call them, were indeed quite 'rigorous' for their times. Euler's foresights into the realms of rigour weren't all that well considered (perhaps because he sometimes picked up a habit of Leibniz and brought metaphysics into the mix). Even in his doodlings with divergent series he hints at interpreting his results as representations - a basis not only of the mathematical theory but of things as physically 'profound' as quantum field theory.
In fact I'd say Euler paved the way towards the rigour so vehemently stuck to by Cauchy et al. His work hinted at definition instead of thinking about something as 'that' thing (which gave things an aura of mystery and divinity in their very existence and deterred certain individuals from serious consideration).

Also, at the risk of over-assumption, I believe what Hurkyl meant was not in antithesis to your rather well-played-out argument, ZapperZ. Indeed, we fully 'understand' a theory when it can be put in completely mathematical terms (an intuitive grasp may be helpful, but it isn't everything - certainly not absolute comprehension), and we actually have that to a rather large extent in quantum theories. What 'nobody understands quantum mechanics' really connotes is a general reluctance to accept its implications, something one could relate to Einstein. Also, quantum theories don't exactly fit perfectly with observations, do they (taking into account uncertainty and limited measurement ability)? This is because it is an asymptotic theory, as are so many in physics. Then there's the very word 'mysterious'; this need not imply that you know absolutely nothing of a theory or its application, only that one has trouble coping with its implications - where intuition ceases to be of very great use: the collapse of determinism.
 
  • #55
Oh, I refer to your 'rather well played out' argument in the URL you provided.
 
  • #56
I really don't see this going anywhere, and it certainly isn't productive from my perspective. Why?

1. I still haven't gotten any concrete example on where this really is a huge problem, to the extent that it can induce physical sickness.

2. I also haven't seen any concrete evidence that the physics that was done has been compromised due to such "mathematical practice". All I have seen are hand-waving speculation that such-and-such can happen and shouldn't happen. I've even brought up specific physics for someone to point out where exactly is the fault in the end result. I had no takers.

3. Because of #2, all the arguments so far have been nothing more than a matter of tastes. So you like the color green better while I like purple.

4. This thread now has become an issue on ... er... quantum mechanics? And what it actually means?! All because of some seemingly sloppy differentials?

I can tackle a lot of issues when the issue itself is clearly presented, but not in this thread that seems to meander from an outright attempt at dissing physicists to the validity of quantum mechanics. Up to this point, no one has managed to actually pin-point faulty physics that is a result of the mathematics being used, something that I've asked for repeatedly.

I'm sorry that I actually wasted any effort and intruded into this thread in the first place.

Zz.
 
  • #57
I found the article I was looking for:

http://www.theorie.physik.uni-muenchen.de/~serge/why_physics_is_hard.pdf


When I think of "not understanding quantum mechanics", I think of opinions like "the delayed-choice quantum eraser is mysterious". This is certainly a popular opinion, but I don't know how much of it is due to non-physicists, how much is due to physicists trying to popularize the subject, and how much is due to physicists actually believing it.

Of course, I would like to believe that most physicists[1] find this, and other such things, to be natural rather than mystifying -- would I be correct to interpret your essay in that fashion?

[1]: Note that I'm not asking about the "very best quantum physicists", but your typical quantum physicist. I should probably even include good graduate students of the subject.
 
  • #58
Thanks for that article. I can certainly appreciate being confused about the mathematics in physics courses, heh.

However, I don't know how much this has to do with the mathematics in physics being worth complaining about. It seems to be more about how physics may need some better math instruction, which I have some ideas about, but that's for a different thread.
 
  • #59
Mororvia said:
Thanks for that article. I can certainly appreciate being confused about mathematics in physics courses heh.

However, I don't know how much this has to do with the mathematics in physics being complainable. Its seems more to be about how it may need some better math instruction which I have some ideas about, but that's for a different thread
The opening poster's differential example is very much in the spirit of the Green's function example in the article I linked (but much more elementary). I think what yasiru89 was saying about replacing rigour with "intuitive obviousness" is the same sort of thing.
 
  • #60
Integral said:
AFAIK nearly all mathematicians laugh at the way Physicists do math. Of course, not nearly as hard as they laugh at the way engineers do math.
But this is so unsurprising that it is not discussed outside of Math dept coffee rooms. :smile:

Oh man, have you seen much of continuum mechanics? It's way cool. It's an engineering discipline with amazingly complicated math. This doesn't say much, but I'm quite impressed by the formulation. You can take this subject to a mathematical extreme.
 
