#1 · brocks
I'm an aspiring physicist who is just starting out with freshman courses, so let me apologize in advance if I say something really dumb, or unintentionally offend someone.
I found this forum several weeks ago, and in my reading of posts from mathematicians, I got the idea that "real math," as opposed to computation, begins with analysis courses.
I was reading my intro calculus text tonight, and it was proving that (x^r)' = rx^(r-1) for any real r, as opposed to just integer or rational r. The proof began by assuming that x^r is differentiable, and that the laws of logarithms hold for real exponents.
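(For context, the gap the poster noticed is real, and the usual way an analysis course closes it is to *define* x^r for real r and x > 0 as e^(r ln x), after first constructing the exponential and logarithm rigorously. Differentiability is then a consequence of the chain rule rather than an assumption. A sketch, assuming exp and ln and their derivatives are already established:

```latex
\frac{d}{dx}\, x^r
  = \frac{d}{dx}\, e^{r \ln x}
  = e^{r \ln x} \cdot \frac{r}{x}
  = x^r \cdot \frac{r}{x}
  = r x^{r-1}, \qquad x > 0.
```

So the "hard part" lives entirely in constructing exp and ln and proving their basic properties; once that is done, the power rule for real exponents is a two-line calculation.)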
It sort of hit me that those were major and important assumptions, and would be hard to prove. And that an analysis class is where you would learn to prove them. And that I didn't need to know how to do that --- I see that it is very important to prove those assumptions, but I am perfectly happy to accept that pure mathematicians have proved them.
So am I correct in thinking that taking an analysis class to prove such assumptions would be akin to spending time doing all kinds of experiments to prove that energy and momentum really are conserved, i.e. not the best use of my time?
I see how important it is that *somebody* does it, but I'm thinking that once competent people have done it, there is no need for me to repeat it, at least not for hours and hours. It would be nice to do if I had infinite time, but it would not help my understanding of physics as much as spending the same amount of time on actual physics.
Obviously, I can see that learning rigorous proof techniques can help with any subject, and obviously I can see that going too far with "let somebody else do it" can result in superficial knowledge, but I'm assuming that the people who run universities make sure that the physics curriculum includes enough rigor and foundational material to handle that.
Is this a correct and practical attitude to take *for a physicist*, or am I missing something important?