## I am having problems with lots of rigor.

Hello everyone,

Essentially, I am looking for advice on what to do and how to improve or cope. It is my first time learning calculus, and I am working through Spivak's Calculus. For the first few chapters (up through derivatives and the fundamental theorem of calculus) the problems weren't that hard. Recently (i.e., in the chapter on calculating integrals) the problems have gotten much more difficult, and I cannot solve the ones that involve anything beyond ordinary computation.

In general, I'm having a problem with mathematical rigor: I find it difficult and unmotivating to prove things that are obvious to me. Take the Cauchy-Schwarz inequality: it seems obvious that the dot product of two vectors is less than or equal to the product of their lengths, but proving it (at least for me, not knowing linear algebra) is nonintuitive, and I can't remember the proof. Ultimately, I feel like I'm losing confidence in my ability to do math.

By the way, I'm a high school student going into junior year. What advice do you have? Should I just move on to physics? My intuition behind the math I know is very good.

 Is your main interest in math or physics? If it's in physics, don't worry about rigor. There are no rigorous math proofs in physics.
Blog Entries: 8 Recognitions: Gold Member Science Advisor Staff Emeritus
 First of all, if your only goal is to do physics then you don't need to know Spivak's calculus. Spivak's calculus is quite a difficult math text and is more real analysis than calculus. You should do Spivak if your interest is in learning rigorous math and the foundations of the math you use. If your only interest is in solving physics questions, then Spivak is not necessary.

It is true that rigorous math texts sometimes focus on proving obvious things. If you feel that this is unnecessary, then you're missing the whole point of mathematics. Mathematics starts from a system of axioms and tries to derive everything else from them. For example, Spivak uses only 13 axioms in his entire book and derives all of calculus from them. The point of proving elementary things is to show that our axioms are good enough.

For example, it should be obvious that $f(x)=x^2$ is continuous, and the rigorous epsilon-delta definition of continuity can be used to show that it is. If we were not able to show that f is continuous using epsilon-delta, that would tell us there is something wrong with our definition of continuity. We did not invent epsilon-delta to prove that silly functions like $f(x)=x^2$ are continuous; we invented it to prove more advanced results that aren't intuitive. The whole point of showing that f is continuous is to check that our definitions correspond with our intuition: that is, that we get the results we intuitively want.

Eventually, there will come a time when you want to prove a mathematical result that might not be intuitively true. If you want to be successful at that, you had better have the correct axioms and definitions. One way to make sure you have the correct machinery is to prove the elementary results. The same goes for Cauchy-Schwarz: it is a result we intuitively want to be true. If we were not able to prove it, then we would need different axioms.
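To make that concrete, here is a sketch of the epsilon-delta argument that $f(x)=x^2$ is continuous at an arbitrary point $a$ (a standard exercise, not quoted from any post in this thread): given $\varepsilon > 0$, set $\delta = \min\left(1, \frac{\varepsilon}{2|a|+1}\right)$. Then $|x-a| < \delta$ implies $|x+a| \leq |x-a| + 2|a| < 2|a|+1$, so $$|x^2 - a^2| = |x-a|\,|x+a| < \delta\,(2|a|+1) \leq \varepsilon.$$ The definition delivers exactly the verdict intuition demands, which is the point being made above.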
Also, Cauchy-Schwarz isn't as obvious as it may appear. Do you think that $$\int_a^b f(t)g(t)dt \leq \sqrt{\int_a^b f(t)^2dt}\sqrt{\int_a^b g(t)^2dt}$$ is obvious? I don't think it's very obvious. Nevertheless, its proof is very analogous to the proof of the Cauchy-Schwarz inequality. So if you can prove an elementary result, then maybe the proof of that elementary result can be used to prove more advanced statements like this integral version of Cauchy-Schwarz.
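For what it's worth, the standard proof does work the same way in both settings (sketched here for continuous $f, g$ on $[a,b]$; this is a well-known argument, not taken from the thread): for every real $\lambda$, $$0 \leq \int_a^b \big(f(t) - \lambda g(t)\big)^2\,dt = \int_a^b f(t)^2\,dt - 2\lambda\int_a^b f(t)g(t)\,dt + \lambda^2\int_a^b g(t)^2\,dt.$$ A quadratic in $\lambda$ that is never negative has discriminant $\leq 0$, and that discriminant condition is precisely $$\left(\int_a^b f(t)g(t)\,dt\right)^2 \leq \int_a^b f(t)^2\,dt \int_a^b g(t)^2\,dt.$$ Replacing the integrals by finite sums gives the vector version word for word.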

I like math, and physics as well. I just don't really care too much about the precision that is required in analysis, nor am I good at dealing with it.

Essentially the problem comes down to this:

In trying to find the area of the surface obtained by rotating the graph of f, from a to b, about the x-axis (http://www.math.wpi.edu/Course_Mater...rev/node1.html), you can partition the interval into $$\{a=t_0, t_1, \ldots, t_n = b \}$$; then, using some simple geometry (which Spivak considers "fudging"), you can approximate the surface area by the surface areas of successive frustums, one for each interval $$[t_{i-1},t_i]$$. The sum is

$$\pi \sum_{i=1}^n [f(t_{i-1}) + f(t_i)]\sqrt{[f(t_i) - f(t_{i-1})]^2 + (t_i - t_{i-1})^2} = \pi \sum_{i=1}^n [f(t_{i-1}) + f(t_i)]\sqrt{f'(x_i)^2 +1}(t_i - t_{i-1})$$

This is justified by the mean value theorem. I'm sure that you could use uniform continuity and then show that this is a Riemann sum of some sort, and get to the result. But for me, it's totally satisfactory (and much more efficient) to just say that as $$t_i - t_{i-1}$$ gets small, $$f(t_{i-1}) + f(t_i)$$ is essentially equal to $$2f(t_i)$$, and similarly for the $$f'(x_i)^2$$, and the sum becomes $$2\pi\int_a^b f(x)\sqrt{f'(x)^2 +1}\, dx$$.
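As a sanity check on the "fudged" limit above, here is a small numerical sketch (my own illustration, not from Spivak; `frustum_surface_area` is a hypothetical helper name): it sums the frustum areas directly and compares against a known closed form. For the cone $f(x) = x$ on $[0,1]$ each frustum is exact, so the sum should equal the cone's lateral area $\pi\sqrt{2}$ for any partition.

```python
import math

def frustum_surface_area(f, a, b, n):
    """Approximate the area of the surface of revolution of f about the
    x-axis by summing the lateral frustum areas pi*(r1 + r2)*slant over
    a uniform partition of [a, b] into n subintervals."""
    t = [a + (b - a) * i / n for i in range(n + 1)]
    total = 0.0
    for i in range(1, n + 1):
        r1, r2 = f(t[i - 1]), f(t[i])
        slant = math.sqrt((r2 - r1) ** 2 + (t[i] - t[i - 1]) ** 2)
        total += math.pi * (r1 + r2) * slant
    return total

# Cone f(x) = x on [0, 1]: frustums reproduce the cone exactly, and the
# exact lateral area is pi * sqrt(2).
approx = frustum_surface_area(lambda x: x, 0.0, 1.0, 100)
```

For curved profiles the frustum sum is no longer exact at finite n, but refining the partition drives it toward the integral formula, which is exactly the limit argument in the post.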

Does this mean that math isn't for me? Or does this just mean I'm not as into analysis? Does this same kind of thing carry over into other domains like algebra/combinatorics?

 Quote by micromass First of all, if your only goal is to do physics then you don't need to know Spivak's calculus. [...] So if you can prove an elementary result, then maybe the proof of the elementary result can be used to prove more advanced stuff like this integral version of Cauchy-Schwarz.
Thank you for the response, Micromass. I understand what you are saying - I guess I just need to look at more motivating examples, such as the integral version of the Cauchy-Schwarz inequality. For me, it is not obvious either.

 Quote by micromass Also, Cauchy-Schwarz isn't as obvious as it may appear. Do you think that $$\int_a^b f(t)g(t)dt \leq \sqrt{\int_a^b f(t)^2dt}\sqrt{\int_a^b g(t)^2dt}$$ is obvious? I don't think it's very obvious. Nevertheless, its proof is very analogous to the proof of the Cauchy-Schwarz inequality. So if you can prove an elementary result, then maybe the proof of the elementary result can be used to prove more advanced stuff like this integral version of Cauchy-Schwarz.
If you make the leap between finite and infinite sums, the connection is not that hard to see.

 Quote by chiro If you make the leap between finite and infinite sums, the connection is not that hard to see.
Well, if you're happy with that explanation...

 Quote by micromass Well, if you're happy with that explanation...
A Riemann sum becomes an infinite "sum" in the limit, just as a valid sum in $l^2(\mathbb{R})$ (little $l$) is an infinite sum. When you go from there to $L^2(\mathbb{R})$ and do all the formal work, the connection remains the same: the template of vector spaces and geometry is reused, but instead of finite sums (and finite-dimensional spaces) you are now dealing with infinite ones, which have special properties (convergence of the inner products is a big one, though the nature of infinity makes things really nasty in other ways too).

 Quote by chiro A riemann sum is an infinite sum, just as a valid sum in l2(R) (little l) is an infinite-sum. [...]
That's a good intuition, but it's not a good rigorous explanation. Not every result about finite sums is true for integrals.
Also, the integrals in $L^2(\mathbb{R})$ are not Riemann sums.
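A concrete instance of that gap (a standard example, not quoted from the thread): if $\sum_n a_n$ converges then $a_n \to 0$, but the integral analogue fails. A function made of spikes of height $1$ and width $2^{-n}$ centered at each integer $n$ satisfies $$\int_0^\infty f(x)\,dx \leq \sum_{n} 2^{-n} < \infty,$$ yet $f(x) \not\to 0$ as $x \to \infty$. So transferring a finite-sum (or series) fact to integrals needs an actual proof, not just the analogy.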

 Quote by micromass That's a good intuition, but it's not a good rigorous explanation. Not every result about finite sums is true for integrals. Also, the integrals in $L^2(\mathbb{R})$ are not Riemann sums.
Yes, true, but even then you can still use the characteristic function to get the right analog (in terms of a "sum", not a Riemann sum so to speak). After all, that's all an integral is: you sum stuff with respect to some measure. So if it's not an area like a Riemann integral, but rather a Riemann-Stieltjes or Lebesgue integral, you find the analog where the sums make the correspondence.

A Riemann integral is a lot easier than the general $L^2(\mathbb{R})$ setting, but you can make the same analogy for the other spaces.

 AlwaysCurious, I wouldn't worry about it. You are a high school junior tackling Spivak on your own. That is quite a high level! If you are finding it difficult, take a break and study something else for a while. What you find boring now may become very interesting to you in the future.

I hated math in high school and did poorly in it, but after I got out of the military 8 years later, I began pursuing a dual degree in Math and Electrical Engineering. I have loved my analysis courses, but in my opinion I enjoyed and benefited from them far more by having a professor guide me through them. Don't discount mathematics from your future just because it's boring right now. Make sure you take an analysis course in college and see how you feel about it after that.

 Quote by chiro If you make the leap between finite and infinite sums, the connection is not that hard to see.
Not to step in on the argument of sorts, but yes: after seeing the integral form of Cauchy-Schwarz, I thought about it, and it should be intuitive in terms of Riemann sums.
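That finite-to-infinite intuition can be sketched numerically (my own illustration; `riemann_cs` and the midpoint sampling are choices made for this sketch, not anything from the thread): apply the finite Cauchy-Schwarz inequality to sample vectors of f and g, scale both sides by $\Delta t$, and both sides become Riemann sums for the integral inequality.

```python
import math

def riemann_cs(f, g, a, b, n):
    """Apply the finite Cauchy-Schwarz inequality to the sample vectors
    (f(t_1), ..., f(t_n)) and (g(t_1), ..., g(t_n)). Multiplying both
    sides by dt (using sqrt(dt)*sqrt(dt) = dt on the right) turns them
    into Riemann sums for the integral version of the inequality."""
    dt = (b - a) / n
    ts = [a + (i + 0.5) * dt for i in range(n)]  # midpoint samples
    lhs = sum(f(t) * g(t) for t in ts) * dt
    rhs = math.sqrt(sum(f(t) ** 2 for t in ts) * dt) * \
          math.sqrt(sum(g(t) ** 2 for t in ts) * dt)
    return lhs, rhs

# For f = sin, g = cos on [0, pi]: the left side approximates 0 and the
# right side approximates pi/2, so the inequality holds with room to spare.
lhs, rhs = riemann_cs(math.sin, math.cos, 0.0, math.pi, 10000)
```

Of course, as micromass points out, the numerics only illustrate the analogy; the limit argument still has to be carried out rigorously to get the integral inequality itself.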

 Quote by srl17 AlwaysCurious, I wouldn't worry about it. You are a high school junior and tackling Spivak on your own. That is quite a high level! [...] Make sure you take an analysis course in college and see how you feel about it after that.
Thank you for the advice - it is something similar to what I'm thinking at the moment. Maybe a love of precision will come with maturity.

Also, to go on the record, I do like math; it's just that recently the demands of Spivak have been a bit much.

 Quote by AlwaysCurious Does this same kind of thing carry over into other domains like algebra/combinatorics?
While micromass' description of why rigour is important is true for all of mathematics, each field has its own distinct flavour. Personally, I have always found analysis to be less intuitive than algebra or combinatorics. Even (point-set) topology, which is really abstracted analysis, was easier for me.

I blame the evil real number system...

Take home message: Don't get discouraged if some parts of math are a bit harder than others. The rigour gets easier as you see the connections between the different fields. This is the mythical "mathematical maturity" that just takes time and work to gain.
