# How much rigor upon first exposure?

1. Aug 7, 2013

### CuriousBanker

I took Calc I in college, but I was an economics/business major, so it was probably a joke like the rest of business school. Anyway, I'm 24 and teaching myself now through a combination of Strang's text, Khan Academy, and various YouTube videos/websites.

So I've been teaching myself for 2.5 months, but in that time I started a new job, so I would say I've really only been at it for 4 weeks; for 6 of those weeks I wasn't really into it because I was busy. So far, I think I have a solid grasp on limits, derivatives (power rule and chain rule), l'Hôpital's rule, optimization, and some integration. Intuitively, most of these things click with me. For instance, regarding the second fundamental theorem of calculus: if f(t) is position and t is time, then f(b) - f(a) is the distance traveled over some time interval, and the derivative is velocity. If you add up the area under the velocity graph, that's every infinitesimal velocity times the amount of time spent at that speed, and that gives you the total distance traveled, which matches the antiderivative. Intuitively it makes perfect sense to me.
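That intuition can even be checked numerically: summing velocity times a small time step over many steps should recover f(b) - f(a). A quick sketch (the position function here is a toy example of my own, not one from the thread):

```python
# A numerical check of the intuition behind the second fundamental theorem:
# position f(t) = t**2, so velocity f'(t) = 2*t.  Summing velocity * dt over
# many small time steps should approximate the net distance f(b) - f(a).

def f(t):
    """Position at time t (toy example)."""
    return t ** 2

def velocity(t):
    """Derivative of f: instantaneous velocity at time t."""
    return 2 * t

a, b, n = 0.0, 3.0, 100_000
dt = (b - a) / n

# Riemann sum: each term is an "infinitesimal velocity" times the short
# time interval dt during which we treat that velocity as constant.
distance = sum(velocity(a + i * dt) * dt for i in range(n))

print(distance)        # approximately 9.0
print(f(b) - f(a))     # exactly 9.0
```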

However, if somebody were to ask me to prove l'Hôpital's rule, or pretty much any of these things, I would crap myself. I haven't had any formal math training in about 6 years, and even then my background is public high school math plus one math class in college.

So my question is, is this a huge problem? I 100% understand the need for rigor in math. My plan is to teach myself calc 1-3, then probability theory, then linear algebra, then ordinary and partial differential equations, then stochastic calculus, then analysis, then number theory. So what I am wondering is: is it okay, on first exposure to "higher level math," to be unable to do proofs and rigor, and just get the intuition? Will I pick up the rigor as I move up in difficulty? Or should I not even continue on this path until I can rigorously prove everything I am doing? Were you able to do rigorous proofs upon your first exposure to calculus?

2. Aug 7, 2013

### Mandelbroth

I was pretty much the same way when I started calculus. To start, I would suggest reading proofs done by other people in order to get an idea of how the concepts work. Once you've read a few proofs, try to prove a theorem or two on your own, until you get confident. To start, you don't need to derive all the concepts for yourself. When you get higher up, you'll see that it's actually more fun to verify results for yourself. The fact that you already understand the necessity of rigor is a huge step in the right direction.

However, and this is important, you can expect a point where the math will become overwhelming. At this point, your best friend will be rereading. When math becomes difficult, it is important to remember that rigor can be more helpful than intuition, because intuition can and will be wrong.

If you have any questions along the way, feel free to visit us. I'd be happy to help, and I'm sure many others would as well.

3. Aug 7, 2013

### CuriousBanker

Thanks for the support. Another example is the extreme value theorem. Intuitively it is extremely obvious to me. However whenever I try to look up a proof, I either just get a bunch of examples (which are not proofs), or the person will state "the proof for this requires a lot of way more advanced mathematics".

So is it okay to just understand and apply it for now, and come back to the proof when doing analysis? Or what?

4. Aug 7, 2013

### CuriousBanker

Also, are there proofs for the trig values, like the sine of a given angle being the same for all similar triangles?

5. Aug 7, 2013

### micromass

Staff Emeritus
Don't worry too much about it. Nobody expects you to be able to prove l'Hôpital's rule at this stage.

The most important part is that you get an intuition for the concepts and that you're able to compute things.

You should be able to prove easy things like what the derivative of $f(x) = x^n$ is, or the substitution rule for integrals. Those are things you should know how to do.
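For the power rule, the standard sketch (for a positive integer $n$) uses the binomial theorem:
$$\frac{(x+h)^n - x^n}{h} = \frac{1}{h}\left(nx^{n-1}h + \binom{n}{2}x^{n-2}h^2 + \cdots + h^n\right) = nx^{n-1} + \binom{n}{2}x^{n-2}h + \cdots + h^{n-1},$$
and every term after the first contains a factor of $h$, so letting $h \rightarrow 0$ gives $\frac{d}{dx}x^n = nx^{n-1}$.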

As for more difficult things, such as l'Hôpital's rule, the mean value theorem, the fundamental theorem of calculus, etc.: I think it's important that you read the proofs and understand them completely. But don't worry about reproducing them, or about proving them yourself.

Later, when you do analysis, you should be able to prove more complicated things. But for now, you should focus on gaining intuition and practice.

6. Aug 7, 2013

### Stephen Tashi

I'd say it's not a problem that you can't prove important theorems "off the top of your head". Many people whose focus is applied math never learn to be fluent with proofs, especially "epsilon-delta" proofs. If your long-term goal is to go into pure math (or the highly rigorous approaches to applying math to finance), then you should definitely try to understand any proofs that are given or assigned in the text.

7. Aug 9, 2013

### CuriousBanker

My goal is mostly just learning for its own sake. I try to follow the proofs in the text, but I can't do them alone yet.

I have a question regarding limits; I'll ask it here so I don't have to make a whole new thread.

For the function 2/1-x, the limit as x approaches 1....is the answer infinity, or undefined? Because at x=1 it's undefined but as x approaches 1 but doesn't actually touch it, the function goes towards infinity.

8. Aug 9, 2013

### CuriousBanker

Is there a way to prove that the limit of 1/x as x goes to infinity is zero? It's obvious, but is there a proof?

9. Aug 9, 2013

### Mandelbroth

I'm assuming you mean $\displaystyle \lim_{x\rightarrow 1}\frac{2}{1-x}$. What you wrote was $\displaystyle \lim_{x\rightarrow 1}\frac{2}{1}-x$, which is most definitely 1. :tongue:

The limit is "undefined," though I prefer to say that it doesn't exist. As $x$ approaches 1, $\frac{2}{1-x}$ does increase or decrease without bound, depending on the side (id est, it "goes to infinity"). However, we define the limit using something called the "epsilon-delta" definition.

In really concise math-speak, we say that $\displaystyle \lim_{x\rightarrow a}f(x)=L$ if and only if $$\forall\varepsilon>0~\exists \delta>0:\forall x (0<|x-a|<\delta\implies |f(x)-L|<\varepsilon).$$ What this says, intuitively, is that we can make the distance from $f(x)$ to $L$ arbitrarily small. In this case, there is no such $L$ that satisfies our definition of the limit.

Last edited: Aug 9, 2013
10. Aug 9, 2013

### micromass

Staff Emeritus
The limit here can never be infinity.

There is a simple reason: $x$ can approach $1$ from two sides. If we approach from the left side, that is, if $x<1$, then $1-x>0$, so $\frac{2}{1-x}$ is always positive. In fact, the left-side limit is $+\infty$.

But if $x>1$, then $1-x<0$, so $\frac{2}{1-x}$ is always negative, and the right-side limit is $-\infty$.

Thus the left-side and right-side limits are not equal, and so the limit is undefined.

If you had an example like $\frac{2}{(1-x)^2}$, then the limit would be $+\infty$, because both the left-side and the right-side limits are $+\infty$.
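The two functions behave very differently near $x = 1$; a quick numerical sketch (my own illustration, in Python) makes the one-sided behavior visible:

```python
# Evaluate 2/(1-x) and 2/(1-x)**2 at points approaching x = 1 from each side.

def f(x):
    return 2 / (1 - x)

def g(x):
    return 2 / (1 - x) ** 2

for h in (0.1, 0.01, 0.001):
    left, right = 1 - h, 1 + h
    # f(x) blows up with opposite signs: positive as x -> 1 from the left
    # (since 1 - x > 0 there), negative as x -> 1 from the right (1 - x < 0).
    print(f(left), f(right))
    # g(x) blows up to +infinity from both sides, since (1 - x)**2 > 0.
    print(g(left), g(right))
```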

11. Aug 9, 2013

### micromass

Staff Emeritus
Have you ever learned $\epsilon$-$\delta$ proofs?
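To give you the flavor, here is a standard sketch of such a proof for $\lim_{x\rightarrow\infty} \frac{1}{x} = 0$. Let $\varepsilon > 0$ be given, and choose $N = \frac{1}{\varepsilon}$. Then for every $x > N$,
$$\left|\frac{1}{x} - 0\right| = \frac{1}{x} < \frac{1}{N} = \varepsilon,$$
which is exactly what the definition of a limit at infinity requires.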

12. Aug 9, 2013

### lurflurf

First exposure? You said it was your second exposure. If you are interested in theory, you should include as much of it as is practical: aim to understand each concept as completely as possible in a reasonable amount of time.

Pick two books, one a bit easier and one a bit harder, and read the corresponding sections in each. If you understand at the higher level, great. If not, you have an idea of what you lack, and you can come back to it later. If you demand high-level understanding of every topic, progress will likely be frustrating, slow, and inefficient. Often something that seems hard to understand today will be easy later, when you are ready for it. At the same time, it is frustrating and confusing to accept a vague explanation when you are primed for a clear one.

At the very least, this approach keeps you aware of where you are. Some of the easier books make it unclear whether what they omit is actually hard. Some of the omitted material is confusing, some is simple, and some will help you understand better. I see no reason not to at least glance at all the proofs and examples for perspective; trying to actually understand everything at once is not recommended.

13. Aug 10, 2013

### CuriousBanker

Yeah, and after I posted that question, I realized that was the obvious answer...doh!

You're right, but my first exposure I didn't care about learning, I just cared about passing the tests, so I don't count that. Plus it was calc 1 for business students...basically a joke

Sounds reasonable
So as x approaches 1, the limit is positive infinity then, even though at x = 1 it's undefined, right?

14. Aug 10, 2013

### micromass

Staff Emeritus
If you're talking about the $\frac{2}{(1-x)^2}$ example, then the limit as $x$ approaches $1$ is $+\infty$. And it is indeed undefined at $1$.

15. Aug 12, 2013

### CuriousBanker

How do we know that the limit is positive infinity, though? Using epsilon-delta we can only keep getting larger and larger numbers... is there a way to "prove" that it is infinity, or is it just obvious?

16. Aug 12, 2013

### Stephen Tashi

There is a special definition for a limit being "plus infinity". It is not the same as the definition of $\lim_{x \rightarrow a} f(x) = L$ where $L$ is a number. Before you can understand the proof, you need to understand the definition of what is to be proved.
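For reference, that special definition reads:
$$\lim_{x\rightarrow a}f(x)=+\infty \iff \forall M>0~\exists \delta>0:\forall x (0<|x-a|<\delta\implies f(x)>M).$$
In words: no matter how large a bound $M$ you pick, $f(x)$ exceeds $M$ for all $x$ close enough to $a$. The proof for $\frac{2}{(1-x)^2}$ then amounts to producing a suitable $\delta$ for each $M$.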