Valour549
I understand that 0.9999... = 1 is true because, in limit theory, "getting arbitrarily close" means the two quantities are actually equal. I'm also aware of the epsilon-delta definition of a limit. But I feel there is an inconsistency in this concept, which I'll explain using two scenarios.
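For concreteness, the 0.9999... = 1 claim I'm referring to is the usual limit of partial sums (a sketch of the standard argument):

```latex
0.\overline{9} \;=\; \lim_{n\to\infty}\sum_{k=1}^{n}\frac{9}{10^{k}}
\;=\; \lim_{n\to\infty}\bigl(1-10^{-n}\bigr) \;=\; 1
```

since for every ε > 0 there is an N with 10^(-N) < ε, so the partial sums are eventually within ε of 1.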
Case 1: lim(X → 0) X = 0
When X gets arbitrarily close to zero, the function (which here is just X itself) equals zero.
Case 2: lim(h → 0) [(X+h) - X] / h = h / h = 1
Here we argue that the derivative of the function X is 1, because even though h gets arbitrarily close to zero, it is never equal to zero. So h / h is defined and equal to 1.
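Written out in full, the computation I have in mind for Case 2 is:

```latex
\frac{d}{dX}\,X
\;=\;\lim_{h\to 0}\frac{(X+h)-X}{h}
\;=\;\lim_{h\to 0}\frac{h}{h}
\;=\;\lim_{h\to 0}1
\;=\;1
```

where the cancellation h / h = 1 is performed while h is still a nonzero number, before the limit is taken.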
***
This seems inconsistent: to stay in line with the idea that h "getting arbitrarily close" to zero means h actually equals zero, shouldn't h / h really just be 0 / 0, and hence undefined?
Furthermore, it is also commonly known that, in the topic of limits, 0 / 0 is one of the indeterminate forms, so why is this derivative even technically solvable?
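(To illustrate what "indeterminate" means here: limits of the 0 / 0 form can come out to different values depending on the functions involved, e.g.

```latex
\lim_{h\to 0}\frac{h}{h}=1,\qquad
\lim_{h\to 0}\frac{2h}{h}=2,\qquad
\lim_{h\to 0}\frac{h^{2}}{h}=0
```

which is exactly why the form alone doesn't determine the answer.)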