
## Main Question or Discussion Point

I understand that 0.9999… = 1 is true because in limit theory "getting arbitrarily close" means that they actually *are* equal. I'm also aware of the epsilon-delta definition of limits. But I feel like there are some inconsistencies in this concept; I'll explain using two scenarios.

__Case 1__: lim_{X → 0} X = 0

When X gets arbitrarily close to zero... the function (which is actually just X as well) *is equal to* zero.

__Case 2__: lim_{h → 0} [(X + h) - X] / h = h / h = 1

Here we argue that the derivative of the function X is 1. Because even though h is arbitrarily close to zero, it *isn't equal to* zero. So h / h is defined and is equal to 1.

***

This seems inconsistent; in order to keep in line with the concept that h "getting arbitrarily close" to zero means that h actually *is equal* to zero, shouldn't h / h really just be 0 / 0 and hence undefined?

Furthermore, it is also commonly known that in the topic of limits, 0 / 0 is considered one of the *indeterminate forms*, so why is this derivative even technically solvable?
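One way to unpack the closing question, sketched in the epsilon-delta terms mentioned above (my own working, not part of the original post): "indeterminate form" only means a limit cannot be read off from the shape 0 / 0 alone; it does not mean the limit fails to exist. Here the quotient simplifies *before* the limit is taken, because the definition of a limit only ever examines h ≠ 0:

```latex
% For f(X) = X the difference quotient simplifies at every nonzero step:
\frac{(X + h) - X}{h} = \frac{h}{h} = 1 \qquad \text{for all } h \neq 0.

% Epsilon-delta check: given any \epsilon > 0, any \delta > 0 works, since
% 0 < |h| < \delta \;\implies\; \left| \frac{(X + h) - X}{h} - 1 \right| = 0 < \epsilon,
% and therefore
\lim_{h \to 0} \frac{(X + h) - X}{h} = 1.
```

Note that the condition 0 < |h| in the definition is exactly what makes Case 2 consistent with Case 1: the limit never evaluates the quotient at h = 0, so 0 / 0 is never actually formed.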
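A quick numerical sketch of Case 2 (my own illustration; the function names and step sizes are assumptions, not from the post): for f(x) = x the difference quotient is exactly 1 at every nonzero step h, however small, so the limit as h → 0 is 1 even though substituting h = 0 directly would produce 0 / 0.

```python
def difference_quotient(f, x, h):
    """Forward difference quotient of f at x with nonzero step h."""
    return (f(x + h) - f(x)) / h

f = lambda x: x  # the identity function discussed in Case 2

# Power-of-two steps are exactly representable in binary floating point,
# so the quotient comes out as exactly 1.0 with no rounding noise.
for h in (0.5, 2**-20, 2**-40):
    q = difference_quotient(f, 3.0, h)
    print(f"h = {h}: quotient = {q}")  # prints 1.0 for every step
```

Shrinking h never changes the value: the quotient is identically 1 on h ≠ 0, which is why the limit is 1.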