Nov18-04, 01:47 PM
Well, in order to understand what dx and d/dx are, you really have to understand the concepts of calculus.

The crucial thing that calculus is based on is the concept of a limit. The two basic operations in calculus are differentiation and integration, each of which is defined in terms of a limit.

Take for example, the function (*If you don't understand what a function is, then just replace f(x) with y, for now*):

f(x) = [tex]x^2 - 4x + 5[/tex]

At x = 2, f(2) = 1. In calculus a completely different consideration is taken into account: we are more concerned with what number f(x) = [tex]x^2 - 4x + 5[/tex] gets closer and closer to as x gets closer and closer to 2.

When x is 1.9 or 2.1, f(x) = 1.01
When x is 1.99 or 2.01, f(x) = 1.0001
When x is 1.999 or 2.001, f(x) = 1.000001
and so on
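If you want, you can check that table yourself. Here is a minimal Python sketch (f is just the function above, nothing more):

```python
# Numerically watching f(x) = x^2 - 4x + 5 approach its limit as x -> 2.

def f(x):
    return x**2 - 4*x + 5

for h in (0.1, 0.01, 0.001):
    # evaluate just below and just above x = 2
    print(h, f(2 - h), f(2 + h))  # both values creep toward 1 as h shrinks
```

Each step divides the distance to 2 by ten, and the output of f gets squeezed toward 1 from both sides.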

So we can say the limit of f(x) = [tex]x^2 - 4x + 5[/tex] as x approaches 2 is 1.

Or as x approaches 2, f(x) approaches 1

Now what the Hell does this have to do with d/dx?

A derivative (d/dx) of a function is its instantaneous rate of change (or slope). If I could draw you a graph of some curve, then you would be able to draw a line called a tangent. A line is tangent to a curve if it touches the curve at exactly one point (at least locally).

But how can you draw a line through just one point? Well, let's say you have two points, but they were really, really, really close to each other. The distance between them (dx) would not be exactly 0, but approaching 0, and so they could be considered one point.
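To make that concrete, here is a small numeric sketch using the same f(x) as above, at x = 3 (chosen because the slope there works out to a nice value, 2): the slope between two nearby points settles on a definite number as the second point slides toward the first.

```python
# Slope of the line through (3, f(3)) and a nearby point (3 + dx, f(3 + dx)),
# for f(x) = x^2 - 4x + 5. As dx shrinks, the slope approaches 2.

def f(x):
    return x**2 - 4*x + 5

for dx in (0.1, 0.01, 0.001):
    slope = (f(3 + dx) - f(3)) / dx
    print(dx, slope)  # slopes get closer and closer to 2
```

The two points never actually coincide (that would mean dividing by 0); the slope of the tangent is the number the secant slopes approach.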

There is an equation for the limit of this, which is the definition of a derivative but I won't really bother with it.
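In case anyone is curious anyway, the limit in question (the standard textbook definition of the derivative, written with the same dx notation as in this post) looks like this:

[tex]f'(x) = \lim_{dx \to 0} \frac{f(x + dx) - f(x)}{dx}[/tex]

In words: the slope between x and a nearby point x + dx, as dx approaches 0.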

Here is where you got all confused: there are many different notations for the derivative of a function.

For example f'(x) and d/dx.

dy/dx (or d/dx) is not read as d divided by dx. dy is an approximation of the change in y (or, with just d, of whatever variable follows) when dx, the change in x, is really, really small. Which is why it follows that squaring dx wouldn't make any difference: if dx = 0.000000000000000000000000000000000000000000000...0001 and you squared that number, it would still be pretty much 0.
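You can see this "squaring doesn't matter" effect with a tiny sketch. For f(x) = x², a little algebra gives the exact change f(x+dx) - f(x) = 2x·dx + dx², and the dx² piece shrinks much faster than dx itself:

```python
# Why the (dx)^2 term stops mattering as dx shrinks, using f(x) = x^2 at x = 3.
# Exact change: f(x + dx) - f(x) = 2*x*dx + dx**2.

x = 3.0
for dx in (0.1, 0.001, 0.00001):
    exact_change = (x + dx)**2 - x**2
    linear_part = 2*x*dx          # the "dy" approximation
    print(dx, exact_change, linear_part, dx**2)
# the dx**2 column shrinks much faster than dx, so dy = 2*x*dx
# becomes an excellent approximation of the exact change
```

When dx goes from 0.1 to 0.001, dx² drops from 0.01 to 0.000001: relative to dx, the squared term has effectively vanished, which is exactly the intuition in the paragraph above.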

I hope that helps. The concepts of calculus can be confusing, as calculus is always dealing with the infinitely large and the infinitely small; and, well, that can be infinitely frustrating at times.