What is the logarithm of the derivative operator?

Anixx
I found this article, which claims to have found the logarithm of the derivative operator and even gives a formula for it.

But I tried to verify the result by exponentiating it and failed.

Additionally, folks on Stackexchange pointed out that the limit (6) in the article is evaluated incorrectly (it does not exist).

I asked this question on Mathoverflow and received an answer based on the paper, but other folks say the answer is incorrect (for instance, the integral in the second term has no principal value).

So, my question is: can we really find the logarithm of the derivative operator? There is also another related question.
 
Where do you find such articles, aka nonsense?

A differential operator turns a multiplication into something linear via the Leibniz rule. Now they consider the logarithm of something linear, but the logarithm does much the same as the differential operator: it turns a multiplication into an addition. Applying the one to the other is completely unnatural, which shows in the facts that a) this paper seems to be unpublished, b) it treats only trivial examples, and c) it is an analytical kind of numerology.

The answer to your question is no.
 
While I think the article is indeed incorrect, I do not see anything wrong with taking elementary functions of the derivative operator. There is no problem finding the sine, cosine, tangent, arctangent, or exponential of the derivative. ##\ln (D+1)## can also be found quickly. An interesting question, though, is what these functions mean.

For instance,

$$(\sinh D)f(x)=\frac{f(x+1)-f(x-1)}2$$
$$(\cosh D)f(x)=\frac{f(x+1)+f(x-1)}2$$
$$\frac1{e^D-1}f(x)=\sum_{k=0}^{x-1} f(k)$$

etc.
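
As a quick sanity check of the first formula (a minimal sketch assuming a polynomial input, here ##x^4##, for which the operator series terminates), one can apply the power series ##\sinh D=D+\frac{D^3}{3!}+\frac{D^5}{5!}+\dots## term by term in Mathematica and compare with the central difference:
Code:
f[x_] := x^4;   (* any polynomial: high-order derivatives vanish, so the series is finite *)
sinhDf = Sum[D[f[x], {x, 2 k + 1}]/(2 k + 1)!, {k, 0, 4}];   (* termwise application of sinh(D) *)
Simplify[sinhDf == (f[x + 1] - f[x - 1])/2]   (* returns True *)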
 
You can write as many formulas as you like; that does not create meaning. The logarithm of a derivation is
$$
\log(d(f\cdot g))=\log(f\cdot dg+df \cdot g)
$$
and that is where it all ends. There is no way to make sense of ##\log(f+g)## other than as ##\log(f+g)##. Playing around is immature.
 
I have obtained the same result as in the article, starting from first principles, using this Mathematica code:
Code:
(* solve f'[x] + s f[x] == x^n with f[0] == 0, for n = 0, ..., 5 *)
Table[DSolveValue[{f'[x] + s f[x] == x^n, f[0] == 0}, f[x], x] // FullSimplify, {n, 0, 5}]
(* integrate each solution with respect to the parameter s *)
Integrate[%, s] // FullSimplify
(* take the one-sided limit s -> 0 from above *)
Limit[%, s -> 0, Direction -> "FromAbove"] // FullSimplify
(* extract the coefficients of x^k for k = 0, ..., 5 *)
Table[Coefficient[%, x, k], {k, 0, 5}] // Expand // MatrixForm
 
I believe the OP is confusing the “logarithm of a derivative” with the logarithm of the differentiation operator. The formulas in post #3 are obtained by exponentiating the differentiation operator as a formal series, which can be applied termwise. The result in the article can probably be recovered via the same means.

Exponentiating the result does not give you the derivative; instead, you would have to invert ##\log(D)##, although it may not be possible to do this uniquely.
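
To make the termwise procedure concrete, here is a minimal Mathematica sketch (assuming a polynomial input, chosen here as ##x^4##, so that every series terminates, and using the unproblematic ##\ln(D+1)## from post #3 rather than ##\log D## itself): exponentiating the termwise-applied logarithm recovers the expected operator, in this case ##1+D##.
Code:
f[x_] := x^4;   (* any polynomial will do: high derivatives vanish, so all series below are finite *)
(* ln(1 + D) g = g' - g''/2 + g'''/3 - ..., applied term by term *)
logOp[g_] := Sum[(-1)^(k + 1)/k D[g, {x, k}], {k, 1, 6}];
(* exp of the operator ln(1 + D): sum of nested applications divided by k! *)
expLogOp[g_] := Sum[Nest[logOp, g, k]/k!, {k, 0, 6}];
Simplify[expLogOp[f[x]] == f[x] + D[f[x], x]]   (* returns True: exp(ln(1 + D)) acts as 1 + D *)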
 
Anixx said:
$$(\sinh D)f(x)=\frac{f(x+1)-f(x-1)}2$$
$$(\cosh D)f(x)=\frac{f(x+1)+f(x-1)}2$$
$$\frac1{e^D-1}f(x)=\sum_{k=0}^{x-1} f(k)$$

I don't get these at all. The derivative is a local operator. I could pick a function f which is everywhere analytic on ##\mathbb{R}##, such that f and all its derivatives are 0 at 0, but f(1) and f(-1) are arbitrary numbers. So how can those first two formulas be true?
 
Expanding the expression ##e^{hD}f(x)## termwise results in the sum ##e^{hD}f(x)=\sum_{k=0}^\infty\frac{f^{(k)}(x)}{k!}h^k.## One can see that this is exactly the Taylor series of ##f## about the point ##x##, evaluated at ##x+h##: $$\begin{align}f(x+h)&=\sum_{k=0}^\infty\frac{f^{(k)}(x)}{k!}\bigl((x+h)-x\bigr)^k \\ &=\sum_{k=0}^\infty\frac{f^{(k)}(x)}{k!}h^k.\end{align}$$ Convergence generally depends on the choices of ##x## and ##h##, though, unless the function has a globally convergent Taylor series. To get results that make sense, one can restrict to a space of functions on which ##D## is bounded.

@Office_Shredder, in the case you mentioned, any function that is analytic on all of ##\mathbb{R}## and whose derivatives are all 0 at 0 is a constant function, by the identity theorem for real-analytic functions. So the equations hold in this case.

The OP mentioned the equation ##\frac1{e^D-1}f(x)=\sum_{k=0}^{x-1} f(k)##, but I think this only holds under more stringent conditions.
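
As a minimal consistency check of that last identity (a sketch assuming only a polynomial input, here ##f(k)=k^2##): since ##e^D-1## acts as the forward difference ##F(x)\mapsto F(x+1)-F(x)##, applying it to the claimed sum should return ##f(x)##.
Code:
(* closed form of the claimed value of 1/(e^D - 1) applied to f(k) = k^2 *)
s[x_] = Sum[k^2, {k, 0, x - 1}];
(* applying e^D - 1, i.e. the forward difference, should give back x^2 *)
Simplify[s[x + 1] - s[x] == x^2]   (* returns True *)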
 
The derivative is an operator, not a function.
$$e^{aD}f(x)=f(x+a)$$
The formulas above follow from this.
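For example, writing ##\sinh D=\tfrac12\left(e^{D}-e^{-D}\right)## and applying the shift identity with ##a=\pm 1## gives
$$(\sinh D)f(x)=\frac{e^{D}f(x)-e^{-D}f(x)}{2}=\frac{f(x+1)-f(x-1)}{2},$$
and similarly for ##\cosh D##.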
 
suremarc said:

@Office_Shredder, in the case you mentioned, any function that is analytic on all of ##\mathbb{R}## and whose derivatives are all 0 at 0 is a constant function, by the identity theorem for real-analytic functions. So the equations hold in this case.

You are right, I misspoke. The function is everywhere infinitely differentiable, but obviously not analytic. The rest of the confusion in that post still holds.
 