Feynman's Calculus


by murshid_islam
Tags: calculus, feynman
lurflurf
#19
Jul7-05, 06:23 PM
HW Helper
P: 2,168
Quote by quetzalcoatl9
not to nitpick, but shouldn't it be [tex]\frac{d}{dt}[/tex] ?
Yes, it was a typo.
[tex]\frac{d}{dt}F(t)=F'(t)=-2F(t)[/tex]
while
[tex]\frac{d}{dx}F(t)=0[/tex]
saltydog
#20
Jul8-05, 07:11 AM
Sci Advisor
HW Helper
P: 1,593
Quote by lurflurf
Here is an example
find [tex]\int_0^\infty\exp(-x^2-1/x^2) dx[/tex]
let [tex]F(t)= \int_0^\infty\exp(-x^2-t^2/x^2) dx[/tex]
now we want F(1)
F'(t)=-2F(t) so
You know what, I just don't see that. Can someone help me? When I take the derivative I get:

[tex]\frac{d}{dt}\int_0^{\infty}\exp(-x^2-t^2/x^2) dx=-2t\int_0^{\infty}\frac{1}{x^2}e^{-(x^2+\frac{t^2}{x^2})}dx[/tex]

and thus I don't see how the derivative is -2F(t).
lurflurf
#21
Jul8-05, 07:33 AM
HW Helper
P: 2,168
Quote by saltydog
You know what, I just don't see that. Can someone help me? When I take the derivative I get:

[tex]\frac{d}{dt}\int_0^{\infty}\exp(-x^2-t^2/x^2) dx=-2t\int_0^{\infty}\frac{1}{x^2}e^{-(x^2+\frac{t^2}{x^2})}dx[/tex]
and thus I don't see how the derivative is -2F(t).
That step (F'(t) = -2F(t)) was a little sudden, so upon request I added a few intermediate steps a few posts up. You can see the equality more easily if a substitution like u = t/x is made.
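Spelled out, with t > 0 (note that the integration limits flip under u = t/x, which is where the signs sort themselves out):

[tex]F'(t)=\int_0^\infty\left(-\frac{2t}{x^2}\right)e^{-x^2-t^2/x^2}\,dx=-2\int_{\infty}^{0}e^{-t^2/u^2-u^2}\,(-du)=-2\int_0^\infty e^{-u^2-t^2/u^2}\,du=-2F(t)[/tex]

since with u = t/x one has (t/x^2) dx = -du and x^2 + t^2/x^2 = u^2 + t^2/u^2.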
saltydog
#22
Jul8-05, 07:48 AM
Sci Advisor
HW Helper
P: 1,593
Quote by lurflurf
First thing, you have a sign error in your derivative: +t^2/x^2 should have a minus sign.
That step (F'(t) = -2F(t)) was a little sudden, so upon request I added a few intermediate steps a few posts up. You can see that equality more easily if a substitution like u = t/x is made.
Very good, lurflurf. I see it now. Also, I took the minus sign outside the parentheses, so I think I was OK with that.
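To tie the example off: solving F'(t) = -2F(t) with F(0) = sqrt(pi)/2 (the Gaussian integral) gives F(t) = (sqrt(pi)/2) e^{-2t}, hence F(1) = (sqrt(pi)/2) e^{-2} ≈ 0.1199. A quick numerical sanity check, as a rough Python/scipy sketch:

[code]
# Rough numerical sanity check: F'(t) = -2 F(t) with F(0) = sqrt(pi)/2
# gives F(1) = (sqrt(pi)/2) * exp(-2).
import numpy as np
from scipy.integrate import quad

closed_form = np.sqrt(np.pi) / 2 * np.exp(-2)

# integrand of F(1); guard x = 0, where the limit of exp(-1/x^2) is 0 anyway
integrand = lambda x: np.exp(-x**2 - 1/x**2) if x > 0 else 0.0
numeric, _ = quad(integrand, 0, np.inf)

print(closed_form, numeric)   # both ~0.11994
[/code]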
Castilla
#23
Jul10-05, 05:10 PM
P: 240
For the change of variable I am using these functions:
u = u(x) = t/x (t fixed)
p(u) = exp(-t^2/u^2 - u^2)
and I obtained the result, but with 2 instead of -2.
Castilla
#24
Jul10-05, 05:11 PM
P: 240
Sorry for that post.

Castilla.
nanoWatt
#25
Jan8-08, 07:54 AM
P: 89
So if one integrates from -infinity to infinity, can you always change the range (or whatever it's called) of integration to 0 to infinity and multiply the remaining integral by 2? Or is it a property of the sine that makes it symmetric, so that the negative side doesn't affect it? In other words, why doesn't a -2 come out?

Quote by lurflurf
If you mean using differentiation with respect to a parameter (i.e. under the integral sign), it can be done like this:
[tex]\int_{-\infty}^\infty \frac{\sin(x)}{x}\,dx = 2\int_{0}^\infty \frac{\sin(x)}{x}\,dx[/tex]
How does this exp(i z) work? Is it like i to the z power?

I may have seen some exp functions with three variables like exp (x y z) or so. Does that mean anything?

Quote by lurflurf
Take f=exp(i z)/z
Mute
#26
Jan8-08, 01:14 PM
HW Helper
P: 1,391
Quote by nanoWatt
So if one integrates from -infinity to infinity, can you always change the range (or whatever it's called) of integration to 0 to infinity, and multiply the remaining integral by 2? Or is that a property of the sin that makes it symmetrical to where the negative side doesn't affect it? In other words, why doesn't a -2 come out?
The property you described depends on whether a function is even or odd. An even function is a function such that f(-x) = f(x), while an odd function is one such that g(-x) = -g(x).

So, for example, cosine is an even function, since cos(-x) = cos(x), while sine is an odd function, since sin(-x) = -sin(x). The exponential function is neither odd nor even, as e^(-x) does not equal either e^(x) (for any arbitrary value of x) or -e^(x).

Because an even function looks the same on the half-line x > 0 as it does on the half-line x < 0, if you have symmetric limits about x = 0, then integrating an even function from -L to L is just like integrating it from 0 to L twice. This is only true for even functions. For an odd function, the contribution from the negative half-line cancels the contribution from the positive half-line, so the result is zero.

To summarize:

[tex]\int_{-L}^{L} dx f(x) = 2 \int_{0}^{L} dx f(x)~\mbox{if f(x) is even}[/tex]
[tex]\int_{-L}^{L} dx f(x) = 0~\mbox{if f(x) is odd}[/tex]
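For concreteness, a quick numerical illustration of these two rules, as a rough Python/scipy sketch (cos and sin serve as the example even and odd functions):

[code]
# Even/odd symmetry of integrals over symmetric limits (rough sketch)
import numpy as np
from scipy.integrate import quad

L = 5.0

full_even, _ = quad(np.cos, -L, L)   # cos is even: cos(-x) = cos(x)
half_even, _ = quad(np.cos, 0, L)
print(full_even, 2 * half_even)      # agree: [-L, L] integral = 2 * [0, L] integral

full_odd, _ = quad(np.sin, -L, L)    # sin is odd: sin(-x) = -sin(x)
print(full_odd)                      # ~0: the two half-lines cancel
[/code]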



How does this exp(i z) work? Is it like i to the z power?

I may have seen some exp functions with three variables like exp (x y z) or so. Does that mean anything?

Have you ever heard of the imaginary unit i? It is the number defined such that [itex]i^2 = -1[/itex]. With this number one can define the complex exponential function, which has the property that

[tex]e^{ix} = \cos x + i \sin x[/tex]

Since exponentials typically have nicer properties than sines or cosines, by considering the integral of exp(ix)/x you might be able to get the integral of sin(x)/x in an easier fashion (and you'll probably also get the integral of cos(x)/x out of it too, if that happens to be finite, which I don't think it is).
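As a rough numerical check that the sine integral quoted earlier really comes out to pi, here is a Python/scipy sketch (the split at x = 1 and the oscillatory-weight rule are just one convenient way to handle the infinite tail):

[code]
# Rough check that the integral of sin(x)/x over the whole real line is pi
import numpy as np
from scipy.integrate import quad

# near the origin sin(x)/x is smooth (its limit at 0 is 1)
near, _ = quad(lambda x: np.sin(x)/x if x != 0 else 1.0, 0, 1)
# on [1, inf) use quad's Fourier weight to handle the oscillatory tail
tail, _ = quad(lambda x: 1.0/x, 1, np.inf, weight='sin', wvar=1.0)

half = near + tail        # integral over [0, inf) = pi/2
print(2 * half, np.pi)    # doubling works because sin(x)/x is even
[/code]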
nanoWatt
#27
Jan8-08, 01:34 PM
P: 89
Thanks for the breakdown.

So how would you expand exp (i z)?
Is that the same as [tex] e^{iz} [/tex] ?
Mute
#28
Jan8-08, 07:20 PM
HW Helper
P: 1,391
Quote by nanoWatt
Thanks for the breakdown.

So how would you expand exp (i z)?
Is that the same as [tex] e^{iz} [/tex] ?
exp(z) is just another notation for e^z.

If z is a complex number z = x + iy (where x and y are real), then

e^{z} = e^{x}e^{iy} = e^{x}(cos(y) + i sin(y))

And e^{i z} = e^{-y + ix} = e^{-y}(cos(x) + i sin(x))
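A quick check of that expansion with Python's built-in complex math (the particular z is an arbitrary example):

[code]
# Check e^{iz} = e^{-y} (cos x + i sin x) for z = x + iy (rough sketch)
import cmath

z = 2.0 + 3.0j                 # x = 2, y = 3, chosen arbitrarily
x, y = z.real, z.imag

lhs = cmath.exp(1j * z)
rhs = cmath.exp(-y) * (cmath.cos(x) + 1j * cmath.sin(x))
print(lhs, rhs)                # agree up to rounding error
[/code]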
swensonj
#29
Mar11-08, 10:14 PM
P: 1
Quote by HallsofIvy
Differentiating under the integral: Leibniz's rule- [tex]\frac{d}{dx}\int_{a(x)}^{b(x)} f(x,t)dt= \int_{a(x)}^{b(x)}\frac{\partial f(x,t)}{\partial x}dt+ \frac{da(x)}{dx}f(x,a(x))- \frac{db(x)}{dx}f(x,b(x))[/tex].
...
It seems to me that the signs on the last two terms of the right-hand side are reversed; should the formula be
[tex]\frac{d}{dx}\int_{a(x)}^{b(x)} f(x,t)dt= \left(\int_{a(x)}^{b(x)}\frac{\partial f(x,t)}{\partial x}dt\right)+ \frac{db(x)}{dx}f(x,b(x))- \frac{da(x)}{dx}f(x,a(x))[/tex]
?

I guess so, but I haven't figured out how to prove this formula yet....
Hurkyl
#30
Mar11-08, 10:39 PM
Emeritus
Sci Advisor
PF Gold
P: 16,101
Quote by swensonj
I guess so, but I haven't figured out how to prove this formula yet....
It's actually pretty straightforward; e.g. you might simply apply the limit formula for the derivative, and then split and rearrange the resulting expression into pieces that can be approximated well.

It's one of those things that, if you understand the ideas behind using limits and approximations, will be very straightforward. And if you don't find it straightforward, then it's really worth studying as an exercise.
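For a numerical spot-check of the corrected formula from the previous post, here is a rough Python/scipy sketch (f, a, and b are arbitrary example choices):

[code]
# Spot-check Leibniz's rule with f(x,t) = sin(x t), a(x) = x^2, b(x) = x^3
import numpy as np
from scipy.integrate import quad

f  = lambda x, t: np.sin(x * t)
fx = lambda x, t: t * np.cos(x * t)          # partial derivative of f in x
a, da = (lambda x: x**2), (lambda x: 2 * x)
b, db = (lambda x: x**3), (lambda x: 3 * x**2)

def I(x):                                    # the parameter-dependent integral
    return quad(lambda t: f(x, t), a(x), b(x))[0]

x0, h = 1.3, 1e-5
lhs = (I(x0 + h) - I(x0 - h)) / (2 * h)      # d/dx by central difference
rhs = (quad(lambda t: fx(x0, t), a(x0), b(x0))[0]
       + db(x0) * f(x0, b(x0)) - da(x0) * f(x0, a(x0)))
print(lhs, rhs)                              # agree to several digits
[/code]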

