- #1
bobloblaw
Hi Physics Forumers! I was listening to the Feynman lectures and something Feynman said got me thinking. He was talking about the indeterminacy that exists in classical physics due to our uncertainty in the initial conditions:
Speaking more precisely, given an arbitrary accuracy, no matter how precise, one can find a time long enough that we cannot make predictions valid for that long a time. Now the point is that this length of time is not very large. The time goes, in fact, logarithmically with the error, and it turns out that in only a very, very tiny time we lose all our information.
I was wondering how he derived this. So I thought I would turn this into a challenge to the people on physics forums: Derive (or refute) what Feynman says in this quote! Make any assumptions you need to and use whatever level of physics you know. I'll post what I came up with after some people (hopefully) post their answer.
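Not a full derivation, but here is a minimal numerical sketch of the behavior Feynman describes, using the logistic map as a stand-in chaotic system (this choice, and the tolerance and seed values, are my assumptions, not anything from the lecture). In a chaotic system an initial error grows roughly like e^(λn), so the time until the error reaches some fixed tolerance scales like ln(tol/eps)/λ, i.e. logarithmically in the initial precision:

```python
# Sketch: prediction horizon vs. initial error for the logistic map
# x_{n+1} = 4 x (1 - x), a standard chaotic map whose Lyapunov
# exponent is ln 2, so errors roughly double each iteration.

def logistic(x):
    return 4.0 * x * (1.0 - x)

def prediction_horizon(x0, eps, tol=0.1, max_steps=10_000):
    """Steps until two trajectories starting eps apart diverge by tol."""
    a, b = x0, x0 + eps
    for n in range(max_steps):
        if abs(a - b) > tol:
            return n
        a, b = logistic(a), logistic(b)
    return max_steps

if __name__ == "__main__":
    x0 = 0.2  # arbitrary starting point in (0, 1)
    for eps in (1e-4, 1e-8, 1e-12):
        print(f"eps = {eps:g}: horizon ~ {prediction_horizon(x0, eps)} steps")
```

Making the initial error 10,000 times smaller only adds roughly ln(10^4)/ln 2 ≈ 13 iterations to the horizon, which is exactly the "logarithmic in the error" scaling from the quote. (Whether this is how Feynman arrived at it, I don't know.)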