A paradox inside Newtonian world

In summary: the calculation says the system will start to move to the left, yet the center of mass should not move, even when masses are removed.
  • #176
Hurkyl said:
The first axiom is that there are only finitely many particles.

In theorem 3 (which deals with center of mass), they remark that the assumption of finitely many particles is essential to their formalism.

If we used their formalism, then there is no paradox: your construction is illegal.


Yes, but so is continuum mechanics then...

That's the "problem": we use Newtonian mechanics regularly with an infinite amount of "mass points".

Of course, Newtonian mechanics limited to two mass points with non-zero total angular momentum is an entirely consistent axiomatic system. With one mass point also :tongue2:
 
  • #177
vanesch said:
Yes, but so is continuum mechanics then...
Right; that's why that paper is an axiomatic foundation for particle mechanics, and not for continuum mechanics. :wink: I actually would really like to see a more general axiomatic foundation; it's just that this was all I could find.

That's the "problem": we use Newtonian mechanics regularly with an infinite amount of "mass points".
Actually, if you use the techniques of nonstandard analysis, then these axioms are adequate. (you only need hyperfinitely many particles to approximate your continuum)

e.g. In Tomaz's original scenario, NSA tells us that a particle of infinitesimal mass gets flung rightwards at transfinite speed, and that exactly makes up for the missing momentum. (That's why I was making a big deal about the behavior near the origin, because that's my best guess as to the standard analog.)
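
Very roughly, the momentum bookkeeping looks like this (just a sketch of the idea, where epsilon stands for some unspecified infinitesimal mass and H for some transfinite speed):

[tex]\underbrace{\epsilon H}_{\text{leftmost ball, rightwards}} \;+\; \underbrace{p_{\text{rest}}}_{\text{leftwards}} \;=\; 0[/tex]

An infinitesimal mass times a transfinite speed can be an appreciable (finite, nonzero) momentum, so nothing goes missing in the hyperfinite model.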
 
  • #178
Hurkyl said:
e.g. In Tomaz's original scenario NSA tells us that a particle of infinitessimal mass gets flung rightwards at transfinite speed, and that exactly makes up for the missing momentum. (That's why I was making a big deal about the behavior about the origin, because that's my best guess as to the standard analog)

Every ball (no matter how small) feels a much bigger pull to the left than to the right. It quickly escapes from the gravity of its right neighbor.

No rightward motion whatsoever!
 
  • #179
Tomaz Kristan said:
Every ball (no matter how small) feels a much bigger pull to the left than to the right. It quickly escapes from the gravity of its right neighbor.

No rightward motion whatsoever!
In the nonstandard model we'd use to analyze your scenario, there's a leftmost ball. :tongue:
 
  • #180
Hurkyl said:
In the nonstandard model we'd use to analyze your scenario, there's a leftmost ball. :tongue:

There is NO leftmost ball at all.
 
  • #181
Tomaz Kristan said:
There is NO leftmost ball at all.
But there is in the limiting case as the number of particles goes to infinity.
 
  • #182
A limit of what? Of the speed of the "leftmost" ball after a second? Of the force between the two "leftmost" balls?

Nothing like that exists.
 
  • #183
Tomaz Kristan said:
A limit of what? Of the speed of the "leftmost" ball after a second? Of the force between the two "leftmost" balls?

Nothing like that exists.
In the limit as the number of balls goes to infinity. As we successively increase the number of balls, at each stage there is a leftmost ball.
 
  • #184
ObsessiveMathsFreak said:
In the limit as the number of balls goes to infinity. As we successively increase the number of balls, at each stage there is a leftmost ball.

No, it doesn't go that way and you know that.
 
  • #185
There is NO leftmost ball at all.
There is, in the nonstandard model. It contains H balls, where H is a transfinite (hyper)integer. The H-th ball is the leftmost.
 
  • #186
This discussion seems to be an ideal example of an infinite process.
 
  • #187
What does transfinite mean? Boundless but not infinite?
 
  • #188
The standard natural numbers form an (external) subset of the hypernatural numbers. We (externally) define that a hypernatural number is transfinite if and only if it is larger than every natural number.

The word "transfinite" is used to distinguish it from the standard usage of "infinite", since, for example, a transfinite sum is something different than an infinite sum. (But their values are infinitessimally close, if the summand is well behaved)
 
  • #189
So basically transfinite numbers are numbers which are larger than any finite number but smaller than infinity?
 
  • #190
Gelsamel Epsilon said:
So basically transfinite numbers are numbers which are larger than any finite number but smaller than infinity?
That is an accurate statement.

(resisting urge to go into what is probably unnecessary detail)
 
  • #191
So you say that you are going to clean up the mess by adding some more infinite stuff?

Well, maybe, who knows, but currently those transfinite shadow balls are nowhere defined inside the Newtonian world. That still has to be done, if it is of any use at all.

For now, it's no solution.
 
  • #192
Hurkyl said:
That is an accurate statement.

(resisting urge to go into what is probably unnecessary detail)

Go into more detail if you really want to; I'm always keen on learning things.
 
  • #193
Please don't hijack my thread. Go elsewhere, unless someone can somehow solve the paradox I gave with those hypernaturals. Hypernumbers deserve a new topic; they are welcome here only if something becomes clearer by using them.
 
  • #194
Tomaz Kristan said:
No, it doesn't go that way and you know that.
It does go that way. That's what an infinite sum means.

When we write [tex]\sum_{n=0}^{\infty} a_n[/tex], what we mean is;

[tex]\lim_{k \to \infty } \sum_{n=0}^{k} a_n[/tex]

It's crystal clear. For each finite k, there is a leftmost ball and the center of mass remains fixed. In the limit as [tex]k \to \infty[/tex], the acceleration of the center of mass is zero. And that is all we can say without progressing to very esoteric arguments about things like hyperreal numbers, etc.
 
  • #195
Your argument is false, OMF.

If it were not, you could just as well prove that a biggest natural number exists, bigger than any other.

Well, it's a basic mistake on your side, trust me!
 
  • #196
ObsessiveMathsFreak said:
It does go that way. That's what an infinite sum means.

When we write [tex]\sum_{n=0}^{\infty} a_n[/tex], what we mean is;

[tex]\lim_{k \to \infty } \sum_{n=0}^{k} a_n[/tex]

It's crystal clear. For each finite k, there is a leftmost ball and the center of mass remains fixed. In the limit as [tex]k \to \infty[/tex], the acceleration of the center of mass is zero.

Yes, this is correct. That is because you approach the final situation through a sequence of situations, each with a finite number of balls, but with more and more of them. However, there's no guarantee that this "adding balls" procedure is the one and only correct limiting procedure.

The other approach, the one that leads to the paradox, is NOT to set up a sequence of situations with more and more balls, but to consider all balls at once and calculate the total force on each individual ball. If you do that, it turns out that the sign of the force on each individual ball, exerted by the entire set of all the other balls, is the same.
Adding the forces on all the balls together will then of course result in a net force with that same sign.
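
For concreteness, here is a rough numerical sketch of that second procedure (in Python; I'm assuming, purely as a stand-in for the construction, that ball n has mass 2^-n and sits at x_n = 4^-n; the exact numbers don't matter here, only the signs):

[code]
# Rough sketch (a stand-in construction, NOT necessarily the thread's exact one):
# ball n has mass 2^-n at position x_n = 4^-n, for n = 0..N-1, with G = 1.
# Compute the net gravitational force on each ball from all the other balls.

N = 12
m = [2.0 ** -n for n in range(N)]
x = [4.0 ** -n for n in range(N)]

def net_force(i):
    """Signed force on ball i (positive = rightwards, towards larger x)."""
    total = 0.0
    for j in range(N):
        if j != i:
            d = x[j] - x[i]
            total += m[i] * m[j] / d ** 2 * (1.0 if d > 0 else -1.0)
    return total

forces = [net_force(i) for i in range(N)]

# Newton's third law: over the whole finite truncation the forces cancel.
print(sum(forces))                        # ~ 0, up to rounding

# But every ball EXCEPT the leftmost one feels a net leftward force; only
# the leftmost ball (which has no left neighbours) restores the balance.
print(all(f < 0 for f in forces[:-1]))    # True
print(forces[-1] > 0)                     # True
[/code]

Let the truncation size N go to infinity and that single balancing ball "disappears": in the infinite configuration every per-ball force is leftward, so summing them ball by ball cannot give 0.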

There are similar situations where you cannot just consider a sequence of physical setups and take the limit of a quantity over this sequence as the value of that quantity in the limiting situation. Another example is this:

Consider a Euclidean space filled with a homogeneous, constant mass density. It turns out (by symmetry) that this mass density doesn't result in any (Newtonian) gravitational force on a test mass.
However, if you approach this situation by considering a sphere of radius R with homogeneous mass density, and 0 outside, then your test mass will undergo, for each value of R, a specific force towards the center of the sphere. If R > d (the distance between the test particle and the center of the sphere), then this force will not change anymore. So the force, as a function of R, first grows, and becomes constant from the moment R > d. Taking the limit R -> infinity gives you this constant force.

Nevertheless, the physical situation with R -> infinity is a space filled with a homogeneous mass density, where the force should be 0.
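
To make that concrete: for a test mass m at distance d from the center of a homogeneous ball of density rho and radius R > d, only the mass inside radius d pulls on it (the outer shells cancel), so

[tex]F = \frac{G m}{d^2}\,\rho\,\frac{4}{3}\pi d^3 = \frac{4}{3}\pi G \rho\, m\, d[/tex]

independently of R. The limit R -> infinity therefore keeps this nonzero value, while the fully symmetric infinite medium should give 0.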
 
  • #197
Tomaz Kristan said:
Your argument is false, OMF.

If it were not, you could just as well prove that a biggest natural number exists, bigger than any other.

Well, it's a basic mistake on your side, trust me!

I assure you, an infinite number of nonzero numbers sums to infinity. When we speak of "infinite" sums, we are in fact speaking about the limiting case as the number of terms in the sum increases without bound. That's what a "sum to infinity" really means. The limiting case.
 
  • #198
> I assure you, an infinite number of nonzero numbers sums to infinity.

1/2+1/4+1/8+ ... = 1

Don't you think so?
 
  • #199
Important point that's probably already been made: the sum of an infinite series is defined as the limit of the sum of its first n terms as n goes to infinity. We need this definition to avoid problems such as:

This series converges (conditionally)
[tex]
\sum_{n=1}^\infty \frac{(-1)^n}{n}
[/tex]

However, because the harmonic series 1/n diverges, we can rearrange the terms so that we get a series of the form

1/2 + 1/4 + 1/6 + 1/8 + ... (adding positive terms until we reach at least 3) - 1/1 + ... (adding until we reach at least 6) - 1/3 + ...
As you can see, this rearranged series diverges to infinity, even though it has the exact same elements as our convergent alternating series. The order we sum the elements in makes a HUGE difference!

Therefore, to calculate the value of your system, you HAVE to take the limit as n approaches infinity, which means you always consider the leftmost (nth) ball, your forces cancel, and the problem disappears.
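
If anyone wants to see that numerically, here's a quick sketch in Python (a greedy rearrangement of the very same terms, steered towards an arbitrary target instead of towards infinity):

[code]
# Quick sketch: the terms of sum (-1)^n / n (which converges conditionally)
# can be rearranged so that the partial sums approach ANY chosen target.
# Greedy rule: while below the target add unused positive terms,
# while above it add unused negative terms.

target = 1.0
pos = (1.0 / n for n in range(2, 10**6, 2))    # +1/2, +1/4, +1/6, ...
neg = (-1.0 / n for n in range(1, 10**6, 2))   # -1, -1/3, -1/5, ...

s = 0.0
for _ in range(100000):
    s += next(pos) if s < target else next(neg)

print(s)   # ~ 1.0, though summed in the written order the series gives -ln(2) ~ -0.693
[/code]

Same terms, different order, different answer.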
 
  • #200
Alkatran said:
Therefore, to calculate the value of your system, you HAVE to take the limit as n approaches infinity, which means you always consider the leftmost (nth) ball, your forces cancel, and the problem disappears.

As I tried to point out, that's only one possible way to "approach the system": considering systems with n balls, calculating the total force on each ball in this n-ball system, summing the forces to get the force on the CoG, and finding 0 for each n. Then take n -> infinity over {0, 0, 0, ..., 0, ...}, which gives 0.

But another possible way gives you the paradox. Instead of taking the sequence of systems with n balls and letting n -> infinity, we can also consider directly the system with an infinite number of balls, and this time calculate the total force on ball number k, F_k. F_k can be shown to have the same sign, no matter what k is. We then consider the sum over the first n of these F_k, and now take the limit n -> infinity to find the total force. And this time, the sum doesn't go to 0.
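
In symbols, the two procedures amount to

[tex]\lim_{n \to \infty} \sum_{k=1}^{n} F_k^{(n)} = 0 \qquad \text{versus} \qquad \sum_{k=1}^{\infty} F_k^{(\infty)} \neq 0[/tex]

where F_k^(n) is the force on ball k in the n-ball system, and F_k^(infinity) is the force on ball k in the full infinite system.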

As you point out, this happens because we are rearranging a conditionally convergent series (but we already talked about that at the beginning of this thread). So yes, we do understand mathematically how it comes about that we get different results with different approaches. However, Newton's axioms don't tell us which is the "right" approach.

One could think that "adding balls one by one" is the obviously correct way, because that's what makes the CoG forceless.

However, I gave a counterexample (the space filled with a homogeneous mass density) where this time it is this technique that fails (the growing sphere with R -> infinity doesn't give you the right force on a test particle), and where it is the "sum over forces" technique that gives the right result.

So this means that, from case to case, in a Newtonian system, there are different approaches possible which give different answers, and it is not a priori clear which is the "right" answer. That's what is called an inconsistent axiomatic system.
(and again, as long as there is a finite number of balls, there's no problem, but we often use Newtonian mechanics outside of this scope).
 
  • #201
That's what is called an inconsistent axiomatic system.
Except people haven't even been working with an axiomatic system; they've only been working with an informally defined system. :frown: (But the idea is similar)

When working in an informally defined system, paradoxes aren't unexpected; they arise when you've implicitly made conflicting assumptions.

Of course, the usual procedure when that happens is to figure out what the conflicting assumptions are, so you can fix things, rather than broadcast to the world "Hey, look at me, I've made conflicting assumptions".
 
  • #202
Alkatran said:
Important point that's probably already been made: the sum of an infinite series is defined as the limit of the sum of its first n terms as n goes to infinity. We need this definition to avoid problems such as:

This series converges (conditionally)
[tex]
\sum_n=1^\infty \frac{(-1)^n}{n}
[/tex]

However, because the harmonic series 1/n diverges, we can rearrange the terms so that we get a series of the form

1/2 + 1/4 + 1/6 + 1/8 + ... (adding positive terms until we reach at least 3) - 1/1 + ... (adding until we reach at least 6) - 1/3 + ...
As you can see, this rearranged series diverges to infinity, even though it has the exact same elements as our convergent alternating series. The order we sum the elements in makes a HUGE difference!

Therefore, to calculate the value of your system, you HAVE to take the limit as n approaches infinity, which means you always consider the leftmost (nth) ball, your forces cancel, and the problem disappears.

Stuff like this is where the concept of absolute convergence comes into play. Your alternating series isn't absolutely convergent, so you can't just rearrange its terms and expect the sum to stay the same. Pick up any decent book on real analysis and it talks about this.
 
  • #203
Haha, it seems my post had a small typo where 1^infinity showed up in the equation (why isn't LaTeX rendered in the post preview?). It should have been n = 1 to infinity.
 
  • #204
Hurkyl said:
Of course, the usual procedure when that happens is to figure out what the conflicting assumptions are, so you can fix things, rather than broadcast to the world "Hey, look at me, I've made conflicting assumptions".

Ah? I thought it was more customary to broadcast "hey, look at me" :tongue2:
 
  • #205
OK. So nothing is wrong with this paradox, after all. It stands.

Where do you suggest I publish it? :cool:
 
  • #206
Tomaz Kristan said:
OK. So nothing is wrong with this paradox, after all. It stands.

Where do you suggest I publish it? :cool:

After having been told precisely what is wrong with it, you say this? Lovely.
 
  • #207
HallsofIvy said:
After having been told precisely what is wrong with it, you say this? Lovely.

Well, *I* don't see what's wrong with it.
Of course we know where the trouble comes from, mathematically: it is a conditionally convergent series. But knowing where it goes wrong does not mean that it doesn't go wrong.

We also know a simple way to avoid it: allow only for a finite number of point particles.

But if we go beyond this, and we

1) allow for a distribution of a finite amount of mass in a countably infinite number of pieces (like we do in continuum mechanics), then I see only one extra way to avoid the problem, which is:

2) don't allow for a divergent mass density

However, we now have a much more severe problem:
can we show that any distribution initially satisfying (1) and (2) will always satisfy (1) and (2)? If not, what extra conditions should we add so that we obtain a conserved set of conditions that avoids the appearance of a conditionally convergent series? And is the resulting permitted set of states not so severely crippled that it also prohibits us from using it practically?

Now, of course this has no implications for the practical use of Newtonian mechanics, which surely is not a formal system anyway. But it means that it will not be easy to turn it into a formal system that incorporates all the practical uses we make of it. So I would consider it an interesting curiosity in Newtonian physics.
 
  • #208
Tomaz Kristan said:
OK. So nothing is wrong with this paradox, after all. It stands.

Where do you suggest I publish it? :cool:

I hate it when people read the responses but don't pay attention to them. This has very little to do with Newtonian mechanics. It has everything to do with simple first year calculus.
 
  • #209
Alkatran said:
I hate it when people read the responses but don't pay attention to them. This has very little to do with Newtonian mechanics. It has everything to do with simple first year calculus.

You mean: it is a conditionally convergent series whose sum can be altered by altering the order in which you sum the terms?
Sure. That's what goes wrong: we obtain a conditionally convergent series, and the order has not been specified. In other words, the theory that spit this out (Newtonian mechanics), by applying its rules ("sum over all forces"), bungled it. That's all that is being claimed here.
 
  • #210
I'll just repeat what I said before. It's a nice curiosity, but it doesn't contain any physics. Newton's laws don't say anything about the existence of point particles or about when their use is valid. You have to apply that reasoning yourself, depending on the problem.
Newton thought everything was made of particles, but said nothing about the nature of these particles (like size or inner workings) or how many there are. It's physically clear you can't get an infinite number of them next to each other.
Continuum mechanics is just an approximation, a method for dealing with a large number of particles. Please don't mix up the mathematics with the physics.
 
