Throwing a ball sideways vs. dropping it

  • #1
So here's my physics conundrum:
I am on a ROUND Earth, in a vacuum. If I throw a ball really fast (say 99.9% of the orbital velocity for the ball's altitude), and drop another ball straight down, which takes longer to hit the ground?
My basic physics reasoning would say that it takes the same time, since from the thrown ball's reference frame, the distance to the ground is exactly the same whether it is thrown around the earth or dropped in one place. The thrown ball simply spirals around the earth to the ground, but effectively falls the same vertical distance as the dropped ball.
However, if you throw the ball fast enough, couldn't you get it into what is essentially a very slowly decaying orbit, thus making the thrown ball hit the ground after the dropped one?
How do I reconcile these two seemingly conflicting results, and what is the math behind it?
This is not homework or anything, just a thought I had today.
 

Answers and Replies

  • #2
Simon Bridge
Science Advisor
Homework Helper
The math in question is for the "central force problem".
Clearly, if the ball's tangential velocity were high enough, it would never hit the ground ... so the "equal times to fall to the ground" rule is only a rule of thumb and does not apply in all situations.
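
To put numbers on that, here is a minimal numerical sketch (mine, not from the thread) of the central force problem: it integrates inverse-square gravity for a ball dropped from rest and a ball thrown sideways at 99.9% of the local circular-orbit speed, both released from the same height, and reports how long each takes to reach the surface. The GM and radius values are standard; the 100 m release height and the step size are arbitrary illustrative choices.

```python
import math

G_M = 3.986e14      # Earth's gravitational parameter GM, m^3/s^2
R_EARTH = 6.371e6   # Earth's mean radius, m
ALTITUDE = 100.0    # release height above the surface, m (illustrative choice)

def time_to_ground(v_tangential, dt=1e-3, t_max=1e6):
    """Integrate 2D motion under inverse-square gravity until r <= R_EARTH."""
    x, y = 0.0, R_EARTH + ALTITUDE   # release point directly "above" the origin
    vx, vy = v_tangential, 0.0       # sideways throw is along +x
    t = 0.0
    while t < t_max:
        r = math.hypot(x, y)
        if r <= R_EARTH:
            return t
        a_over_r = -G_M / r**3       # acceleration vector = -(GM/r^3) * (x, y)
        # semi-implicit (symplectic) Euler step
        vx += a_over_r * x * dt
        vy += a_over_r * y * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return float("inf")              # never reached the ground within t_max

v_circ = math.sqrt(G_M / (R_EARTH + ALTITUDE))   # circular-orbit speed at release height
print("dropped from rest:         %.1f s" % time_to_ground(0.0))
print("thrown at 99.9%% of v_circ: %.1f s" % time_to_ground(0.999 * v_circ))
```

With these numbers the dropped ball lands after roughly 4.5 s, while the thrown one takes on the order of 100 s, so the "equal fall time" rule of thumb has already failed badly even before the ball is actually in orbit.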
 
  • #3
A.T.
Science Advisor
"My basic physics reasoning would say that it takes the same time"
That applies only in a uniform field, which is just an approximation of the actual radial field over small regions.
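
To spell that out (standard flat-Earth projectile kinematics, not something stated in the thread): in a uniform field the vertical motion decouples completely from the horizontal throw,
$$y(t) = h - \tfrac{1}{2} g t^{2} \quad\Rightarrow\quad t_{\text{fall}} = \sqrt{\frac{2h}{g}},$$
with no dependence on the horizontal speed ##v_x##. In the real radial field that decoupling, and with it the equal-fall-time rule, holds only as long as the flat-ground approximation does.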
 
  • #4
Andrew Mason
Science Advisor
Homework Helper
Welcome to PF, Romain!

If you throw the ball into orbit, it will never hit the ground. So if you throw it at 99.9% of orbital speed, it will take a long time to hit the ground. But it will still fall at the same rate as a ball that is dropped; it is just that the earth's surface keeps curving away, so the ball keeps having farther to "fall". At the earth's surface at the equator, orbital speed is about 5 miles/second. A ball will drop 16 feet in its first second whether it is dropped or thrown horizontally (assuming its passage through the air does not create lift), but over every 5 miles the earth's surface curves down by about 16 feet. So while the 5 miles/second thrown ball falls 16 feet in that first second, it then has another 16 feet to fall, and so on.
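
As a quick sanity check of those two numbers (my own back-of-the-envelope script, not part of the post above): the free-fall drop in one second and the amount the surface curves away below the horizontal over 5 miles can both be computed directly.

```python
import math

g = 32.2            # surface gravity, ft/s^2
R = 3959 * 5280     # Earth's mean radius in feet (~3959 miles)
d = 5 * 5280        # 5 miles, expressed in feet

drop_in_1s = 0.5 * g * 1.0**2              # free-fall distance after 1 s
curve_drop = R - math.sqrt(R**2 - d**2)    # how far the surface sits below the horizontal
                                           # at a horizontal offset d from the observer

print("free fall in 1 s:          %.1f ft" % drop_in_1s)   # about 16 ft
print("surface drop over 5 miles: %.1f ft" % curve_drop)   # about 16-17 ft
```

Both come out near 16 feet (about 16.1 ft and 16.7 ft), which is exactly why roughly 5 miles per second is the orbital speed at the surface.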

AM
 
