Throwing a ball sideways vs. dropping it

  • Context: Undergrad 
  • Thread starter: Romain Astie
  • Tags: Ball
SUMMARY

In a vacuum on a round Earth, a ball thrown horizontally at 99.9% of orbital velocity follows a very different trajectory from a ball dropped straight down. Both balls accelerate toward the Earth at the same rate, but the thrown ball's path curves around the planet, and the surface keeps curving away beneath it, so it has farther to fall before reaching the ground. If thrown fast enough, the ball enters what is essentially a very slowly decaying orbit and may never hit the ground at all, since it continuously falls while moving forward. This scenario is an instance of the central force problem in physics.

PREREQUISITES
  • Understanding of gravitational acceleration and free fall
  • Knowledge of orbital mechanics and tangential velocity
  • Familiarity with the central force problem in physics
  • Basic concepts of projectile motion
NEXT STEPS
  • Study the principles of gravitational acceleration and how they apply to different trajectories
  • Explore the mathematics of orbital mechanics, focusing on velocity and curvature
  • Investigate the central force problem and its implications in physics
  • Learn about projectile motion and its equations in a vacuum environment
USEFUL FOR

Physics students, educators, and anyone interested in understanding the dynamics of motion in gravitational fields and orbital mechanics.

Romain Astie
So here's my physics conundrum:
I am on a ROUND Earth, in a vacuum. If I throw a ball really fast (say 99.9% of the orbital velocity for the ball's altitude), and drop another ball straight down, which takes longer to hit the ground?
My basic physics reasoning would say that it takes the same time, since from the thrown ball's reference frame, the distance to the ground is exactly the same whether it is thrown around the Earth or dropped in one place. The thrown ball simply spirals around the Earth to the ground, but effectively falls the same vertical distance as the dropped ball.
However, if you throw the ball fast enough, couldn't you get it into what is essentially a very slowly decaying orbit, thus making the thrown ball hit the ground after the dropped one?
How do I reconcile these two seemingly conflicting results, and what is the math behind it?
This is not homework or anything, just a thought I had today.
 
The math in question is for the "central force problem".
Clearly, if the ball's tangential velocity were high enough, it would never hit the ground, so the "equal times to fall to the ground" rule is only a rule of thumb and does not apply in all situations.
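To make this concrete, here is a minimal numerical sketch of the central force problem (not from the thread; the point-mass Earth, the 100 m release altitude, and the step size are my assumptions). It integrates ##\ddot{\vec r} = -GM\,\vec r/r^3## for both balls and reports how long each takes to reach ##r = R##:
Code:
import math

# Sketch: compare fall times in an inverse-square field (assumed values).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth's mass, kg
R = 6.371e6     # Earth's radius, m
h = 100.0       # release altitude, m (my assumption)
r0 = R + h

def fall_time(v_tan):
    """Integrate r'' = -GM r/|r|^3 until |r| <= R; return elapsed time in s."""
    x, y = r0, 0.0         # start on the +x axis
    vx, vy = 0.0, v_tan    # a tangential throw points along +y here
    t, dt = 0.0, 1e-3
    while math.hypot(x, y) > R:
        r3 = math.hypot(x, y) ** 3
        ax, ay = -G * M * x / r3, -G * M * y / r3
        vx += ax * dt; vy += ay * dt   # semi-implicit Euler: velocity first,
        x += vx * dt;  y += vy * dt    # then position
        t += dt
    return t

v_circ = math.sqrt(G * M / r0)  # circular orbital speed at the release radius
print(f"dropped ball:           {fall_time(0.0):7.1f} s")            # ~4.5 s
print(f"thrown at 0.999 v_circ: {fall_time(0.999 * v_circ):7.1f} s") # ~100 s
On these assumptions the dropped ball lands in about 4.5 s, while the ball thrown at 0.999 of circular speed takes on the order of a hundred seconds: the centripetal demand of its tangential motion nearly cancels gravity's radial pull, so it descends the same 100 m far more slowly.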
 
Romain Astie said:
My basic physics reasoning would say that it takes the same time
That reasoning applies only in a uniform field, which is just a small-scale approximation of the actual radial field.
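In symbols (a standard comparison, not from the post): in a uniform field the vertical motion decouples from the horizontal motion, so
$$y(t) = h - \tfrac{1}{2} g t^2 \quad\Longrightarrow\quad t_{\text{fall}} = \sqrt{\frac{2h}{g}},$$
independent of the throw speed. In the true radial field,
$$\ddot{\vec r} = -\frac{GM}{r^{3}}\,\vec r,$$
the trajectory is a conic section with a focus at the Earth's center, and the time to descend to ##r = R## does depend on the tangential speed.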
 
Romain Astie said:
If I throw a ball really fast (say 99.9% of the orbital velocity for the ball's altitude), and drop another ball straight down, which takes longer to hit the ground?
Welcome to PF, Romain!

If you throw the ball into orbit, it will never hit the ground. So if you throw it at 99.9% of orbital speed, it will take a long time to hit the ground. But it still falls at the same rate as a dropped ball; it is just that the Earth's surface keeps curving away, so the thrown ball keeps having farther to "fall". At the Earth's surface at the equator, orbital speed is about 5 miles/second. A ball will drop 16 feet in one second whether it is dropped or thrown horizontally (assuming its passage through the air creates no lift). But over every five miles, the Earth's surface curves down 16 feet. So while the 5 mile/sec thrown ball falls 16 feet in that first second, the surface has dropped away 16 feet, leaving it another 16 feet to fall, and so on.

AM
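A quick numerical check of the figures above (a sketch; g ≈ 32.2 ft/s² and R ≈ 3959 miles are my assumed values, not from the post):
Code:
# Check the "16 feet per second" and "16 feet per 5 miles" figures.
g_ft = 32.2                                # gravitational acceleration, ft/s^2
R_mi = 3959.0                              # Earth's mean radius, miles

drop_1s = 0.5 * g_ft * 1.0**2              # distance fallen from rest in 1 s
curve_5mi = 5.0**2 / (2 * R_mi) * 5280     # sagitta d^2/(2R) over 5 mi, in ft

print(f"fall in the first second:    {drop_1s:.1f} ft")    # ~16.1 ft
print(f"curvature drop over 5 miles: {curve_5mi:.1f} ft")  # ~16.7 ft
Both come out near 16 feet, which is why a ball at orbital speed "keeps up" with the curving surface.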
 
Likes: Romain Astie and Simon Bridge
