Throwing a ball sideways vs. dropping it

  • Context: Undergrad
  • Thread starter: Romain Astie
  • Tags: Ball

Discussion Overview

The discussion revolves around the physics of throwing a ball sideways at high speed compared to dropping a ball straight down, specifically in the context of a round Earth in a vacuum. Participants explore the implications of these actions on the time it takes for each ball to hit the ground, considering factors such as orbital mechanics and gravitational fields.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant posits that both balls take the same time to hit the ground, reasoning that from the thrown ball's reference frame, the vertical distance to the ground remains constant regardless of its horizontal motion.
  • Another participant introduces the concept of the "central force problem," suggesting that if the tangential velocity of the thrown ball is high enough, it may never hit the ground, indicating that the equal time rule is not universally applicable.
  • A different participant notes that the reasoning about equal fall times applies only in a uniform gravitational field, which is an approximation and does not hold in the actual radial field of the Earth.
  • One participant explains that a ball thrown at full orbital speed never hits the ground, and one thrown at 99.9% of orbital speed takes much longer to do so, even though it falls at the same rate as the dropped ball; the Earth's surface continually curves away beneath it.
  • They provide an example: a ball thrown at 5 miles/second falls 16 feet in the first second, but the Earth's surface curves away by about 16 feet over those 5 miles, so the ball keeps having farther to fall.

Areas of Agreement / Disagreement

Participants express differing views on whether the time to hit the ground is the same for both scenarios. Some agree that the thrown ball's trajectory is influenced by its speed and the curvature of the Earth, while others challenge the assumption of equal fall times under varying conditions.

Contextual Notes

The discussion highlights the limitations of applying uniform gravitational field assumptions to a radial field and the complexities introduced by high-speed motion and orbital mechanics.

Romain Astie
So here's my physics conundrum:
I am on a ROUND Earth, in a vacuum. If I throw a ball really fast (say 99.9% of the orbital velocity for the ball's altitude), and drop another ball straight down, which takes longer to hit the ground?
My basic physics reasoning would say that it takes the same time, since from the thrown ball's reference frame, the distance to the ground is exactly the same whether it is thrown around the Earth or dropped in one place. The thrown ball simply spirals around the Earth to the ground, but effectively falls the same vertical distance as the dropped ball.
However, if you throw the ball fast enough, couldn't you get it into what is essentially a very slowly decaying orbit, thus making the thrown ball hit the ground after the dropped one?
How do I reconcile these two seemingly conflicting results, and what is the math behind it?
This is not homework or anything, just a thought I had today.
 
The math in question is for the "central force problem".
Clearly, if the ball's tangential velocity were high enough, it would never hit the ground ... so the "equal times to fall to the ground" rule is only a rule of thumb and does not apply in all situations.
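A minimal numerical sketch of that central-force problem (all values here are illustrative assumptions, not from the thread): integrate the two-body acceleration a = -GM r/|r|^3 with semi-implicit Euler and compare the fall time of a dropped ball with one thrown at 99.9% of circular orbital speed at the same altitude.

```python
import math

# Central-force sketch: compare time-to-ground for a dropped ball and a
# ball thrown horizontally at 99.9% of circular orbital speed.
# Release height and step size below are arbitrary assumptions.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth's mass, kg
R = 6.371e6     # Earth's mean radius, m

def time_to_ground(h, v_tangential, dt=0.01, t_max=1e5):
    """Time (s) for a ball released at altitude h (m) with horizontal
    speed v_tangential (m/s) to reach r = R, or None if it never does."""
    x, y = R + h, 0.0
    vx, vy = 0.0, v_tangential
    t = 0.0
    while t < t_max:
        r = math.hypot(x, y)
        if r <= R:
            return t
        # Acceleration points toward the center: a = -GM/r^3 * (x, y).
        a = -G * M / r**3
        vx += a * x * dt
        vy += a * y * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return None

h = 2.0                                  # release height: 2 m (assumption)
v_orb = math.sqrt(G * M / (R + h))       # circular orbital speed at altitude h
print(time_to_ground(h, 0.0))            # dropped: ~0.64 s
print(time_to_ground(h, 0.999 * v_orb))  # thrown: much longer
```

At exactly v_orb the function would never return a time (the orbit never intersects the surface), which is the limiting case Simon is describing.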
 
Romain Astie said:
My basic physics reasoning would say that it takes the same time
Applies only in a uniform field, which is just an approximation of the actual radial field for small areas.
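A quick back-of-the-envelope check of how "uniform" the field really is (the numbers are rough estimates, not from the thread): over a horizontal span d the direction of gravity rotates by about d/R radians, and since g scales as 1/r^2, its magnitude changes by roughly 2h/R for a height change h.

```python
import math

# How good is the uniform-field approximation near Earth's surface?
R = 6.371e6   # Earth's mean radius, m
d = 8047.0    # 5 miles, in meters

# Direction of "down" rotates by d/R radians over a span d.
print(math.degrees(d / R))    # ~0.07 degrees over 5 miles

h = 1000.0    # altitude change, m
# Fractional change in g is about 2h/R (from g ~ 1/r^2), in percent:
print(2 * h / R * 100)        # ~0.03% per km of altitude
```

Both effects are tiny over lab scales, which is why the uniform-field "equal fall times" rule works so well locally, yet they dominate once the ball travels a significant fraction of the way around the Earth.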
 
Romain Astie said:
So here's my physics conundrum:
I am on a ROUND Earth, in a vacuum. If I throw a ball really fast (say 99.9% of the orbital velocity for the ball's altitude), and drop another ball straight down, which takes longer to hit the ground?
My basic physics reasoning would say that it takes the same time, since from the thrown ball's reference frame, the distance to the ground is exactly the same whether it is thrown around the Earth or dropped in one place. The thrown ball simply spirals around the Earth to the ground, but effectively falls the same vertical distance as the dropped ball.
However, if you throw the ball fast enough, couldn't you get it into what is essentially a very slowly decaying orbit, thus making the thrown ball hit the ground after the dropped one?
How do I reconcile these two seemingly conflicting results, and what is the math behind it?
This is not homework or anything, just a thought I had today.
Welcome to PF Romain!

If you throw the ball into orbit, it will never hit the ground. So if you throw it at 99.9% of orbital speed, it will take a long time to hit the ground. But it will still fall at the same rate as a ball that is dropped; it is just that the Earth's surface keeps curving away, so the ball keeps having farther to "fall". At the Earth's surface, orbital speed is about 5 miles/second. A ball will drop 16 feet in one second whether it is dropped or thrown horizontally (assuming its passage through the air does not create lift). But over every 5 miles, the Earth's surface curves away by about 16 feet. So while the ball thrown at 5 miles/second falls 16 feet in that first second, it then has another 16 feet to fall, and so on.
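Those two "16 feet" figures check out numerically. As a sketch (the drop over a chord of length d is approximated by d^2/(2R), valid when d is much smaller than R):

```python
# Verify the 16-foot figures in the explanation above.
g = 32.174        # standard gravity, ft/s^2
R_miles = 3959.0  # Earth's mean radius, miles

# Free-fall drop in 1 second: (1/2) g t^2
print(0.5 * g * 1.0**2)       # ~16.1 ft

# Over a chord of d miles, the surface falls away by about d^2 / (2R).
d = 5.0
drop_miles = d**2 / (2 * R_miles)
print(drop_miles * 5280)      # ~16.7 ft over 5 miles
```

The near-coincidence of the two numbers is exactly Newton's cannonball argument: at about 5 miles/second, the ball falls just as fast as the ground curves away.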

AM
 
