I'm new to physics and I'm just trying to figure out the force (in Newtons) it would take to throw a baseball at 100 mph (not considering gravity acting on it, just the basic force you would need to throw it at that speed). I've already worked out that 100 mph equals a velocity of 44.704 m/s, that the distance from the pitcher's mound to home plate is 18.4404 meters, and that the ball would therefore take 0.4125 seconds to reach home plate. I also know that the mass of a baseball is about 0.145 kg. So I'm trying to fill in Force = Mass x Acceleration, but I guess I'm doing something wrong, because after dividing the velocity by the time (to get m/s²) I get a = 108.37333334 m/s².
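Here is exactly what I did, written out as a small Python sketch (this is just my own arithmetic from above spelled out, using the flight time to home plate as the time over which the ball speeds up, which may be where I'm going wrong):

```python
# Reproducing the calculation described above, with the numbers as stated.
mph_to_ms = 0.44704          # conversion factor: 1 mph = 0.44704 m/s
v = 100 * mph_to_ms          # target speed: 44.704 m/s
d = 18.4404                  # pitcher's mound to home plate, in meters
t = d / v                    # time to cover that distance at constant speed: 0.4125 s
m = 0.145                    # mass of a baseball, in kg

a = v / t                    # acceleration assuming the ball goes from 0 to v over that time: ~108.37 m/s^2
F = m * a                    # Force = Mass x Acceleration

print(f"t = {t:.4f} s, a = {a:.2f} m/s^2, F = {F:.2f} N")
```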
I just don't understand how the acceleration (per second squared) can be larger than the velocity (per second) if acceleration is the rate of change of velocity. Did I use the wrong formulas? Please help, thanks.