If you guys can help me understand this, that would be very helpful. When you drop an object, say a ball from rest, the initial velocity is 0. What would it mean if the initial velocity wasn't 0? And why would that take longer to hit the ground?
Numbers:
--- initial velocity set to 0 ---
Initial height: 20.0 m
Initial velocity: 0 m/s
Gravity: 10.0 m/s/s
t = 2.00 s
y = 0.00 m
v = -20.00 m/s
--- initial velocity not 0 ---
Initial height: 20.0 m
Initial velocity: 2.0 m/s
Gravity: 10.0 m/s/s
t = 2.21 s
y = 0.00 m
v = -20.10 m/s

The reason I ask this question is because in lab we were doing a 1-dimensional problem where we dropped a ruler and wanted to calculate the time it fell. I initially wanted to set the final velocity to 0 (hit ground) and find the initial velocity from my givens (vf, a, change in y), but it turned out that by doing that I was changing the problem and therefore would get the wrong results.
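For what it's worth, the two sets of numbers above can be reproduced by solving the constant-acceleration equation y = h0 + v0*t - (1/2)g*t² for the time when y = 0. Here is a minimal sketch, assuming up is positive and g = 10 m/s/s as in the numbers above (the function names `fall_time` and `impact_speed` are just mine for illustration):

```python
import math

def fall_time(h0, v0, g=10.0):
    """Time for an object launched from height h0 with upward
    velocity v0 to reach y = 0.

    Solves 0 = h0 + v0*t - 0.5*g*t**2 and keeps the positive root.
    """
    disc = v0**2 + 2.0 * g * h0      # discriminant of the quadratic
    return (v0 + math.sqrt(disc)) / g

def impact_speed(v0, t, g=10.0):
    """Signed velocity at time t (negative means moving downward)."""
    return v0 - g * t

t1 = fall_time(20.0, 0.0)  # dropped from rest
t2 = fall_time(20.0, 2.0)  # tossed upward at 2 m/s first
print(round(t1, 2), round(impact_speed(0.0, t1), 2))  # 2.0 -20.0
print(round(t2, 2), round(impact_speed(2.0, t2), 2))  # 2.21 -20.1
```

Note how the upward toss adds about 0.21 s (the ball first rises, then falls back past the release height) but barely changes the impact speed.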