1. The problem statement, all variables and given/known data

I have a projectile that shoots straight up into the air (parallel to the y-axis). I need to calculate the initial velocity needed to reach a specific height. I am given the following:

- the y-displacement (s1) to reach
- the time (t) at which the projectile reaches the specified height
- the acceleration due to gravity (a = -9.8 m/s^2)

Additionally, after time (t) the velocity should be zero and the projectile should be at the specified height. In other words, it takes (t) seconds to reach the "apex."

2. Relevant equations

Nothing given, but I suspect:

s1 = s0 + (v0 * t) + (0.5 * a * t * t)

3. The attempt at a solution

Let's say:

- s0 = 0 m
- s1 = 1 m
- t = 0.25 s
- a = -9.8 m/s^2
- v0 = ?

I thought I'd solve for v0 in the equation above. That gives:

v0 = s1 / t - (0.5 * a * t)
v0 = 1 / 0.25 - (0.5 * -9.8 * 0.25)
v0 = 4 - (-1.225)
v0 = 5.225 m/s

That seems valid, but now suppose I need to update the velocity 30 times per second. This means I start with an initial velocity of 5.225 m/s, and on each "tick" I need to decrement the velocity by some amount, so that after 0.25 seconds the velocity has ticked down to zero. I am guessing I would use v1 = v0 + (a * t) per "tick" for this, correct? Where v0 is always 5.225 m/s and (t) increments by (1/30) each tick? (Rough code sketches of both steps are below.)
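In case it helps, here is a rough sketch of the v0 calculation above (Python, just for illustration; the function name initial_velocity is something I made up):

```python
# Sketch: solve s1 = s0 + v0*t + 0.5*a*t^2 for v0.
def initial_velocity(s0, s1, t, a):
    """Initial velocity needed to cover (s1 - s0) in time t under constant acceleration a."""
    return (s1 - s0) / t - 0.5 * a * t

v0 = initial_velocity(s0=0.0, s1=1.0, t=0.25, a=-9.8)
print(v0)  # 5.225
```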
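And here is a sketch of the per-tick update rule I'm describing (again Python, purely illustrative); whether this is the right rule is really what I'm asking:

```python
# Sketch: update velocity 30 times per second using v = v0 + a*t,
# where v0 stays fixed and t is the accumulated time since launch.
v0 = 5.225        # m/s, from the calculation above
a = -9.8          # m/s^2
dt = 1.0 / 30.0   # seconds per tick (30 updates per second)

t = 0.0
v = v0
while t < 0.25:
    t += dt
    v = v0 + a * t  # recompute from the fixed v0 each tick
    print(f"tick at t = {t:.4f} s -> v = {v:.4f} m/s")
```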