1. The problem statement, all variables and given/known data

At t = 0 a ball, initially at rest, starts to roll down a ramp with constant acceleration. You notice it moves 1 foot between t = 0 seconds and t = 1 second. How far does it move between t = 1 second and t = 2 seconds?

2. Relevant equations

Δx = v_i·t + (1/2)a·t²

Since it looks like constant speed, wouldn't the acceleration be 0?

3. The attempt at a solution

It seems like it would be 1 foot, right?

Δx = 1(1) + (1/2)(0)(1)² = 1 foot?
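
As a cross-check of the attempt above: setting a = 0 contradicts the problem statement, which says the acceleration is constant and nonzero (the ball starts at rest, so with a = 0 it would never move at all). A minimal sketch under the constant-acceleration reading, using the same equation Δx = v_i·t + (1/2)a·t² with v_i = 0, where the value a = 2 ft/s² is inferred from the given 1-foot displacement in the first second:

```python
# Constant-acceleration kinematics: x(t) = v_i*t + 0.5*a*t^2, starting at rest (v_i = 0).
# Given: the ball covers 1 ft between t = 0 and t = 1 s, so 1 = 0.5*a*(1)^2  =>  a = 2 ft/s^2.

def position(t, a=2.0):
    """Position in feet at time t (seconds), starting from rest at the origin."""
    return 0.5 * a * t**2

first_interval = position(1) - position(0)   # displacement from t = 0 to t = 1 s
second_interval = position(2) - position(1)  # displacement from t = 1 to t = 2 s

print(first_interval)   # matches the given 1 ft
print(second_interval)  # distance covered in the second interval
```

Because position grows with t², equal time intervals cover increasing distances (1, 3, 5, ... times the first interval), so the second interval is not 1 foot.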