1. The problem statement, all variables and given/known data

An arrow is fired horizontally off a cliff. The initial speed of the arrow is 250 ft/s and the cliff is 22 ft high. How far from the base of the cliff does the arrow land?

2. Relevant equations

y = (1/2)a(t^2)
x = vt

3. The attempt at a solution

Vertical, taking a = g = 32.2 ft/s^2 so the units stay consistent with the distances in feet:

y = (1/2)a(t^2)
t = √(2y/a)
t = √(2 × 22 / 32.2)
t ≈ 1.17 s

Horizontal:

x = vt
x = 250(1.17)
x ≈ 292 ft

Is this answer correct? Thanks in advance!
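As a quick sanity check on the arithmetic, here is a minimal Python sketch of the same two-step calculation. The value g = 32.2 ft/s^2 is an assumed standard gravity in US customary units; it is not stated in the problem.

```python
import math

v0 = 250.0  # initial horizontal speed, ft/s (given)
h = 22.0    # cliff height, ft (given)
g = 32.2    # gravitational acceleration, ft/s^2 (assumed standard value)

# Vertical: h = (1/2) g t^2, solved for the fall time t
t = math.sqrt(2.0 * h / g)

# Horizontal: constant speed, so x = v0 * t
x = v0 * t

print(f"fall time t = {t:.2f} s")             # ~1.17 s
print(f"horizontal distance x = {x:.0f} ft")  # ~292 ft
```

Keeping every quantity in feet and seconds is the point of the unit comments: mixing feet with g = 9.8 m/s^2 would inflate the fall time to about 2.12 s and the range to about 530 ft.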