# Bows and Arrows

An arrow, while being shot from a bow, was accelerated over a distance of 2.0 ft. If its speed at the moment it left the bow was 200 ft/sec, what was the average acceleration imparted by the bow? Justify any assumptions you need to make.

Ok, so I know the arrow leaves the bow at $v = 200 \frac{ft}{sec}$. At first I figured $t = \frac{2.0}{200} = \frac{1}{100}$ second, but that would only be right if the arrow moved at 200 ft/sec the whole way. So would I use the equation $x = x_{0} + v_{x0}t + \frac{1}{2}a_{x}t^{2}$? Plugging in gives $2 = v_{x0}t + \frac{1}{2}a_{x}t^{2}$, but with $t = \frac{1}{100}$ that becomes $2 = 2 + \frac{1}{2}a_{x}(\frac{1}{100})^{2}$, which doesn't make any sense. Maybe I need to make some assumptions?
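One way to sanity-check the time is to assume constant acceleration from rest (that's the assumption, it isn't stated in the problem): then the average speed over the 2.0 ft is $\frac{0 + 200}{2} = 100 \frac{ft}{sec}$, and the time in the bow follows from distance over average speed. A quick Python sketch:

```python
# Sketch assuming constant acceleration and a start from rest
# (these are assumptions, not given in the problem statement).
x = 2.0          # ft, distance over which the arrow is accelerated
v_final = 200.0  # ft/sec, speed when the arrow leaves the bow

v_avg = (0.0 + v_final) / 2.0  # average speed under constant acceleration
t = x / v_avg                  # time the arrow spends being accelerated

print(t)  # 0.02 s, i.e. 1/50 second, not 1/100
```

So under the constant-acceleration assumption the arrow is in contact with the string for 1/50 of a second, not 1/100.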

Thanks


Indeed, $\overline a = \frac{v_{2}-v_{1}}{t_{2}-t_{1}} = \frac{\Delta v}{\Delta t}$. So I assume the starting speed is 0 and the acceleration is constant. Then the average speed over the 2.0 ft is $\frac{0+200}{2} = 100 \frac{ft}{sec}$, so $\Delta t = \frac{2.0}{100} = \frac{1}{50}$ second, and $\overline a = \frac{200}{\frac{1}{50}} = 10{,}000 \frac{ft}{sec^{2}}$. Is this correct?
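A time-free cross-check: under the same assumptions (constant acceleration, rest start), $v^{2} = v_{0}^{2} + 2a\Delta x$ gives the average acceleration without ever computing $\Delta t$. A minimal sketch:

```python
# Numeric check, assuming constant acceleration and a start from rest.
v0, v, dx = 0.0, 200.0, 2.0      # ft/sec, ft/sec, ft
a = (v**2 - v0**2) / (2.0 * dx)  # from v^2 = v0^2 + 2*a*dx

print(a)  # 10000.0 ft/sec^2
```

This agrees with $\frac{\Delta v}{\Delta t}$ once $\Delta t$ is taken as distance over average speed, so 10,000 ft/sec² looks right.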