Okay, this is not a homework problem, and I think if I really played with it long enough I could work something out. But really, I just want an answer, so I'm hoping someone on here who loves solving problems like this can knock it out. It would be much appreciated.

I'm wondering about a sight bar setting for my bow (archery). Here's what I know. With the sight in a certain position, I shoot dead center at 9 meters. With the sight set precisely 1 cm lower, I hit the center of the target at 18 meters; lowering the aperture raises the angle of the bow at arrow release. The sight aperture (my aim point) is 99.2 cm from my eye. On the two shots, the bow and arrows are the same, so the force, initial velocity, and projectile weight are the same. I don't know the initial velocity or the arrow weight. I would neglect drag, though I know that makes this slightly less accurate. I also recognize that's not a lot of information, but my gut tells me that the difference between two sight settings, with everything else held constant, is probably enough to solve it for someone who knows how.

Now the question: how much would I have to lower the sight aperture (and therefore raise the bow angle) to hit a target at 90 meters?
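In case it helps frame the problem, here's my rough understanding of the geometry, under the assumptions above (no drag, target at roughly the same height as the arrow, small angles; the symbols are just my own labels):

$$
\Delta\theta \;\approx\; \frac{\Delta s}{d},
\qquad
\theta(R) \;\approx\; \frac{gR}{2v^2},
$$

where $\Delta s$ is how far the aperture is moved, $d = 99.2\ \text{cm}$ is the eye-to-aperture distance, $\theta(R)$ is the launch angle needed to hit a target at range $R$, $v$ is the arrow speed, and $g$ is gravity. If that's right, the 1 cm change between 9 m and 18 m should be enough to pin down $v$, and the same relations would then give the setting for 90 m. But I'd much rather have someone who knows this stuff check the setup and work it out than trust my own algebra.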