We are trying to measure the acceleration due to gravity with a high-speed camera. The setup is as follows: a 1/4 inch ball bearing is shot at 6 m/s across a half-meter grid. The high-speed camera is 1.8 meters away, perpendicular to the grid, and records the ball at 250 frames per second. We then analyze the recording on a computer, which gives an x and y coordinate for the ball in each frame. Since we know the time of each frame, we should be able to extract the acceleration due to gravity from the y coordinates. However, our measurements systematically come out around 9 m/s^2 instead of 9.8 m/s^2. Does anyone have an idea why this might occur?
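For reference, here is a minimal sketch of the kind of analysis we are doing: fit the tracked vertical positions to a quadratic in time and read gravity off the leading coefficient. The numbers below (initial height, launch speed, flight duration) are made up for illustration; our real input is the per-frame y coordinates from the tracking software.

```python
import numpy as np

FPS = 250.0        # camera frame rate
G_TRUE = 9.81      # value we expect to recover

# Simulated vertical positions (metres) over 0.2 s of flight, sampled at the
# camera's frame rate -- a stand-in for the tracked y coordinates per frame.
t = np.arange(0.0, 0.2, 1.0 / FPS)
y0, vy0 = 0.40, 0.50                      # hypothetical launch height and vertical speed
y = y0 + vy0 * t - 0.5 * G_TRUE * t**2

# Fit y(t) = a*t^2 + b*t + c; the acceleration is 2a, so g = -2a.
a, b, c = np.polyfit(t, y, 2)
g_measured = -2.0 * a
print(g_measured)
```

With noiseless synthetic data this recovers 9.81; the puzzle is that our real footage gives a consistently low value.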