I am doing another lab report on motion. For this lab we used a motion detector connected to a computer to collect and display data about an object in motion. The question I am having trouble with is: how does the computer calculate velocity and acceleration from the position data? (BTW, the position is sampled every 0.05 sec.) My first thought was that the computer was finding average velocity and acceleration using displacement/time and change in velocity/time. My second thought was that the computer fit a best-fit line to the data and took its slope. I've tried both of these methods and neither matches up with the data. Any ideas?
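One possibility worth checking (this is an assumption about the software, not something stated in the lab): instead of a simple forward difference like displacement/time, many data loggers estimate each point's velocity with a *central* difference, using the neighbors on both sides of the point, and then repeat the same procedure on the velocity values to get acceleration. A quick sketch of that idea:

```python
dt = 0.05  # sampling interval in seconds, from the lab setup


def central_difference(samples, dt):
    """Estimate the derivative at each interior point using both neighbors.

    Returns one value per interior point: (x[i+1] - x[i-1]) / (2*dt).
    This differs from the forward difference (x[i+1] - x[i]) / dt, which
    may be why a simple displacement/time calculation doesn't match.
    """
    return [(samples[i + 1] - samples[i - 1]) / (2 * dt)
            for i in range(1, len(samples) - 1)]


# Test data: uniform acceleration a = 2.0 m/s^2, so x = 0.5 * a * t^2
a = 2.0
positions = [0.5 * a * (i * dt) ** 2 for i in range(6)]

velocities = central_difference(positions, dt)       # tracks v = a * t
accelerations = central_difference(velocities, dt)   # roughly constant = a
```

You could try this on a few rows of your exported data table and see whether the numbers line up better than your two earlier attempts; some programs use a slightly wider window (a local linear fit over 3-7 points), which is the same idea smoothed over more samples.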