# How does a computer calculate acceleration and velocity?

I am doing another lab report on motion. For this lab we used a motion detector connected to a computer to collect and display data about an object in motion. The question I am having trouble with is: how does the computer calculate velocity and acceleration from the position data? (BTW, the position is sampled every 0.05 s.) My first thought was that the computer finds average velocity and acceleration using displacement/time and change in velocity/time. My second thought was that it finds the best linear fit and takes the slope of that line. I've tried both of these methods and neither matches the data. Any ideas?

Thanks,
hk
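For what it's worth, the two guesses in the question can be sketched in a few lines of Python. This is just an illustration of the two candidate methods, not the detector software's actual algorithm; the function names and the 5-point window are my own choices:

```python
# Two ways a program might estimate velocity from evenly spaced
# position samples taken every dt seconds (0.05 s here).

dt = 0.05

def avg_velocity(x, i):
    """Centered average rate: slope between the two neighbors of point i,
    i.e. displacement / elapsed time over one sample on each side."""
    return (x[i + 1] - x[i - 1]) / (2 * dt)

def fit_slope(t, x, i, half_width=2):
    """Least-squares slope of a straight line fit through the points
    around index i (a 5-point window with the default half_width=2)."""
    lo, hi = i - half_width, i + half_width + 1
    ts, xs = t[lo:hi], x[lo:hi]
    n = len(ts)
    t_mean = sum(ts) / n
    x_mean = sum(xs) / n
    num = sum((a - t_mean) * (b - x_mean) for a, b in zip(ts, xs))
    den = sum((a - t_mean) ** 2 for a in ts)
    return num / den
```

Acceleration would then come from applying the same estimator again, this time to the velocity column instead of the position column.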

## Answers and Replies

Do you know the motion detector make/model? What does it use for motion sensing?

I don't know the make or model, I do know it uses sound waves for motion sensing.

hk

I'm not sure about your motion detector, but speed cameras use the Doppler effect.

Google it for more info, but it's what causes a car to sound different as it approaches you.

can you give some sample data?

gnpatterson said:
can you give some sample data?
| Time (s) | Position (m) | Velocity (m/s) | Acceleration (m/s^2) |
|----------|--------------|----------------|----------------------|
| 1.50     | 0.300        | 0.236          | 0.136                |
| 1.55     | 0.311        | 0.242          | 0.132                |
| 1.60     | 0.323        | 0.254          | 0.086                |
| 1.65     | 0.338        | 0.257          | -0.005               |
| 1.70     | 0.350        | 0.250          | -0.043               |

hk
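As a quick check, here is a simple centered difference applied to the posted position column (my own sketch, dt = 0.05 s). It gives roughly 0.23 to 0.27 m/s at the interior points, which is close to but not equal to the table's 0.242, 0.254, and 0.257. That mismatch is consistent with the software fitting a line through more than two neighboring points, as some data-logger packages reportedly do, rather than taking a simple difference quotient:

```python
# Centered differences on the posted sample data (dt = 0.05 s).
dt = 0.05
pos = [0.300, 0.311, 0.323, 0.338, 0.350]  # positions at t = 1.50 .. 1.70 s

# Velocity estimate at each interior point: slope between its neighbors.
vel = [(pos[i + 1] - pos[i - 1]) / (2 * dt) for i in range(1, len(pos) - 1)]
print(vel)  # estimates at t = 1.55, 1.60, 1.65 s
```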

HallsofIvy