I've had trouble with this for a while, and I asked my professor about it today, but it still doesn't really make sense. Say an object moves with an initial velocity of 8 m/s for two seconds, ending with a final velocity of -8 m/s (with constant acceleration):

- At 0 seconds, velocity is 8 m/s
- At 1 second, velocity is 0 m/s
- At 2 seconds, velocity is -8 m/s

How do I correctly calculate the average speed and distance traveled?

My professor told me that you can simply use the equation (Vf + Vo) / 2 to calculate it since the acceleration is constant, and (using the magnitudes of the velocities) this comes out to be 8 m/s. This doesn't make sense to me, because the object is only traveling at 8 m/s at the initial and final instants of the motion. Average speed is the speed the object traveled on average during the motion, and I don't see how it could average 8 m/s if it was only 8 m/s at two instants. It seems like it should be lower.

Also, to find the distance traveled, you can simply multiply the average speed by the time interval. If I go with what my professor says the average speed is (8 m/s), then it is simply 8 * 2 = 16 m. However, if you split the problem into two parts and find the distance traveled from t0 to t1 and from t1 to t2, you get an average speed of 4 m/s for each part, which results in a combined distance of only 8 m. The split method is the one I used to solve the problem because it makes more sense to me.

So, am I right or is my professor right? If I'm wrong, please explain extremely clearly how I'm wrong, because I'm really not getting it.

One more note: it seems illogical to me that an object traveling with a constant velocity of 8 m/s for two seconds has an average speed of 8 m/s, while an object that begins at 8 m/s, slows down, then speeds up again to 8 m/s would also have an average speed of 8 m/s.
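To sanity-check my split method, I also tried a quick numerical sketch (my own assumption here: constant acceleration a = -8 m/s², so v(t) = 8 - 8t over the two seconds), integrating |v(t)| for distance and v(t) for displacement:

```python
# Numerical check, assuming v(t) = 8 - 8*t m/s on [0, 2] s
# (constant acceleration a = -8 m/s^2, my assumption above).
N = 200_000          # number of small time steps
dt = 2.0 / N         # step size in seconds

distance = 0.0       # integral of |v(t)| dt  -> distance traveled
displacement = 0.0   # integral of  v(t) dt   -> net displacement
for i in range(N):
    t = (i + 0.5) * dt       # midpoint of each small interval
    v = 8.0 - 8.0 * t        # velocity at that instant
    distance += abs(v) * dt
    displacement += v * dt

print(distance)             # ~ 8 m  (matches my split method)
print(distance / 2.0)       # ~ 4 m/s average speed
print(displacement / 2.0)   # ~ 0 m/s average velocity
```

This gives a distance of about 8 m and an average speed of about 4 m/s, agreeing with my split-into-two-parts answer, while the average *velocity* comes out to 0, which is neither 8 nor 4. That's part of why I'm confused about what (Vf + Vo) / 2 is actually supposed to compute.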