Ok, let me just type the question and then get to the real question I have. "If you traveled one mile at a speed of 100 miles per hour and another mile at a speed of 1 mile per hour, your average speed would not be (100 mph + 1 mph)/2, or 50.5 mph. What would be your average speed? (Hint: what is the total distance and total time?)"

So I worked it out: I solved for the individual times and got 1/100 hr for the first mile and 1 hr for the second. Then I added those times (1.01 hr total) and divided the total distance, 2 miles, by that total time, and got about 1.98 mph as the average speed.

My question: why CAN'T you do (100 + 1)/2? Or rather, why don't you get the correct answer that way? Doesn't it make sense, if they're asking for your average speed, to average out the two speeds they gave you? Also, one of the speeds is 100 while the other is 1, so it makes no sense to me that the "final" average speed would be 1.98. Can someone make sense of this for me? It just seems illogical... or maybe, since it's just my first week in Pre-AP Physics, I'm not seeing something critical?
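
In case it helps to see the arithmetic laid out, here's a small Python sketch (my own illustration, not from the textbook) of the "total distance over total time" definition of average speed. The key point it shows: because you spend far more *time* at 1 mph than at 100 mph, the time-weighted average is pulled down close to 1, not up toward 50.5:

```python
def average_speed(distances, speeds):
    """Average speed = total distance / total time.

    Each leg contributes time = distance / speed, so slow legs
    dominate the total time and drag the average down.
    """
    total_distance = sum(distances)
    total_time = sum(d / s for d, s in zip(distances, speeds))
    return total_distance / total_time

# Two one-mile legs at 100 mph and 1 mph:
# total time = 1/100 hr + 1 hr = 1.01 hr
# average   = 2 miles / 1.01 hr ≈ 1.98 mph  (not 50.5 mph)
avg = average_speed([1, 1], [100, 1])
print(round(avg, 2))  # 1.98
```

The naive (100 + 1)/2 would only be right if you spent equal *time* at each speed; here you spend equal *distance*, which is a different weighting (for equal distances it's the harmonic mean of the speeds).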