## Difference between calculated slope and slope given on the graph?

1. The problem statement, all variables and given/known data
When the question asks "How close is the slope (from an x-t graph) you calculated to the value actually displayed on the graph?", does this simply mean you subtract your calculated velocity (the slope of the x-t graph) from the slope given on the graph?

2. Relevant equations
My calculated slope is 0.2460 m/s.
The slope given on the graph is 0.2487 m/s.
So the difference between these should be 0.2487 - 0.2460 = 0.0027 m/s?

3. The attempt at a solution
I am unsure whether I need to use the percent error formula or just subtract the two slopes to get the difference. Actually, I am unsure of when you are supposed to use the percent error formula. Does the percent error formula have anything to do with this particular problem?
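For reference, the two quantities being compared are straightforward to compute. The sketch below uses the slope values from this thread; the percent error formula shown (|calculated − accepted| / accepted × 100) is the standard form, with the graph's slope taken as the accepted value.

```python
# Slopes from the problem statement, in m/s
calc_slope = 0.2460   # slope you calculated from the x-t graph
graph_slope = 0.2487  # slope displayed on the graph (treated as the accepted value)

# Absolute difference: how far apart the two slopes are, in m/s
difference = graph_slope - calc_slope

# Percent error: the difference expressed as a fraction of the accepted value
percent_error = abs(calc_slope - graph_slope) / graph_slope * 100

print(f"difference    = {difference:.4f} m/s")
print(f"percent error = {percent_error:.2f} %")
```

The difference answers "how far apart are they?" in physical units; the percent error answers "how far apart are they, relative to the accepted value?" and is unitless, which makes it easier to compare across experiments with different scales.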

Homework Help reply: I think a good approach would be to estimate the uncertainty in your calculated slope from your measurements, and then determine whether the graph's slope lies within the resulting range of acceptable values. If it doesn't, then you may want to recheck your assumptions about your measurement errors.
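The check the reply describes can be sketched as follows. The uncertainty value used here (±0.0050 m/s) is purely hypothetical; in practice it would come from propagating your own measurement errors (ruler/timer resolution, fit scatter, etc.) through the slope calculation.

```python
calc_slope = 0.2460   # slope calculated from measurements, in m/s
slope_unc = 0.0050    # HYPOTHETICAL uncertainty in that slope, in m/s
graph_slope = 0.2487  # slope displayed on the graph, in m/s

# The graph's slope is consistent with the measurement if it lies
# within calc_slope +/- slope_unc
consistent = abs(graph_slope - calc_slope) <= slope_unc

low, high = calc_slope - slope_unc, calc_slope + slope_unc
print(f"acceptable range: [{low:.4f}, {high:.4f}] m/s")
print(f"graph slope consistent with measurement: {consistent}")
```

With this assumed uncertainty the graph's slope (0.2487 m/s) falls inside the range, so the two values would be considered consistent; with a much smaller uncertainty it would not, which is the signal to recheck the error assumptions.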
