I have (what may be) a simple statistics question.

Say I perform an experiment in which I make a number of measurements over a given interval (e.g. from t = 0 s to t = 10 s, every 1 s), and I repeat this experiment many times.

Now, let's say I make a plot of data vs. time, and I want to find the average time at which the data peaks.

Which approach would give a more accurate estimate of the peak time: averaging the individual data points across all the runs and then finding the peak time of the averaged curve, or finding the peak time within each run and then averaging those peak times over all the runs?
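One way to get a feel for the difference is a quick Monte Carlo simulation. The sketch below is purely illustrative: it assumes a hypothetical Gaussian-shaped signal with a true peak at t = 4 s, sampled every 1 s from t = 0 to 10 s with additive Gaussian noise, and compares the spread of the two estimators over many repeated experiments.

```python
# Monte Carlo comparison of the two peak-time estimators.
# Assumed (made-up) setup: Gaussian-shaped signal peaking at t = 4.0 s,
# sampled at t = 0..10 s (every 1 s) with additive Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 11.0)            # measurement times, every 1 s
true_peak = 4.0
signal = np.exp(-0.5 * ((t - true_peak) / 2.0) ** 2)

n_runs, n_trials, noise = 50, 2000, 0.2
avg_then_peak, peak_then_avg = [], []
for _ in range(n_trials):
    # one "experiment" = n_runs noisy repetitions of the measurement
    data = signal + noise * rng.standard_normal((n_runs, t.size))
    # (a) average the runs pointwise, then locate the peak of the mean curve
    avg_then_peak.append(t[np.argmax(data.mean(axis=0))])
    # (b) locate each run's peak, then average those peak times
    peak_then_avg.append(t[np.argmax(data, axis=1)].mean())

print("average-then-peak: mean %.3f s, std %.3f s"
      % (np.mean(avg_then_peak), np.std(avg_then_peak)))
print("peak-then-average: mean %.3f s, std %.3f s"
      % (np.mean(peak_then_avg), np.std(peak_then_avg)))
```

With these particular (assumed) numbers, averaging first suppresses the noise before the peak is located, so estimator (a) tends to scatter less from trial to trial; how the two compare in general depends on the noise level, the peak shape, and the sampling grid.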

If anyone can nudge me in the right direction, it would be appreciated.

Thanks.

EDIT: I just read the sticky in this section. This isn't a homework question; I'm just wondering which approach works better. But if it's better suited to the homework section, by all means move it there.

**Physics Forums | Science Articles, Homework Help, Discussion**


# Mean of data points or mean of peak
