marcusesses
Say I perform an experiment in which I make a number of measurements over a given interval (e.g. t = 0 s to t = 10 s, every 1 s), and I repeat this experiment many times.

Now, let's say I make a plot of data vs. time, and I want to find when the data peaks in time on average.

Which method would give a more accurate estimate of the peak time: averaging the individual data points across all the runs and then determining the peak time of the averaged curve, or determining the peak time within each run and then averaging those peak times over all the runs?
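Just to make the two procedures concrete, here is a minimal sketch of what I mean, using made-up synthetic data (a noisy Gaussian-shaped signal peaking at t = 5 s; the peak location, noise level, and number of runs are all arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.arange(0, 11)             # measurement times: t = 0 s ... 10 s, every 1 s
true_peak = 5.0
signal = np.exp(-0.5 * (t - true_peak) ** 2 / 2.0 ** 2)  # underlying curve

n_runs = 200
# each row is one repetition of the experiment, with independent noise
data = signal + rng.normal(scale=0.2, size=(n_runs, t.size))

# Method 1: average the runs point-by-point, then locate the peak once
avg_curve = data.mean(axis=0)
peak_of_average = t[np.argmax(avg_curve)]

# Method 2: locate the peak in each run, then average those peak times
peaks_per_run = t[np.argmax(data, axis=1)]
average_of_peaks = peaks_per_run.mean()

print(peak_of_average, average_of_peaks)
```

Both numbers should come out near 5 s here; the question is which estimator behaves better in general.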

If anyone can nudge me in the right direction, it would be appreciated.

Thanks.

EDIT: I just read the sticky in this section... This isn't a homework question; I'm just wondering which approach works better. But if it's better suited to the homework section, by all means move it there.