# Mean of data points or mean of peak

I have (what may be) a simple statistics question...

Say I perform an experiment in which I make a number of measurements over a given interval (e.g. from t = 0 s to t = 10 s, every 1 s), and I repeat this experiment many times.

Now, let's say I plot the data vs. time, and I want to find the time at which the data peaks, on average.

Which approach would give a more accurate estimate of the peak time: averaging the individual data points across all the runs and then finding the peak time of the averaged curve, or finding the peak time within each run and then averaging those peak times over all the runs?
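To make the two options concrete, here is a minimal simulation sketch of what I mean. The Gaussian signal shape, the noise level, and the number of runs are just placeholders I picked for illustration, not anything specific to my experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder setup: a Gaussian-shaped signal peaking at t = 4 s,
# sampled every 1 s from t = 0 s to t = 10 s, with additive noise.
t = np.arange(0, 11, 1.0)
true_peak = 4.0
signal = np.exp(-0.5 * ((t - true_peak) / 2.0) ** 2)

n_runs = 1000
runs = signal + rng.normal(0.0, 0.1, size=(n_runs, len(t)))

# Option 1: average the data points across runs, then find the peak time
# of the averaged curve.
peak_of_mean = t[np.argmax(runs.mean(axis=0))]

# Option 2: find the peak time within each run, then average those
# peak times over all runs.
mean_of_peaks = t[np.argmax(runs, axis=1)].mean()

print(f"peak of averaged data:    {peak_of_mean:.2f} s")
print(f"average of per-run peaks: {mean_of_peaks:.2f} s")
```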

If anyone can nudge me in the right direction, it would be appreciated.

Thanks.

EDIT: I just read the sticky in this section... This isn't a homework question; I'm just wondering which would work better. But if it's more suited to the homework section, by all means move it there.
