
Mean of data points or mean of peak

  1. Feb 12, 2009 #1
    I have (what may be) a simple statistics question...

    Say I perform an experiment in which I make a number of measurements over a given interval (e.g. from t = 0 s to t = 10 s, every 1 s), and I repeat this experiment many times.

    Now, let's say I make a plot of data vs. time, and I want to find when the data peaks in time on average.

    Which approach gives a more accurate estimate of the peak time: averaging the individual data points across all runs and then finding the peak time of the averaged curve, or finding the peak time within each run and then averaging those peak times over all runs?

    If anyone can nudge me in the right direction, it would be appreciated.

    Thanks.

    EDIT: I just read the sticky in this section. This isn't a homework question; I'm just wondering which approach works better. But if it's better suited to the homework section, by all means move it there.
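
    The two estimators in the question can be compared with a small simulation sketch. Everything here is an assumption for illustration, not from the thread: a Gaussian-shaped pulse peaking at t = 5 s, Gaussian measurement noise of standard deviation 0.3, and 200 repeated runs.

    ```python
    import math
    import random
    import statistics

    def signal(t, t_peak=5.0, width=2.0):
        # Hypothetical smooth pulse peaking at t_peak (an assumption for this sketch).
        return math.exp(-((t - t_peak) / width) ** 2)

    random.seed(0)
    times = [float(i) for i in range(11)]  # t = 0 s .. 10 s, every 1 s
    runs = [[signal(t) + random.gauss(0.0, 0.3) for t in times] for _ in range(200)]

    # (a) Average the data at each time point across runs, then locate the peak
    #     of the averaged curve.
    mean_curve = [statistics.mean(run[i] for run in runs) for i in range(len(times))]
    peak_of_mean = times[max(range(len(times)), key=lambda i: mean_curve[i])]

    # (b) Locate the peak time within each individual run, then average those
    #     peak times across runs.
    peaks = [times[max(range(len(times)), key=lambda i: run[i])] for run in runs]
    mean_of_peaks = statistics.mean(peaks)

    print("peak of averaged curve:", peak_of_mean)
    print("average of per-run peaks:", mean_of_peaks)
    ```

    Under these assumed conditions, averaging first suppresses the point-by-point noise (by roughly the square root of the number of runs) before the peak is located, whereas per-run peak times are scattered by the full single-run noise. Both give answers near the true peak for a symmetric pulse, but the scatter behaves differently; a simulation like this, adapted to the real signal shape and noise, is one way to check which estimator is tighter for a given experiment.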
     