How do I normalise my data to a maximum of 100?

AI Thread Summary
To normalize data counts at each angle to a maximum of 100, it's essential first to understand the data's underlying distribution, which in this case fits a Gaussian curve. The discussion highlights the importance of accurately determining the angle and the measurement duration, as discrepancies can affect normalization. The user aims to compare two datasets measured over different timeframes, which requires ensuring they are comparable by adjusting the counts per second accordingly. Error bars and statistical significance are also crucial, as they can influence the interpretation of the results. Overall, careful consideration of the normalization method and the data's characteristics is necessary for accurate analysis.
says
I have this set of data (in the attached image) and I'm trying to normalize the counts at each angle to a maximum of 100. I'm not sure how to do it though. Any tips would be much appreciated.

I have the mean and std dev for each count at each angle. I don't really know what to do from here though...
 

Attachments

  • data.jpg
You should have a very good reason to divide all observations by 1451 (which does what you say you want to do). Such an action mixes all 33 observations up with one single result (which has an error of its own and, on top of that, appears to deviate from the pattern -- an outlier candidate!).
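In code terms, that operation would look something like the sketch below (Python/NumPy, with made-up numbers standing in for the actual counts; 1451 plays the role of the largest observation):

```python
import numpy as np

# Hypothetical counts -- illustrative values only, not the thread's data
counts = np.array([1314.0, 1368.0, 1420.0, 1451.0, 1390.0])

# "Normalize to a maximum of 100": divide everything by the largest
# observation and multiply by 100
normalized = counts / counts.max() * 100.0
print(normalized)  # the 1451 entry becomes exactly 100
```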

[Attached plot: upload_2017-5-7_11-13-43.png]


Could you tell us a little more?
 
The data represents the number of counts at each angle. So at 175 degrees we measured counts of 1314, 1368, and 1420.

I've taken the mean of the counts and then plotted the angle vs. mean counts. I want to normalize this to a maximum of 100, though.
 

Attachments

  • graph.jpg
Aha, so there is a bit more data, and what you are trying to do is fit a peak in some spectrum?
Could you explain why you need the maximum of 100?
More questions: how accurate is your angle determination?

By averaging you effectively add up the individual measurements. The left figure is the average, with error bars from (only) three measurements; the right figure is the sum, with error bars from Poisson statistics (##\sigma=\sqrt{N}##).
[Attached plots: left -- average with standard-deviation error bars; right -- sum with Poisson error bars]


The red error bars are not very meaningful: the relative error in a standard deviation from N = 3 is huge (theoretically ##1/\sqrt{N-1}##, but that is for ##N \gg 1##).
The blue error bars are simply ##\sqrt{N}##, where now N is the total number of counts, so they are all about 64. A linear relationship easily catches 10 out of 11 measurements -- but that was before you showed the wider range of angles. So: forget the linear trend line (you 'wrong-footed' us by only showing a small part of the data :wink:).
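For concreteness, a rough sketch of the two error-bar choices (Python/NumPy; the array contents are invented, with the three repeated counts per angle in the columns):

```python
import numpy as np

# Hypothetical raw data: one row per angle, three repeated counts per row
counts = np.array([
    [1314, 1368, 1420],
    [1451, 1402, 1377],
], dtype=float)

# "Left figure" style: average of the three runs,
# error bar = sample standard deviation (very noisy estimate when N = 3)
mean = counts.mean(axis=1)
std = counts.std(axis=1, ddof=1)

# "Right figure" style: total counts, Poisson error bar sigma = sqrt(N)
total = counts.sum(axis=1)
poisson_err = np.sqrt(total)   # roughly 64 for totals around 4000

print(mean, std)
print(total, poisson_err)
```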

Now a core question: do you have a 'model' line shape? A theoretical expectation like a Gaussian or a Lorentzian?
 

Sorry, I didn't realize I'd cropped my image of the data incorrectly. I've attached the rest of it in the image below. The data fits a Gaussian curve.

One of my count rates was measured over 10 seconds. The other was measured in counts per second. If I can normalise both to a maximum of 100, I can compare and contrast both results.
 

Attachments

  • data.jpg
says said:
The data fits a Gaussian curve
Don't forget the background ... So you have four parameters to adjust: ##\bar x,\ \sigma,\ N## and ##C## (peak height and background).
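As a sketch of what adjusting those four parameters could look like in practice, here is a hedged example using scipy.optimize.curve_fit; the function, starting values, and data below are illustrative assumptions, not taken from this thread:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_plus_background(x, x0, sigma, peak_height, background):
    """Gaussian peak on a flat background -- the four free parameters."""
    return peak_height * np.exp(-(x - x0) ** 2 / (2.0 * sigma ** 2)) + background

# Illustrative data only -- replace with the measured angles and summed counts
angles = np.linspace(155.0, 195.0, 21)
true_counts = gaussian_plus_background(angles, 175.0, 8.0, 4000.0, 300.0)
counts = np.random.poisson(true_counts).astype(float)   # fake Poisson scatter

popt, pcov = curve_fit(
    gaussian_plus_background, angles, counts,
    p0=[175.0, 10.0, 3000.0, 200.0],             # rough starting guesses
    sigma=np.sqrt(counts), absolute_sigma=True,  # Poisson error bars
)
x0, sigma, peak_height, background = popt
```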
says said:
One of my count rates was for 10 seconds. The other was measured in counts per second
Counts 1, 2 and 3 seem to be alike; are they the 10-second period counts? And how, precisely, was counts per second measured (if also over 10 seconds, no problem :smile:)?
 
Yes, counts 1, 2, and 3 were all the 10-second period counts.

The counts-per-second data isn't shown here. I was just hoping to normalize the data I have and then do the same with the other dataset.
 
Don't understand. Counts per second should be counts per 10 seconds divided by 10, so doesn't that make them comparable?

Otherwise, you are simply left with an extra unknown parameter: some normalization ratio.
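If the two datasets really are related that simply, a small illustrative sketch (Python/NumPy, hypothetical values) of putting them in the same units before any rescaling to a maximum of 100:

```python
import numpy as np

# Dataset A: total counts accumulated over 10 s at each angle (made-up values)
counts_10s = np.array([1314.0, 1451.0, 1390.0])
rate_a = counts_10s / 10.0   # convert to counts per second

# Dataset B: already recorded in counts per second (made-up values)
rate_b = np.array([128.0, 147.0, 135.0])

# Both are now in the same units; rescaling each to a maximum of 100 is optional
norm_a = rate_a / rate_a.max() * 100.0
norm_b = rate_b / rate_b.max() * 100.0
```

Any remaining overall factor between the two datasets would then show up as exactly that extra normalization ratio.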
 