Undergrad: How do I normalise my data to a maximum of 100?

  • Thread starter: says
  • Tags: Data, Maximum
SUMMARY

The discussion centers on normalizing a dataset of counts measured at various angles to a maximum of 100. The user has calculated the mean and standard deviation for the counts but is uncertain about the normalization process. Key insights include the importance of avoiding the division of all observations by a single value, as this can distort the dataset, and the necessity of understanding the underlying model, such as a Gaussian curve, for accurate normalization. The conversation highlights the need for clarity on measurement periods and the relationship between counts per second and counts over a specified duration.

PREREQUISITES
  • Understanding of Gaussian curve fitting
  • Knowledge of statistical measures such as mean and standard deviation
  • Familiarity with data normalization techniques
  • Basic concepts of error analysis in measurements
NEXT STEPS
  • Learn about Gaussian curve fitting techniques in data analysis
  • Research normalization methods for datasets with varying measurement periods
  • Explore error analysis and its impact on data interpretation
  • Study the implications of using different statistical measures for data representation
USEFUL FOR

Data analysts, researchers in experimental physics, and anyone involved in statistical data normalization and analysis will benefit from this discussion.

says:
I have this set of data (in the attached image) and I'm trying to normalize the counts at each angle to a maximum of 100. I'm not sure how to do it though. Any tips would be much appreciated.

I have the mean and std dev for each count at each angle. I don't really know what to do from here though...
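
For reference, a minimal sketch of what "scale to a maximum of 100" usually amounts to in practice, assuming the mean counts are already in a NumPy array; the numbers below are placeholders, not the values from the attachment. (Note the caveat in the reply below about the uncertainty of the value you divide by.)

Code (Python):
import numpy as np

# Placeholder mean counts per angle -- not the actual data from the attachment
mean_counts = np.array([1367.3, 1402.0, 1451.0, 1398.5, 1310.2])

# Scale so the largest mean count becomes exactly 100
normalized = mean_counts / mean_counts.max() * 100.0

print(normalized)   # the largest entry prints as 100.0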
 

Attachments: data.jpg (23.9 KB)
You should have a very good reason to divide all observations by 1451 (which does what you say you want to do). Doing that ties all 33 observations to one single result -- a result that has an error of its own and, on top of that, appears to deviate from the pattern (an outlier candidate !).
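
To put a number on that warning: once every point is divided by one measured value, the relative error of that divisor enters every normalized point. A rough sketch using the standard quadrature rule for a ratio of independent quantities; the counts below are placeholders and the ##\sqrt{N}## errors are an assumption.

Code (Python):
import numpy as np

# Placeholder values: one observation and the single maximum used as divisor
y, dy = 1314.0, np.sqrt(1314.0)   # assume a Poisson (sqrt N) error on the observation
m, dm = 1451.0, np.sqrt(1451.0)   # the maximum carries an error of its own

norm = y / m * 100.0
# For a ratio of independent quantities the relative errors add in quadrature
dnorm = norm * np.sqrt((dy / y) ** 2 + (dm / m) ** 2)

print(f"{norm:.1f} +/- {dnorm:.1f}")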

[attached plot: upload_2017-5-7_11-13-43.png]


Could you tell us a little more ?
 
The data represents the number of counts at each angle. So at 175 degrees we measured counts of 1314, 1368, and 1420.

I've taken the mean of the counts and then plotted the angle vs. mean counts. I want to normalize this to a maximum of 100, though.
 

Attachments: graph.jpg (13.6 KB)
Aha, there's a bit more data and what you are trying to do is fit a peak in some spectrum ?
Could you explain why you need the 100 maximum ?
More questions: how accurate is your angle determination ?

By averaging you effectively add up the individual measurements. Left figure is average with error bars from (only) three measurements. Right figure is sum with error bars from Poisson statistics (##\sigma=\sqrt{N}##).
[attached plots: upload_2017-5-7_11-45-34.png, upload_2017-5-7_11-45-57.png]


The red error bars are not very meaningful: the relative error in a standard deviation from N=3 is huge (theoretically ##1/\sqrt{N-1\ }## but that is for N >> 1).
The blue error bars are simply ##\sqrt N## where now N is the number of counts, so all are 64. A linear relationship easily catches 10 out of 11 measurements -- but that was before you showed a wider range of angles. So: forget the linear trend line (you 'wrong-footed' us by only showing a small part of the data :wink:).
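
A sketch of the two error estimates being contrasted here, using the three 175-degree counts quoted earlier. Whether the red bar in the plot is the sample standard deviation or the standard error of the mean is not stated in the post; the sketch uses the former, and either way it is poorly determined for N = 3.

Code (Python):
import numpy as np

# Three repeat counts at one angle (the 175-degree values quoted earlier)
counts = np.array([1314.0, 1368.0, 1420.0])

# "Red" error bar: spread estimated from only three measurements (poorly determined)
mean = counts.mean()
err_mean = counts.std(ddof=1)      # sample standard deviation of N = 3 values

# "Blue" error bar: add the counts and use Poisson statistics, sigma = sqrt(N)
total = counts.sum()
err_total = np.sqrt(total)         # about 64 for roughly 4100 total counts

print(mean, err_mean, total, err_total)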

Now a core question: do you have a 'model' line shape ? A theoretical expectation like a gaussian or a lorentzian ?
 

Attachments: upload_2017-5-7_11-45-44.png (4 KB)
Sorry, I didn't realize I'd cropped my image of the data incorrectly. I've attached the rest of it to the image below. The data fits a gaussian curve.

One of my count rates was measured over 10 seconds. The other was measured in counts per second. If I can normalise both to a maximum of 100 I can compare and contrast both results.
 

Attachments: data.jpg (37.1 KB)
says said:
The data fits a gaussian curve
Don't forget the background ... So you have four parameters to adjust: ##\bar x,\ \sigma,\ N## and ##C## (peak height and background).
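
A minimal curve-fitting sketch of that four-parameter model (a Gaussian of height N on a flat background C), assuming scipy is available; the angle and count arrays below are invented placeholders, not the thread's data.

Code (Python):
import numpy as np
from scipy.optimize import curve_fit

def peak(x, x0, sigma, N, C):
    """Gaussian of height N, centre x0, width sigma, on a flat background C."""
    return N * np.exp(-0.5 * ((x - x0) / sigma) ** 2) + C

# Placeholder data: angle in degrees and summed counts over the three runs
angles = np.array([150., 155., 160., 165., 170., 175., 180., 185., 190., 195., 200.])
counts = np.array([310., 330., 420., 700., 1150., 1420., 1180., 720., 430., 340., 320.])

p0 = [175.0, 10.0, 1100.0, 300.0]      # starting guesses for x0, sigma, N, C
popt, pcov = curve_fit(peak, angles, counts, p0=p0,
                       sigma=np.sqrt(counts), absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))
print(popt)   # fitted x0, sigma, N, C
print(perr)   # their one-sigma uncertainties

With the peak height N fitted explicitly, rescaling the fitted curve so its maximum is 100 is a single division afterwards.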
says said:
One of my count rates was for 10 seconds. The other was measured in counts per second
Counts 1, 2 and 3 seem to be alike; are they the 10-second period counts ? Counts per second was measured how, precisely ? (If also over 10 seconds, no problem :smile:)
 
Yes, counts 1, 2, and 3 were all the 10 second period counts.

The counts per second data is not present. I was just hoping to normalize the data I have and then do it with the other data.
 
Don't understand. Counts per second should be counts per 10 seconds divided by 10, so doesn't that make them comparable?

Otherwise, you are simply left with an extra unknown parameter: some normalization ratio.
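
For reference, the conversion being suggested is just a factor of ten; a sketch with invented arrays standing in for the two runs. If the units really cannot be reconciled, the alternative above is to let the ratio between the two runs float as one more fit parameter.

Code (Python):
import numpy as np

# Placeholder data: one run recorded as counts per 10 s, the other as counts per second
counts_per_10s = np.array([1314.0, 1368.0, 1420.0])
counts_per_s   = np.array([128.0, 140.0, 145.0])

# Put both runs in the same units (counts per second) before any further comparison
run_a = counts_per_10s / 10.0
run_b = counts_per_s

print(run_a)
print(run_b)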
 
