Solving Uncertainty in Data Analysis with Spectrophotometry

  • Thread starter: LCSphysicist
  • Tags: Data, Uncertainty

Homework Help Overview

The discussion revolves around the challenges of estimating uncertainty in data analysis using spectrophotometry. The original poster is attempting to fit a function to a dataset of approximately 2000 pairs of x and y values but lacks information on the uncertainties associated with these data points.

Discussion Character

  • Exploratory, Assumption checking, Problem interpretation

Approaches and Questions Raised

  • The original poster considers whether it is permissible to adjust the uncertainty values to achieve a good fit based on the chi-squared to degrees of freedom ratio. They inquire about statistical methods for estimating uncertainty in the absence of provided data.
  • Some participants question whether the original poster is seeking a function without theoretical justification or if they already have a function and are merely estimating its parameters.
  • There is mention of techniques for fitting data that require justification for adding parameters, suggesting a need for a theoretical basis.
  • One participant proposes a method for estimating variance based on the assumption of normally distributed data points.

Discussion Status

The discussion is ongoing, with participants exploring various interpretations of the problem and the validity of the original poster's approach to estimating uncertainty. Some guidance has been offered regarding statistical methods, but no consensus has been reached on the appropriateness of the proposed techniques.

Contextual Notes

The original poster is constrained by the lack of uncertainty information in their dataset, which complicates their ability to perform statistical comparisons. They are seeking alternative methods to estimate uncertainty without clear guidelines from the data provided.

LCSphysicist
So I have a folder with a lot of data. Basically, I have approximately 2000 pairs of x and y values, and I need to find the function that describes the behavior of these data. Of course, I can use software to fit the curve using ordinary least squares (OLS).

The problem is that I don't have the uncertainties! The data were obtained by spectrophotometry, but the report does not state which instrument model was used or any other details.

So I was asking myself: instead of guessing the uncertainty, is it allowed to adjust it until I get the best agreement between χ² and the number of degrees of freedom? I mean, since I don't know the uncertainty, am I free to choose it using the χ²/dof ≈ 1 criterion?

If not, is there a way to at least estimate the uncertainty statistically?
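As a concrete sketch of the χ²/dof ≈ 1 idea: with a common (unknown) per-point sigma, χ² = RSS/σ², so choosing σ = √(RSS/dof) makes the reduced chi-squared exactly 1 by construction. The straight-line model and toy data below are purely illustrative, not from the thread.

```python
# Sketch: choosing a common per-point sigma so that chi^2 / dof = 1.
# Hypothetical data and a straight-line model y = a*x + b (illustration only).
import math

# toy data, roughly y = 2x + 1 with some scatter
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1, 10.9]

# ordinary least-squares fit of a straight line
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

# residual sum of squares and degrees of freedom (n points, 2 parameters)
rss = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
dof = n - 2

# chi^2 = rss / sigma^2, so sigma = sqrt(rss / dof) forces chi^2 / dof = 1
sigma = math.sqrt(rss / dof)
chi2_red = (rss / sigma**2) / dof
print(f"a={a:.3f}, b={b:.3f}, sigma={sigma:.3f}, chi2/dof={chi2_red:.3f}")
```

This makes explicit why the trick cannot also serve as a goodness-of-fit test: the reduced chi-squared equals 1 whatever the data look like.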
 
Are you saying you are looking for a continuous function to fit the data without any theoretical basis for what the function should look like? There are techniques for that. They work by having to justify each addition of an arbitrary parameter by beating a threshold for the improvement in the fit.

Or do you already know the form of the function and are just trying to estimate its parameters?
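One common form of the "justify each added parameter" technique mentioned above is a partial F-test on nested fits: an extra parameter is kept only if the F statistic for the fit improvement beats a critical value. The toy quadratic data, polynomial models, and ≈4.1 threshold (the 5% critical value of F(1, 35)) below are illustrative assumptions, not details from the thread.

```python
# Sketch of a partial F-test for deciding whether an added parameter is justified.
# Toy data (roughly quadratic); degrees and the 5% threshold are illustrative.
import numpy as np

rng = np.random.default_rng(0)
xs = np.linspace(0.0, 9.0, 40)
ys = 1.0 + 0.5 * xs + 0.8 * xs**2 + rng.normal(0.0, 2.0, xs.size)

def rss(deg):
    """Residual sum of squares of a least-squares polynomial fit of given degree."""
    coeffs = np.polyfit(xs, ys, deg)
    return float(np.sum((ys - np.polyval(coeffs, xs)) ** 2))

n = xs.size
for deg in range(1, 5):
    # partial F statistic for going from degree deg-1 to deg (one extra parameter)
    f = (rss(deg - 1) - rss(deg)) / (rss(deg) / (n - (deg + 1)))
    print(f"degree {deg}: F = {f:.1f}, accept extra parameter: {f > 4.1}")
```

With data like these the quadratic term is accepted decisively, while higher-order terms typically fail the threshold.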
 
haruspex said:
Are you saying you are looking for a continuous function to fit the data without any theoretical basis for what the function should look like? There are techniques for that. They work by having to justify each addition of an arbitrary parameter by beating a threshold for the improvement in the fit.

Or do you already know the form of the function and are just trying to estimate its parameters?
I already have the function, and I already have the data. The problem is the uncertainty: the data I was given come with no information about it.

The fitting software will give me the numerical values of the parameters of the function I have. But I will need to compare these parameters with accepted reference values and see whether they are compatible.

The problem is that the uncertainty the software reports for each parameter is meaningless, because I don't even have the uncertainties for the x and y data initially given to me.

So, in order to make a good comparison using a t-test or z-test, I first need to be able to estimate the uncertainty of the points.

Now, since I have no information about it, I am wondering whether there is a statistical way to estimate the uncertainty of the data.

The conclusion I reached was to guess it until the ratio of χ² to the degrees of freedom is approximately one.

The other problem is that the χ²/dof ratio is used to judge whether the fit is good, and what I am doing amounts to forcing the fit to be good.

So I would like to know whether there is another way to estimate the uncertainty of a large set of numbers.

Or, if there isn't, I would like to know whether what I did with χ²/dof is allowed.
 
Not sure it's valid, but if you suppose the data points are normally distributed about the curve, all with the same variance ##\sigma^2##, then the likelihood of the data is maximised by ##\sigma^2=\frac 1n\Sigma_{i=1}^n(y_i-y(x_i))^2##.
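A minimal sketch of the estimator above, assuming the fitted function is already known. The toy data and the model y(x) = x are hypothetical, chosen only so the maximum-likelihood formula ##\sigma^2=\frac 1n\Sigma_{i=1}^n(y_i-y(x_i))^2## can be evaluated directly.

```python
# Sketch: MLE of a common variance, assuming normally distributed residuals
# about a known fitted curve. Data and model are hypothetical.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.9, 4.1]

def model(x):
    # assumed known fitted function; here simply y(x) = x for illustration
    return x

# sigma^2 = (1/n) * sum of squared residuals
n = len(xs)
sigma2 = sum((y - model(x)) ** 2 for x, y in zip(xs, ys)) / n
print(sigma2)
```

Note this is the maximum-likelihood estimate; dividing by n − p (p = number of fitted parameters) instead of n gives the usual unbiased variant.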
 