Solving Uncertainty in Data Analysis with Spectrophotometry

AI Thread Summary
The discussion revolves around the challenge of estimating uncertainty in data obtained from spectrophotometry, where the user has a dataset of approximately 2,000 x and y values but lacks information on their uncertainties. The user considers adjusting uncertainty values to achieve a chi-squared to degrees of freedom ratio close to one, questioning the validity of this approach. They seek statistical methods to estimate uncertainty without initial data, emphasizing the need for compatible comparisons of fitted parameters with real values. The conversation highlights the importance of justifying any adjustments made to the model and the potential for using statistical techniques to estimate uncertainties based on the assumption of normally distributed data. Ultimately, the user is looking for a reliable method to derive uncertainty estimates to ensure valid statistical comparisons.
LCSphysicist
So I have a folder with a lot of data. Basically, what I have is approximately 2,000 (x, y) pairs, and I need to find the function that describes the behavior of these data. Of course, I can use software to fit the curve by ordinary least squares (OLS).

The problem is that I don't have the uncertainty! This data was obtained by spectrophotometry, but the report does not say which instrument model was used, or give any other details.

So I was asking myself: instead of guessing the uncertainty, is it allowed to vary it, seeking the best agreement between the value of χ² and the number of degrees of freedom? I mean, since I don't know the uncertainty, am I free to choose it using the χ²/dof ≈ 1 criterion?

If not, is there at least a statistical way to estimate the uncertainty?
 
Are you saying you are looking for a continuous function to fit the data without any theoretical basis for what the function should look like? There are techniques for that. They work by requiring each addition of an arbitrary parameter to be justified by beating a threshold for the improvement in the fit.

Or do you already know the form of the function and are just trying to estimate its parameters?
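For the first case, one common version of such a technique is a nested-model F-test: an extra parameter is kept only if the reduction in residual sum of squares beats an F threshold. A minimal sketch in Python (the polynomial models, synthetic linear data, and 0.05 significance level are illustrative assumptions, not from this thread):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = 1.0 + 2.0 * x + rng.normal(0, 0.1, x.size)  # truly linear synthetic data

def rss_poly(deg):
    """Residual sum of squares of a least-squares polynomial fit of given degree."""
    coeffs = np.polyfit(x, y, deg)
    return np.sum((y - np.polyval(coeffs, x)) ** 2)

n = x.size
for deg in range(1, 4):
    p1, p2 = deg, deg + 1            # parameter counts of the nested models
    rss1, rss2 = rss_poly(deg - 1), rss_poly(deg)
    F = ((rss1 - rss2) / (p2 - p1)) / (rss2 / (n - p2))
    p_value = stats.f.sf(F, p2 - p1, n - p2)
    keep = p_value < 0.05            # keep the extra term only if it clearly helps
    print(f"degree {deg}: F = {F:.2f}, p = {p_value:.3g}, keep = {keep}")
```

On data like this, the step from a constant to a linear model passes the threshold easily, while further terms typically do not, so the procedure stops adding parameters.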
 
haruspex said:
Are you saying you are looking for a continuous function to fit the data without any theoretical basis for what the function should look like? There are techniques for that. They work by requiring each addition of an arbitrary parameter to be justified by beating a threshold for the improvement in the fit.

Or do you already know the form of the function and are just trying to estimate its parameters?
I already have the function and the data. The problem is the uncertainty: the data given to me come with no information about it.

The fitting software will give me numerical values for the parameters of the function. BUT I will need to compare these parameters with reference values and see whether they are compatible.

The problem IS, the uncertainty the software reports for each parameter is meaningless, because I don't even have the uncertainties of the x and y data initially given to me.

SO, in order to make a good comparison using a t-test or z-test, I first need to be able to estimate the uncertainty of the points.

Now, since I have no information about it, I am wondering whether there is a statistical way to "estimate" the uncertainty of the data.

The conclusion I reached was to guess it, adjusting until the ratio of χ² to degrees of freedom is approximately one.

The other problem is, we use the χ²-to-degrees-of-freedom ratio to judge whether a fit is good, and what I am doing amounts to "forcing" the fit to be good.

So, I would like to know whether there is another way to estimate the uncertainty of a large set of numbers.

OR, if there isn't, I would like to know whether what I did with χ²/dof is "allowed"?
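For what it's worth, with a single common uncertainty ##\sigma## for all points, tuning it until ##\chi^2/\nu = 1## has a closed form: since ##\chi^2 = \Sigma r_i^2/\sigma^2##, setting ##\chi^2/\nu = 1## gives ##\sigma^2 = \Sigma r_i^2/(n-p)##. This is also exactly the rescaling SciPy's `curve_fit` applies to the parameter covariance by default (`absolute_sigma=False`). A sketch with synthetic data (the linear model and noise level are placeholders, not the actual spectrophotometry curve):

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder model; substitute the actual function for the data.
def model(x, a, b):
    return a * x + b

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 2000)
y = model(x, 1.5, 0.7) + rng.normal(0, 0.2, x.size)  # synthetic data

popt, pcov = curve_fit(model, x, y)          # no uncertainties supplied
n, p = x.size, len(popt)
residuals = y - model(x, *popt)

# Choose sigma so that chi^2 / dof = 1:
sigma = np.sqrt(np.sum(residuals**2) / (n - p))
chi2 = np.sum((residuals / sigma) ** 2)
print(chi2 / (n - p))                        # equals 1 by construction

# Feeding that sigma back in with absolute_sigma=True reproduces the
# default pcov: the chi^2/dof = 1 trick is what curve_fit already does
# when no uncertainties are given.
popt2, pcov2 = curve_fit(model, x, y, sigma=np.full(n, sigma),
                         absolute_sigma=True)
```

So the trick is at least internally consistent, but as noted above it forfeits χ²/dof as an independent goodness-of-fit check.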
 
Not sure it's valid, but if you suppose the data points are normally distributed about the curve, all with the same variance ##\sigma^2##, then the likelihood of the data is maximised by ##\sigma^2=\frac 1n\Sigma_{i=1}^n(y_i-y(x_i))^2##.
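A minimal numerical sketch of that estimator (Python with NumPy/SciPy; the exponential model and noise level are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative model; replace with the actual curve for the data.
def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 2000)
true_sigma = 0.05
y = model(x, 2.0, 0.3) + rng.normal(0, true_sigma, x.size)  # synthetic data

popt, _ = curve_fit(model, x, y, p0=(1.0, 0.1))

# Maximum-likelihood estimate of a common variance about the fitted curve:
#   sigma^2 = (1/n) * sum_i (y_i - y(x_i))^2
residuals = y - model(x, *popt)
sigma_hat = np.sqrt(np.mean(residuals ** 2))
print(sigma_hat)  # close to true_sigma for n = 2000
```

With ~2,000 points the estimate is tight, and the resulting σ can then be fed back into the fit as per-point uncertainties so the software's parameter errors become meaningful.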
 