Optimal Approach for Analyzing Non-Smooth Experimental Data

In summary, the original poster needs help analyzing an experimental data set that is not smooth. They tried fitting it with a polynomial and smoothing it with a spline, but the different methods gave very different results. They are asking for the best and most credible way to analyze the data. Replies note that, since multiple methods have already been tried, it may be difficult to draw any credible conclusions.
  • #1
t.m.p.c
Hi,

I need your help. From my experiments I obtained a data set that I need to analyze. The problem is that the data is not smooth. I tried fitting it with a polynomial, but the fit was not good enough. I also tried smoothing, splines... but got very different final results. Can anyone tell me the best (most credible) way to analyze my data?
Thanks in advance.

Regards,

tmpc
 
  • #2
Question #1: Does a theoretical analysis of the data source suggest how the results should be distributed?

Question #2: How are you judging whether or not the fit is good enough?

Question #3: Do you have all of the variables accounted for?

Question #4: Have you tried anything exponential or logarithmic?
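For Question #4, an exponential model can be tried quickly with SciPy. This is only an illustrative sketch on synthetic data; the model `exp_model` and its parameters are assumptions for the example, not something from the thread:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data roughly following y = a * exp(b * x), plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 40)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0, 0.1, x.size)

def exp_model(x, a, b):
    return a * np.exp(b * x)

# Least-squares fit of the exponential model; p0 is the initial guess.
popt, pcov = curve_fit(exp_model, x, y, p0=(1.0, 1.0))
a_fit, b_fit = popt
```

A logarithmic model (`a * np.log(x) + b`) can be tried the same way by swapping the model function.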


Some bad news: at this point, since you've been trying lots of things, it might not be possible to credibly analyze your data*. Finding a fit after a dozen false starts is far less significant than finding a fit on the first try. The best you can do may be to settle on something that fits your current data, and then run a new experiment to (in)validate it.

Maybe you can do some tricks to salvage this data set, like using 75% of the data to find a fit and the remaining 25% to (in)validate it... but I'm not the person who can judge such things.
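That hold-out idea can be sketched with synthetic data (the quadratic model, noise level, and split sizes here are invented for illustration):

```python
import numpy as np

# Synthetic "experimental" data: a quadratic trend plus noise.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 100)
y = 3 * x**2 - x + rng.normal(0, 0.05, x.size)

# Randomly split: 75 points to fit, 25 held out for validation.
idx = rng.permutation(x.size)
train, test = idx[:75], idx[75:]

# Fit only on the training subset...
coeffs = np.polyfit(x[train], y[train], deg=2)

# ...and judge the fit on points it never saw.
pred = np.polyval(coeffs, x[test])
rmse = np.sqrt(np.mean((pred - y[test]) ** 2))
```

If the held-out RMSE is comparable to the noise level in the measurements, the fit generalizes; if it is much larger, the model is probably overfitting the training points.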



*: More accurately, it might not be possible to draw any credible conclusions. The analysis "we tried X, Y, and Z to analyze the data, without success" is definitely credible and accurate. It is important to know things like "these variables aren't linearly related", although it's not a flashy result.
 
  • #3
Hi tmpc,

Thank you for reaching out for help with your experimental data analysis. Analyzing data that is not smooth can be challenging, and it is important to use a credible method to ensure accurate results.

One option you could consider is a non-parametric smoothing method, such as a moving average or a LOWESS smoother. These methods do not assume a specific functional form for your data and can be effective for non-smooth data.
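As a minimal sketch of the moving-average idea (the window size and the synthetic sine-plus-noise signal are assumptions for the example):

```python
import numpy as np

# Simple centered moving average: one non-parametric smoother.
def moving_average(y, window):
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

# Synthetic non-smooth data: a sine wave buried in noise.
rng = np.random.default_rng(2)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + rng.normal(0, 0.3, x.size)

smooth = moving_average(y, window=11)
```

Note that `mode="same"` distorts the first and last few points (the window hangs off the edge there), so edge values should be treated with caution.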

Another approach is to use robust regression techniques, which are designed to handle outliers and non-smooth data. These methods can help improve the accuracy of your results compared to traditional methods like polynomial fitting.
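As an illustrative sketch of robust regression, here is a straight-line fit using SciPy's soft-L1 loss on synthetic data with deliberate outliers; the model, noise level, and `f_scale` choice are assumptions for the example:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic line y = 2x + 1 with noise, plus a few large outliers.
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.2, x.size)
y[::10] += 15  # inject outliers at every tenth point

def residuals(params, x, y):
    slope, intercept = params
    return slope * x + intercept - y

# soft_l1 loss grows linearly for large residuals, so outliers
# pull on the fit far less than with plain least squares.
fit = least_squares(residuals, x0=(1.0, 0.0), loss="soft_l1",
                    f_scale=1.0, args=(x, y))
slope, intercept = fit.x
```

With an ordinary least-squares fit the same outliers would noticeably bias the intercept; the robust loss recovers parameters close to the true ones.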

Ultimately, the best method for analyzing your data will depend on the specific characteristics of your data and the goals of your analysis. I would recommend consulting with a statistician or data analyst who can help you select the most appropriate method for your specific data set.

I hope this helps and good luck with your data analysis!

 

Related to Optimal Approach for Analyzing Non-Smooth Experimental Data

1. What is experimental data analysis?

Experimental data analysis is the process of using statistical methods and techniques to interpret and draw conclusions from data collected through experiments in a scientific study.

2. Why is experimental data analysis important?

Experimental data analysis is important because it allows scientists to make sense of the data they have collected and determine whether their hypotheses are supported or not. It also helps to identify any patterns or trends in the data, leading to new insights and discoveries.

3. What are the steps involved in experimental data analysis?

The steps involved in experimental data analysis include data cleaning, data exploration, data visualization, hypothesis testing, and drawing conclusions. Data cleaning involves removing any errors or outliers from the data, while data exploration involves looking for patterns and trends in the data. Data visualization is used to present the data in a visual format, making it easier to interpret. Hypothesis testing involves using statistical tests to determine if there is a significant relationship between variables. Lastly, drawing conclusions involves interpreting the results and making inferences based on the data.
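The data-cleaning step above can be sketched with a simple 3-sigma outlier cut (the threshold and the synthetic data are illustrative assumptions; real cleaning criteria should come from knowledge of the experiment):

```python
import numpy as np

# Synthetic measurements plus two obvious recording errors.
rng = np.random.default_rng(4)
data = rng.normal(10, 1, 500)
data = np.append(data, [50.0, -40.0])

# Keep only points within 3 standard deviations of the mean.
z = (data - data.mean()) / data.std()
cleaned = data[np.abs(z) < 3]
```

The gross errors are removed while the genuine measurements survive; a second pass (recomputing mean and standard deviation on the cleaned data) is often used, since outliers inflate the first estimate of the spread.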

4. What are some common statistical methods used in experimental data analysis?

Some common statistical methods used in experimental data analysis include t-tests, ANOVA, regression analysis, and chi-square tests. T-tests are used to compare means between two groups, while ANOVA is used to compare means between multiple groups. Regression analysis is used to determine the relationship between two or more variables, while chi-square tests are used to analyze categorical data.
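As a minimal example of one of these methods, a two-sample t-test with SciPy on synthetic groups (the group means and sizes are invented for illustration):

```python
import numpy as np
from scipy import stats

# Two synthetic groups with different true means.
rng = np.random.default_rng(5)
group_a = rng.normal(5.0, 1.0, 100)
group_b = rng.normal(6.0, 1.0, 100)

# Independent two-sample t-test: is the difference in means significant?
t_stat, p_value = stats.ttest_ind(group_a, group_b)
```

A small p-value (conventionally below 0.05) indicates the difference in group means is unlikely to be due to chance alone.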

5. How do you ensure the validity and reliability of experimental data analysis?

To ensure the validity and reliability of experimental data analysis, it is important to have a well-designed and controlled experiment. This includes clearly defining the research question, having a large and representative sample size, and minimizing any confounding variables. It is also important to use appropriate statistical methods and to conduct a thorough analysis of the data. Furthermore, peer review and replication of the study by other researchers can help to validate the findings.
