Smoothing algorithm for time series and slope measurement

In summary, the original poster is running an experiment that produces frequent data points and is looking for a smoothing method that follows the overall trends while also capturing small movements, considering splines or lowess for the job. They also need to measure the slope of the smoothed line at any point in time. The replies question whether the small ups and downs are meaningful, suggest fitting two LOESS models with different smoothing parameters, and raise the possibility of using Minimum Description Length to select a polynomial model for the data.
  • #1
nickmath
Hi,

I'm doing an experiment that gathers one data point approximately each second. I have plotted all the data points on a graph, and the graph has many tops and bottoms. Because there are so many data points, I can visually tell that there are several overall trends: in one period my data goes up, in the next it stays level for a while, and in the next it can go down again. Within each of these periods the data can itself go up, down, or stay level, even though the overall direction of the period is up, down, or level. I have searched for smoothing methods that can follow all these small movements but at the same time (using a second smoothing algorithm/parameter) follow the overall movement. I was thinking about using splines or lowess. Are splines or lowess able to do this?

The second problem I'm having is that, after applying a smoothing algorithm, I need to be able to measure the slope of the smoothed line. I need to know what the slope/curvature of that line is at any given moment in time. Any pointers you can give me are really welcome. Thanks.

Kind regards,

Nick
 
  • #2
Foremost: I am an undergraduate CS student, so my advice should be taken with a skeptical ear.

I think it depends on what you want to use the smoothed values for.
If you want to make predictions or build a general model of the data, I think what you want is to model the data with Minimum Description Length, selecting both the order of a polynomial and a polynomial of that order that fits the data. A spline will probably overfit your data if that is what you are smoothing for.
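To make that suggestion concrete, here is a minimal sketch of polynomial order selection. It uses BIC as a practical stand-in for Minimum Description Length (the two coincide up to constants under a Gaussian-noise assumption); the data, `max_order`, and the noisy-quadratic example are illustrative assumptions, not from the thread.

```python
import numpy as np

def best_poly_order(x, y, max_order=8):
    """Fit polynomials of increasing order and keep the one minimizing BIC."""
    n = len(x)
    best = None
    for k in range(max_order + 1):
        coeffs = np.polyfit(x, y, k)
        rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
        # BIC = n*log(RSS/n) + (k+1)*log(n); the second term plays the role
        # of the "description length" penalty for storing k+1 coefficients.
        bic = n * np.log(rss / n + 1e-12) + (k + 1) * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, k, coeffs)
    return best[1], best[2]

# Illustrative data: a noisy quadratic, which the criterion should
# recover with a low order.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = 2.0 + 0.5 * x - 0.3 * x ** 2 + rng.normal(0.0, 1.0, x.size)
order, coeffs = best_poly_order(x, y)
```

The penalty term is what keeps a spline-like high-order fit from winning: each extra coefficient must buy enough reduction in residual error to pay its description cost.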

If you want a nicer visualization of the data, a spline might do the trick. I'm not really sure, though; I'm hesitant to claim it would do a very good job.
 
  • #3
Are you sure that all of these little ups and downs are meaningful data? I guess LOESS would be as good as any method for connecting the dots. If you are interested in long-term trends, you could just fit another LOESS model, with a larger smoothing parameter.
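As a sketch of the two-LOESS idea: in Python, statsmodels provides `lowess`, and its `frac` parameter is the smoothing span, so a small `frac` tracks the short-term wiggles while a large one tracks the long-term trend. The synthetic data and the particular `frac` values below are illustrative assumptions.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Illustrative signal: a slow trend plus faster wiggles plus noise,
# sampled once per "second" as in the original question.
rng = np.random.default_rng(1)
t = np.arange(600.0)
y = np.sin(t / 100.0) + 0.3 * np.sin(t / 8.0) + rng.normal(0.0, 0.1, t.size)

# Two fits over the same data, differing only in the smoothing fraction.
fine = lowess(y, t, frac=0.03, return_sorted=False)    # follows small movements
coarse = lowess(y, t, frac=0.30, return_sorted=False)  # follows the overall trend
```

The `fine` curve should be visibly rougher than `coarse`; which pair of fractions is right depends on how long the poster's "periods" are relative to the sampling rate.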
 

1. What is a smoothing algorithm for time series and slope measurement?

A smoothing algorithm is a mathematical method used to reduce noise and fluctuations in time series data, making it easier to identify underlying trends and patterns. It involves replacing each data point with an average of nearby data points, resulting in a smoother curve.

2. Why is a smoothing algorithm important for time series and slope measurement?

Slope measurements and time series data often contain noise and fluctuations that can make it difficult to accurately identify trends and patterns. A smoothing algorithm helps to eliminate this noise, making it easier to visualize and analyze the data.

3. How does a smoothing algorithm work?

A smoothing algorithm works by replacing each data point with an average of nearby data points. The specific method used to compute this average varies, but it typically involves a weighted average that gives more weight to nearby points (or, for causal smoothers, to more recent ones).
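A minimal sketch of this "average of nearby points" idea in plain NumPy, showing both a centered moving average and exponential smoothing (which weights recent points more heavily); the window size, `alpha`, and the spiky test array are illustrative.

```python
import numpy as np

def moving_average(y, window=5):
    """Centered moving average via convolution. Note: mode="same" zero-pads,
    so the first and last few values are biased toward zero."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

def exp_smooth(y, alpha=0.3):
    """Classic exponential smoothing: s[t] = alpha*y[t] + (1-alpha)*s[t-1]."""
    s = np.empty(len(y), dtype=float)
    s[0] = y[0]
    for t in range(1, len(y)):
        s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
    return s

y = np.array([1.0, 2.0, 9.0, 2.0, 1.0])  # one spiky point
smoothed = moving_average(y, window=3)   # the spike is spread over neighbours
```

Both are far cruder than LOESS or splines, but they make the weighting idea explicit: the moving average weights neighbours equally, while exponential smoothing decays the weight of older samples geometrically.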

4. What are the benefits of using a smoothing algorithm?

Using a smoothing algorithm can help to reduce the effects of outliers and noise in time series data, making it easier to identify trends and patterns. It can also make the data easier to interpret and visualize, and can improve the accuracy of slope measurements.
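On the slope question specifically: once a smoothed series exists at every sample time, `np.gradient` gives a finite-difference estimate of the slope at each point, and applying it twice gives the curvature. A quadratic stand-in signal is used below so the result can be checked against the analytic derivative; it is a sketch, not the poster's actual data.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 101)   # uniform 0.1 s sample spacing
smoothed = t ** 2                 # stand-in for a smoothed signal

slope = np.gradient(smoothed, t)      # d(smoothed)/dt at every sample
curvature = np.gradient(slope, t)     # second derivative at every sample

# For t**2 the analytic slope is 2*t and the curvature is 2; the interior
# central differences reproduce both exactly (the endpoints use one-sided
# differences and are less accurate).
```

If the smoother provides an analytic form (a polynomial or spline), differentiating that form directly is preferable; finite differences on the smoothed values are the general fallback.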

5. Are there any limitations to using a smoothing algorithm?

While a smoothing algorithm can be helpful in reducing noise and improving data interpretation, it can also result in loss of information and potentially distort the original data. It is important to carefully consider the specific algorithm and parameters used to ensure the resulting data is still representative of the original data.
