Stationary time series with seasonality...

  • Context: Undergrad
  • Thread starter: fog37
  • Tags: Time series

Discussion Overview

The discussion revolves around the characteristics of stationary time series, particularly in the context of seasonality, trend, and cyclic components. Participants explore whether a time series with these characteristics can still be considered stationary and discuss methods for achieving stationarity through differencing and decomposition.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants assert that a time series with trend, seasonality, and cyclic components is automatically nonstationary, while others suggest that it is possible to have a stationary time series with a seasonal component.
  • There is a discussion on the use of differencing to remove trends and seasonality, with specific methods proposed, such as first and second differencing.
  • One participant questions the necessity of differencing when the trend and seasonal components can be directly subtracted from the time series.
  • Another participant highlights that if the trend and seasonal components are known, the analysis may become trivial, as the focus is typically on unknown functions.
  • Participants discuss the implications of using ARIMA and SARIMA models, noting that these models often require stationary series and may involve transformations to achieve stationarity.
  • There is mention of using statistical tools like R for time series analysis, including functions for decomposing time series into trend, seasonal, and irregular components.
  • Some participants express uncertainty about the best method to achieve stationarity, considering the differences in results from subtraction versus differencing.
  • Concerns are raised about how random error terms accumulate in different models, affecting the choice of method for achieving stationarity.

Areas of Agreement / Disagreement

Participants do not reach a consensus on whether a time series with seasonality can be stationary. Multiple competing views on the methods for achieving stationarity and the implications of known components remain unresolved.

Contextual Notes

Limitations include the dependence on definitions of stationarity and the assumptions regarding the nature of the trend and seasonal components. The discussion also reflects uncertainty about the mathematical implications of different detrending methods.

fog37
TL;DR Summary: seasonal component and stationarity
Hello,
I was under the impression that a time series with trend, seasonality, and cyclic component would automatically be nonstationary.
Stationarity means constant mean, variance, and autocorrelation (weakly stationary).

However, it seems that we could have a stationary time series that has a seasonal component... How is that possible? A seasonal component is simply a fluctuating, periodic signal with a constant period (e.g., a sine wave). Wouldn't the seasonal component make the series look statistically different depending on whether our window of time catches it during the upswing or during the downswing of the seasonal cycle?
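
For concreteness, a minimal simulation of this situation (hypothetical, made-up data; a sketch rather than anything definitive):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(240)                          # e.g. 20 years of monthly observations
seasonal = 5 * np.sin(2 * np.pi * t / 12)   # deterministic period-12 component
y = seasonal + rng.normal(0, 1, t.size)     # no trend: just seasonality plus noise

# Sample means over windows covering the positive vs. negative half of the cycle
first_half = y[(t % 12) < 6]
second_half = y[(t % 12) >= 6]
print(first_half.mean(), second_half.mean())  # roughly +3 vs. -3: the mean depends on t
```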

Thank you!
 
FactChecker
fog37 said:
TL;DR Summary: seasonal component and stationarity

Hello,
I was under the impression that a time series with trend, seasonality, and cyclic component would automatically be nonstationary.
Stationarity means constant mean, variance, and autocorrelation (weakly stationary).

However, it seems that we could have a stationary time-series that has a seasonal component...
I don't think that ARIMA or SARIMA are considered to be stationary. For their analysis, they are often related to stationary time series by removing the seasonality and trend.
 
Likes: DeBangis21 and fog37
Thank you. I learned that:

To remove a linear trend, we can use first differencing: ##D_1 = y_t - y_{t-1}##.
To remove a quadratic trend, we can use second differencing: ##D_2 = (y_t - y_{t-1}) - (y_{t-1} - y_{t-2})##.
To remove seasonality, we can also use differencing, but at a lag equal to the period of the seasonal component. For example, if the seasonality, after inspection, has period 12 (days, months, etc.), we remove it with the seasonal difference ##D_{12} = y_t - y_{t-12}## (see the sketch below).
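
For example, with pandas (a minimal sketch on a hypothetical monthly series; the data and numbers are made up for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical monthly series with a linear trend and period-12 seasonality
rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(0.3 * t + 4 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size))

d1 = y.diff().dropna()               # D_1 = y_t - y_{t-1}: removes a linear trend
d2 = y.diff().diff().dropna()        # D_2: second difference, removes a quadratic trend
d12 = y.diff(12).dropna()            # seasonal difference at lag 12: removes period-12 seasonality
d_both = y.diff().diff(12).dropna()  # D_1 followed by D_12, as in option a) below
```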

This brings us back to ARMA vs ARIMA vs SARIMA. If we know our signal has both a linear trend and 12-month seasonality, we could either:

a) stationarize ##y_t## by applying ##D_1## and ##D_{12}## to get the (trend-free and seasonality-free) stationary signal ##g_t##, build an ARMA model on it, make predictions, and finally inverse-transform the predictions so they apply correctly to the original data; or

b) after data visualization, use the nonstationary ##y_t## with trend and seasonality directly to build a SARIMA model, taking into account that the integration should be ##I(1)## (one nonseasonal difference) to remove the linear trend and specifying a seasonal difference of period ##12## to remove the seasonal part. Once we fit the SARIMA model, we make predictions, but we still need to inverse-transform those predictions so they apply to the original data ##y_t##...

The outcomes of approaches a) and b) should be exactly the same...
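
A minimal sketch of the two routes, assuming statsmodels' SARIMAX as the implementation (the data, model orders, and settings are hypothetical, chosen only for illustration):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly series with a linear trend and 12-month seasonality
rng = np.random.default_rng(0)
t = np.arange(240)
y = pd.Series(0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size),
              index=pd.date_range("2000-01-01", periods=240, freq="MS"))

# a) difference manually (D_1 then D_12), then fit an ARMA model on the stationary series;
#    its forecasts must be inverse-differenced by hand to get back to the scale of y
g = y.diff().diff(12).dropna()
arma = SARIMAX(g, order=(1, 0, 1)).fit(disp=False)   # ARMA(1,1) on the differenced series

# b) fit SARIMA directly on y with d=1 and a seasonal difference of period 12;
#    statsmodels applies the inverse differencing internally, so these forecasts
#    are already on the scale of the original data
sarima = SARIMAX(y, order=(1, 1, 1), seasonal_order=(0, 1, 0, 12)).fit(disp=False)
print(sarima.forecast(steps=12))
```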
 
FactChecker said:
I don't think that ARIMA or SARIMA are considered to be stationary. For their analysis, they are often related to stationary time series by removing the seasonality and trend.
Hello FactChecker,

I have been thinking: a nonstationary time series ##y(t)## can be decomposed into the sum of its trend ##T(t)## and seasonality ##S(t)##:
$$y(t)=T(t)+S(t)+error(t)$$
To trend-stationarize the series ##y(t)## (i.e. remove the trend), we usually difference the series... why don't we just perform the calculation ##g(t) = y(t) - T(t)## instead of differencing the series ##y(t)##?
Subtracting ##T(t)## would seem to take care of the trend removal perfectly. The new series ##g(t)## is now trend free.
The same goes for removing the seasonality ##S(t)##: ##g(t) = y(t) - T(t) - S(t)##. Of course, we are then left with only the ##error(t)## term.

So what are we left with when we use differencing on ##y(t)## to get a signal ##g(t)## that is trend and seasonality free?

Thank you,
Brett
 
Office_Shredder
If you already know what T and S are, what analysis is even left for you to do? Usually the point is that you don't actually know what the function is.
 
Likes: FactChecker and fog37
Office_Shredder said:
If you already know what T and S are, what analysis is even left for you to do? Usually the point is you don't actually know what the function is.
Well, the idea is generally to use that time series to build a statistical model like AR, MA, ARMA, ARIMA, etc. These models require the series used to build them to be stationary. So if our series is not stationary, we need to make it stationary via transformations like differencing....

However, during exploratory analysis, we can use functions (ex: in Python) that can separate out the components so we know what kind of time series we are dealing with and figure out the appropriate transformations to stationarize it:

[Attached: decomposition plots separating the series into trend, seasonal, and residual components]
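
For reference, a minimal sketch of what such a decomposition might look like in Python, assuming statsmodels (the data are again hypothetical):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly series: linear trend + period-12 seasonality + noise
rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(0.3 * t + 4 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size),
              index=pd.date_range("2010-01-01", periods=120, freq="MS"))

result = seasonal_decompose(y, model="additive", period=12)
# result.trend, result.seasonal, and result.resid hold the separated components;
# result.plot() draws the kind of multi-panel figure attached above
fig = result.plot()
```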
 
Likes: DeBangis21
FactChecker
fog37 said:
Well, the idea is generally to use that time series to build a statistical model like AR, MA, ARMA, ARIMA, etc. These models require the series used to build them to be stationary. So if our series is not stationary, we need to make it stationary via transformations like differencing....

However, during exploratory analysis, we can use functions (ex: in Python) that can separate out the components so we know what kind of time series we are dealing with and figure out the appropriate transformations to stationarize it:

What tools or processes are you considering using to analyze a time series? R is a popular statistical package. It has a function, decompose, which helps you see the trend, seasonal, and irregular components. But it seems that its autocorrelation function, acf, works on the original time series. I have never used R for a complete time series analysis, so I will have to leave it to others to say more.
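
Keeping with the Python used earlier in the thread, rough counterparts of R's decompose and acf are statsmodels' seasonal_decompose (sketched above) and acf / plot_acf; a minimal sketch on hypothetical data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import acf
from statsmodels.graphics.tsaplots import plot_acf

# Same kind of hypothetical monthly series as in the sketches above
rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(0.3 * t + 4 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size))

print(acf(y, nlags=24))     # numeric autocorrelations, analogous to R's acf()
fig = plot_acf(y, lags=24)  # the usual correlogram, computed on the original series
```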
 
Likes: fog37
FactChecker said:
What tools or processes are you considering using to analyze a time series? R is a popular statistical package. It has a function, decompose, which helps you see the trend, seasonal, and irregular components. But it seems that its autocorrelation function, acf, works on the original time series. I have never used R for a complete time series analysis, so I will have to leave it to others to say more.
We find the trend ##T(t)## using regression (linear, polynomial, etc.) and subtract it from the series ##y(t)##:
$$g(t) = y(t) - T(t)$$
OR we use differencing, i.e. we compute $$g(t) = y(t) - y(t-1)$$

The results are different, but both versions of ##g(t)## are now trend free... Which one should we use to make ##y(t)## stationary, mathematically?
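
A minimal sketch of the two options on hypothetical data (linear trend plus noise), just to make the comparison concrete:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(2.0 + 0.3 * t + rng.normal(0, 1, t.size))   # hypothetical: linear trend + noise

# Option 1: estimate the trend T(t) by least-squares regression and subtract it
slope, intercept = np.polyfit(t, y, 1)
g_subtract = y - (intercept + slope * t)

# Option 2: first differencing
g_diff = y.diff().dropna()

# Both are trend free, but they are different series: g_subtract estimates the noise
# itself, while g_diff equals the slope plus the *change* in the noise, so the two
# have different autocorrelation structures (differencing white-noise error
# introduces MA(1)-type correlation).
print(g_subtract.autocorr(lag=1), g_diff.autocorr(lag=1))
```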
 
fog37 said:
We find the trend ##T(t)## using regression (linear, polynomial, etc.) and subtract it from the series ##y(t)##:
$$g(t) = y(t) - T(t)$$
OR we use differencing, i.e. we compute $$g(t) = y(t) - y(t-1)$$

The results are different, but both versions of ##g(t)## are now trend free... Which one should we use to make ##y(t)## stationary, mathematically?
It seems more complicated than that. I think they are not necessarily trend free. You can combine the two, e.g. ##g_i = y_i - y_{i-1} + c##, and then you need to apply both "detrending" steps.
Also, you need to consider how the random error terms enter in. Consider the difference in how the random term accumulates in these two models:
##y_i = g_i+T_i + \epsilon_i## versus ##y_i=g_i+y_{i-1} +\epsilon_i##, where ##g## is a simple stationary time series and ##\epsilon## is a stationary random distribution.
In the first model, the random component is a single random sample from ##\epsilon##. In the second model, the random component accumulates, summing all the ##\epsilon_j##s associated with the prior ##y_j## values.
Which model is most appropriate is something for you to decide, based on the data or on the subject matter.
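
A small simulation of the two models (hypothetical parameters, with ##g_i = 0## for clarity) to show how differently the random terms accumulate:

```python
import numpy as np

rng = np.random.default_rng(0)
n_rep, n = 2000, 500                    # many replications of a length-500 series
eps = rng.normal(0, 1, (n_rep, n))
T = 0.1 * np.arange(n)                  # a deterministic linear trend

y_model1 = T + eps                      # y_i = g_i + T_i + eps_i   (g_i taken as 0)
y_model2 = np.cumsum(eps, axis=1)       # y_i = g_i + y_{i-1} + eps_i, a random walk

# Variance across replications at an early and a late time point:
# constant for model 1, but growing roughly like i for model 2.
print(y_model1[:, 50].var(), y_model1[:, 450].var())   # both close to 1
print(y_model2[:, 50].var(), y_model2[:, 450].var())   # roughly 51 and 451
```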
 
