Stationary time series with seasonality...

  • #1
fog37
TL;DR Summary
seasonal component and stationarity
Hello,
I was under the impression that a time series with a trend, seasonality, or a cyclic component would automatically be nonstationary.
Weak stationarity means constant mean, constant variance, and an autocorrelation that depends only on the lag.

However, it seems that we could have a stationary time series that has a seasonal component. How is that possible? A seasonal component is simply a fluctuating, periodic signal with constant period (e.g., a sine wave). Wouldn't the seasonal component make the series look statistically different depending on whether our time window catches it during an upswing or a downswing?

Thank you!
 
  • #2
fog37 said:
TL;DR Summary: seasonal component and stationarity

Hello,
I was under the impression that a time series with trend, seasonality, and cyclic component would automatically be nonstationary.
Stationarity means constant mean, variance, and autocorrelation (weakly stationary).

However, it seems that we could have a stationary time-series that has a seasonal component...
I don't think ARIMA or SARIMA processes are considered stationary. For analysis, they are usually related to stationary time series by removing the trend and seasonality.
 
  • #3
Thank you. I learned that:

To remove a linear trend, we can use first differencing: ##D_1 = y_t - y_{t-1}##
To remove a quadratic trend, we can difference twice: ##D_2 = (y_t - y_{t-1}) - (y_{t-1} - y_{t-2}) = y_t - 2y_{t-1} + y_{t-2}##
To remove seasonality, we can also use differencing, but the lag of the difference must match the period of the seasonal component. For example, if inspection shows the seasonality has period 12 (days, months, etc.), we remove it with ##D_{12} = y_t - y_{t-12}##.
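As a quick sketch (plain Python, with a made-up series), the two kinds of differencing look like this:

```python
def difference(y, lag=1):
    """Difference a series at the given lag: d[t] = y[t] - y[t-lag].

    lag=1 removes a linear trend; lag=m removes a seasonal
    component with period m (e.g. lag=12 for monthly data).
    """
    return [y[t] - y[t - lag] for t in range(lag, len(y))]

# Hypothetical series: linear trend 2t plus a period-4 seasonal pattern.
season = [3, -1, -4, 2]
y = [2 * t + season[t % 4] for t in range(20)]

d1 = difference(y, lag=1)   # removes the linear trend (still periodic)
d4 = difference(y, lag=4)   # removes the period-4 seasonality
# After seasonal differencing, only the constant trend increment
# remains: y[t] - y[t-4] = 2*4 = 8 for every t.
```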

This brings us back to ARMA vs. ARIMA vs. SARIMA. If we know our signal has both a linear trend and 12-month seasonality, we could either:

a) stationarize ##y_t## by applying ##D_1## and ##D_{12}## to get the (trend-free and seasonality-free) stationary signal ##g_t##, build an ARMA model on it, make predictions, and finally inverse-transform the predictions so they apply correctly to the original data; or

b) after visualizing the data, use the nonstationary ##y_t## (with trend and seasonality) directly to build a SARIMA model, taking into account that the integration order should be ##I(1)## to remove the linear trend and specifying a seasonal period of 12 for the seasonal differencing. Once we fit the SARIMA model, we make predictions, but we still need to inverse-transform those predictions so they apply to the original data ##y_t##.

The outcomes of approaches a) and b) should be essentially the same.
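A minimal sketch of approach a) on a noiseless toy series (plain Python; the 12-month period and the series itself are made up for illustration), including the inverse transform for a one-step-ahead forecast:

```python
def diff(y, lag):
    return [y[t] - y[t - lag] for t in range(lag, len(y))]

# Toy series: linear trend + 12-period seasonality (no noise, for clarity).
season = [5, 3, 0, -2, -4, -5, -4, -2, 0, 3, 5, 6]
y = [1.5 * t + season[t % 12] for t in range(48)]

g = diff(diff(y, 1), 12)   # D1 then D12: trend- and seasonality-free
# For this noiseless toy series, g is identically zero, i.e. stationary.

# Inverse transform: since g_t = y_t - y_{t-1} - y_{t-12} + y_{t-13},
# a forecast g_hat for the next step maps back to the original scale via
#   y_hat[T] = g_hat + y[T-1] + y[T-12] - y[T-13]
g_hat = 0.0                # the fitted ARMA model would supply this value
T = len(y)
y_hat = g_hat + y[T - 1] + y[T - 12] - y[T - 13]
```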
 
  • #4
FactChecker said:
I don't think that ARIMA or SARIMA are considered to be stationary. For their analysis, they are often related to stationary time series by removing the seasonality and trend.
Hello FactChecker,

I have been thinking: a nonstationary time series ##y(t)## can be decomposed into the sum of its trend ##T(t)##, seasonality ##S(t)##, and an error term:
$$y(t)=T(t)+S(t)+error(t)$$
To trend-stationarize the series ##y(t)## (i.e., remove the trend), we usually difference the series... why don't we just compute ##g(t) = y(t) - T(t)## instead of differencing ##y(t)##?
Subtracting ##T(t)## would seem to take care of the trend removal perfectly. The new series ##g(t)## is now trend-free.
The same goes for removing the seasonality ##S(t)##: ##g(t) = y(t) - T(t) - S(t)##. Of course, we are then left with only the ##error(t)## term.
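For a toy series where ##T(t)## and ##S(t)## are assumed known exactly (a strong assumption, made here only for illustration), the subtraction looks like this in plain Python:

```python
# Toy decomposition y(t) = T(t) + S(t) + error(t), with every part known.
T = [2.0 * t for t in range(12)]                  # linear trend
S = [(1.0, -2.0, 1.0)[t % 3] for t in range(12)]  # period-3 seasonality
error = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2,
         0.0, -0.3, 0.1, 0.2, -0.1, 0.0]
y = [T[t] + S[t] + error[t] for t in range(12)]

# Subtracting the known components leaves only the error term:
g = [y[t] - T[t] - S[t] for t in range(12)]       # equals error (up to rounding)
```

In practice ##T## and ##S## are not known and must be estimated, which is where differencing becomes attractive.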

So what are we left with when we use differencing on ##y(t)## to get a signal ##g(t)## that is trend- and seasonality-free?

Thank you,
Brett
 
  • #5
If you already know what T and S are, what analysis is even left for you to do? Usually the point is you don't actually know what the function is.
 
  • #6
Office_Shredder said:
If you already know what T and S are, what analysis is even left for you to do? Usually the point is you don't actually know what the function is.
Well, the idea is generally to use that time series to build a statistical model like AR, MA, ARMA, ARIMA, etc. These models require the series used to fit them to be stationary. So if our series is not stationary, we need to make it stationary via transformations such as differencing.

However, during exploratory analysis, we can use functions (e.g., in Python) that separate out the components, so we know what kind of time series we are dealing with and can figure out the appropriate transformations to stationarize it:

[Attached plots: decomposition of the series into trend, seasonal, and residual components]
 
  • #7
fog37 said:
Well, the idea is generally to use that time series to build a statistical model like AR, MA, ARMA, ARIMA, etc. These models require the series used to build them to be stationary. So if our series is not stationary, we need to make it stationary via transformations like differencing....

However, during exploratory analysis, we can use functions (ex: in Python) that can separate out the components so we know what kind of time series we are dealing with and figure out the appropriate transformations to stationarize it:

What tools or processes are you considering using to analyze a time series? R is a popular statistical package. It has a function, decompose, which helps you see the trend, seasonal, and irregular components. But it seems that its autocorrelation function, acf, works on the original time series. I have never used R for a complete time series analysis, so I will have to leave this for others to say more.
 
  • #8
FactChecker said:
What tools or processes are you considering using to analyze a time series? R is a popular statistical package. It has a function, decompose, which helps you to see the trend, seasonal, and irregular components. (see this) But it seems that their time series analysis function, acf, which determines autocorrelations works on the original time series. I have never used R for a complete time series analysis, so I will have to leave this for others to say more.
We find the trend ##T(t)## using regression (linear, polynomial, etc.) and subtract it from the series ##y(t)##:
$$g(t) = y(t)- T(t)$$
OR we use differencing, i.e., we compute $$g(t)= y(t) - y(t-1)$$

The results are different, but both versions of ##g(t)## are now trend-free. Mathematically, which one should we use to make ##y(t)## stationary?
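The two options can be compared on a noiseless toy series (plain Python; the regression here is an ordinary least-squares straight-line fit, assumed as the trend model):

```python
def fit_line(y):
    """Least-squares fit of y[t] = a + b*t; returns (a, b)."""
    n = len(y)
    t_mean = (n - 1) / 2
    y_mean = sum(y) / n
    b = (sum((t - t_mean) * (y[t] - y_mean) for t in range(n))
         / sum((t - t_mean) ** 2 for t in range(n)))
    return y_mean - b * t_mean, b

y = [3.0 + 0.5 * t for t in range(10)]        # pure linear trend

# Option 1: regression detrend -- subtract the fitted line.
a, b = fit_line(y)
g_regression = [y[t] - (a + b * t) for t in range(len(y))]   # ~0 everywhere

# Option 2: differencing -- successive differences.
g_difference = [y[t] - y[t - 1] for t in range(1, len(y))]   # constant 0.5
```

Both results are trend-free, but they are different series: regression detrending assumes a deterministic (trend-stationary) trend, while differencing also handles stochastic trends (unit roots).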
 
  • #9
fog37 said:
We find the trend ##T(t)## using regression (linear, polynomial, etc) and subtract it from the series ##y(t)##:
$$g(t) = y(t)- T(t)$$
OR we do differencing, i.e. we get $$g(t)= y(t) - y(t-1)$$...

The results are different but both ##g(t)## are trend free now...Which one to use to make ##y(t)## stationary mathematically?
It seems more complicated than that. I think they are not necessarily trend-free. You can combine the two, e.g. ##g_i = y_i - y_{i-1} + c##, and then need to apply both "detrending" steps.
Also, you need to consider how the random error terms enter. Compare how the random term accumulates in these two models:
##y_i = g_i+T_i + \epsilon_i## versus ##y_i=g_i+y_{i-1} +\epsilon_i##, where ##g## is a simple stationary time series and ##\epsilon## is stationary random noise.
In the first model, the random component of ##y_i## is a single draw of ##\epsilon##. In the second model, the random component accumulates, summing all the ##\epsilon_j## terms associated with the prior ##y_j## values.
Which model is most appropriate is something for you to decide, based on the data or on the subject matter.
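The accumulation point can be seen directly in a small simulation (stdlib Python; ##g_i## is set to zero here purely to keep the sketch short):

```python
import random

random.seed(0)
n = 200
eps = [random.gauss(0, 1) for _ in range(n)]

# Model 1 (trend-stationary): each y_i contains a single noise draw.
T = [0.5 * i for i in range(n)]
y1 = [T[i] + eps[i] for i in range(n)]

# Model 2 (difference-stationary): y_i = y_{i-1} + eps_i, a random walk.
y2 = [eps[0]]
for i in range(1, n):
    y2.append(y2[i - 1] + eps[i])

# Unrolling the recursion shows the noise accumulates:
#   y2[i] = eps[0] + eps[1] + ... + eps[i]
# so the variance of y2 grows with i, while y1's noise does not accumulate.
```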
 

What is a stationary time series with seasonality?

A stationary time series with seasonality is a series whose statistical properties, such as mean, variance, and autocorrelation structure, are constant over time even though it contains patterns that repeat at consistent intervals, known as seasonal effects. This combination allows for both predictable seasonal patterns and the statistical constancy assumed by many analytical techniques.

How can you test for stationarity in a time series with seasonality?

To test for stationarity in a seasonal time series, you can use statistical tests such as the Augmented Dickey-Fuller (ADF) test or the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test. Before performing these tests, it may be necessary to seasonally adjust the data or use seasonal differencing to remove the seasonal effects and focus on the underlying stationarity of the data.

What methods are used to handle seasonality in time series data?

Common methods to handle seasonality in time series data include seasonal differencing, where you subtract the value of a previous season to remove seasonal effects, and using seasonal decomposition techniques such as Seasonal Decomposition of Time Series by Loess (STL) or X-13ARIMA-SEATS. These methods help in isolating and adjusting the seasonal components to better analyze the underlying trends and cycles.
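Seasonal differencing can be verified on a purely periodic toy signal (stdlib Python; the period of 12 is an assumption of this sketch): differencing at the signal's own period removes it exactly.

```python
import math

m = 12                                           # assumed seasonal period
y = [math.sin(2 * math.pi * t / m) for t in range(60)]

# Seasonal differencing at lag m: since y[t] == y[t - m] for a pure
# period-m signal, the differenced series is (numerically) zero.
d = [y[t] - y[t - m] for t in range(m, len(y))]
```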

How does seasonality affect forecasting in time series analysis?

Seasonality can significantly impact forecasting in time series analysis as it introduces regular and predictable patterns that must be accounted for in the forecasting models. Failure to accurately model seasonal variations can lead to large forecasting errors. Techniques like SARIMA (Seasonal AutoRegressive Integrated Moving Average) are specifically designed to incorporate both non-seasonal and seasonal factors in a unified modeling approach.

Can a time series be both stationary and have seasonality?

Yes, a time series can be both stationary and have seasonality. Stationarity refers to the statistical properties of the series such as mean and variance being constant over time, which can still hold true in the presence of seasonality if the seasonal fluctuations are consistent and predictable over time. Properly accounting for these seasonal patterns allows the non-seasonal part of the series to be analyzed for stationarity.
