R seasonal time series

I use the decompose function in R and obtain the three components of my monthly time series (trend, seasonal and random). If I plot the components or look at the table, I can clearly see that the time series is affected by seasonality.
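For concreteness, this is the kind of call I mean; the series below is simulated for illustration, not my actual data:

    # illustrative only: a simulated monthly series with trend + seasonality
    set.seed(1)
    y <- ts(cumsum(rnorm(120, mean = 5)) + rep(3 * sin(2 * pi * (1:12) / 12), 10),
            frequency = 12, start = c(2010, 1))
    dec <- decompose(y)   # returns trend, seasonal and random components
    plot(dec)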

However, when I regress the time series on 11 seasonal dummy variables, none of the coefficients is statistically significant, which would suggest there is no seasonality.

I don’t understand why I get two very different results. Has this happened to anybody else? Am I doing something wrong?


Here are some additional details.

Below are my time series and the corresponding monthly change. In both charts you can see seasonality (or at least this is what I would like to assess). In particular, in the second chart (the monthly change of the series) I can see a recurring pattern, with high and low points in the same months of the year.

[Figure: TimeSeries]

[Figure: MonthlyChange]

Below is the output of the decompose function. I appreciate that, as @RichardHardy said, the function does not test whether there is actual seasonality. But the decomposition seems to confirm what I think.

[Figure: Decompose (output of the decompose function)]
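One rough sanity check (not a formal test) is to compare the spread of the seasonal component with the spread of the random component that decompose returns; a sketch, assuming my series is stored in Yvar:

    # rough, informal check (no hypothesis test): is the seasonal swing
    # large relative to the irregular component?
    dec <- decompose(Yvar)
    sd(dec$seasonal)                # spread of the estimated seasonal pattern
    sd(dec$random, na.rm = TRUE)    # spread of the remainder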

However, when I regress the time series on 11 seasonal dummy variables (January to November, excluding December), I find the following:

    Coefficients:
                  Estimate Std. Error t value Pr(>|t|)    
    (Intercept) 5144454056  372840549  13.798   <2e-16 ***
    Jan         -616669492  527276161  -1.170    0.248
    Feb         -586884419  527276161  -1.113    0.271
    Mar         -461990149  527276161  -0.876    0.385
    Apr         -407860396  527276161  -0.774    0.443
    May         -395942771  527276161  -0.751    0.456
    Jun         -382312331  527276161  -0.725    0.472
    Jul         -342137426  527276161  -0.649    0.520
    Aug         -308931830  527276161  -0.586    0.561
    Sep         -275129629  527276161  -0.522    0.604
    Oct         -218035419  527276161  -0.414    0.681
    Nov         -159814080  527276161  -0.303    0.763

In short, none of the seasonal coefficients is statistically significant.

To run the linear regression I use the following call:

lm.r <- lm(Yvar ~ Var$Jan + Var$Feb + Var$Mar + Var$Apr + Var$May + Var$Jun + Var$Jul + Var$Aug + Var$Sep + Var$Oct + Var$Nov)

where I set up Yvar as a time series variable with monthly frequency (frequency = 12).
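For what it’s worth, an equivalent way to fit the same model (same overall fit, just a different baseline month) is to let R build the dummies from a month factor; a sketch, assuming Yvar is the monthly ts above:

    # equivalent sketch: build the month dummies from the ts index itself
    Month <- factor(cycle(Yvar), labels = month.abb)  # 1..12 -> Jan..Dec
    lm.r2 <- lm(Yvar ~ Month)                         # January is the baseline here
    summary(lm.r2)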

I also try to take the trend component of the time series into account by adding a trend variable to the regression. However, the result does not change: the seasonal coefficients remain insignificant.

                  Estimate Std. Error t value Pr(>|t|)    
    (Intercept) 3600646404   96286811  37.395   <2e-16 ***
    Jan         -144950487  117138294  -1.237    0.222
    Feb         -158048960  116963281  -1.351    0.183
    Mar          -76038236  116804709  -0.651    0.518
    Apr          -64792029  116662646  -0.555    0.581
    May          -95757949  116537153  -0.822    0.415
    Jun         -125011055  116428283  -1.074    0.288
    Jul         -127719697  116336082  -1.098    0.278
    Aug         -137397646  116260591  -1.182    0.243
    Sep         -146478991  116201842  -1.261    0.214
    Oct         -132268327  116159860  -1.139    0.261
    Nov         -116930534  116134664  -1.007    0.319
    trend         42883546    1396782  30.702   <2e-16 ***

Hence my question is: am I doing something wrong in the regression analysis?

Answer

Are you running the regression on the data after you’ve removed the trend? You have a positive trend, and your seasonal signature is likely being masked in the regression (the variance due to the trend, or to error, is larger than the variance due to month), unless you’ve accounted for the trend in Yvar…
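Something along these lines is what I mean; just a sketch, assuming Yvar is the monthly ts from the question:

    # sketch: remove a linear trend first, then look at the month effects
    Time    <- seq_along(Yvar)                 # simple time index 1, 2, ...
    detrend <- residuals(lm(Yvar ~ Time))      # what is left once the trend is out
    Month   <- factor(cycle(Yvar), labels = month.abb)
    summary(lm(detrend ~ Month))               # seasonal dummies on the detrended series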

Also, I’m not terribly confident with time series, but shouldn’t each observation be assigned a month, and your regression look something like this?

lm(Yvar ~ Time + Month)

Apologies if that makes no sense… Does regression make the most sense here?
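If it helps, here is a slightly fuller sketch of what I have in mind (Time and Month are names I’m making up; Yvar is the monthly ts from the question):

    # sketch of the suggested model: one trend term plus a month factor
    Time  <- seq_along(Yvar)                          # linear trend index
    Month <- factor(cycle(Yvar), labels = month.abb)  # month of each observation
    fit   <- lm(Yvar ~ Time + Month)
    summary(fit)                                      # month coefficients with the trend held fixed
    anova(fit)                                        # F-test for the Month term after the trend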

Attribution
Source: Link, Question Author: mattiace, Answer Author: danno
