ARIMA(p,d,q) + $X_t$, Simulation over Forecasting Period

I have time series data and I used ARIMA(p,d,q) + $X_t$ as the model to fit the data. Here $X_t$ is an indicator random variable that is either 0 (when I don't see the rare event) or 1 (when I see the rare event). Based on the previous observations that I have for $X_t$, I can develop a model for $X_t$ using the Variable Length Markov Chain (VLMC) methodology. This enables me to simulate $X_t$ over the forecasting period, giving a sequence of zeros and ones. Since this is a rare event, I will not see $X_t = 1$ often. I can forecast and obtain the prediction intervals based on the simulated values of $X_t$.
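As a concrete stand-in for the VLMC described above (which conditions on variable-length histories), a first-order Markov chain is the simplest way to sketch the indicator simulation. The function and parameter names (`simulate_indicator`, `p01`, `p11`) are illustrative, not from the question:

```python
import numpy as np

def simulate_indicator(h, p01, p11, x0=0, rng=None):
    """Simulate a 0/1 indicator path X_t over an h-step forecast horizon.

    First-order Markov chain stand-in for the VLMC:
    p01 = P(X_t = 1 | X_{t-1} = 0), p11 = P(X_t = 1 | X_{t-1} = 1).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty(h, dtype=int)
    prev = x0
    for t in range(h):
        p1 = p11 if prev == 1 else p01  # transition probability given last state
        prev = x[t] = int(rng.random() < p1)
    return x

# One simulated indicator sequence over a 20-step horizon
path = simulate_indicator(h=20, p01=0.02, p11=0.30, rng=np.random.default_rng(0))
```

A full VLMC would replace the `p01`/`p11` lookup with a probability conditioned on the matched variable-length context, but the simulation loop has the same shape.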


How can I develop an efficient simulation procedure that takes into account the occurrence of 1's in the simulated $X_t$ over the forecasting period? I need to obtain the mean and the forecasting intervals.

The probability of observing a 1 is too small for me to think that regular Monte Carlo simulation will work well in this case. Maybe I can use "importance sampling", but I am not sure exactly how.

Thank you.


First, consider a more general case. Let $Y = Y(A, X)$, where $A \sim f_A(\cdot)$ and $X \sim f_X(\cdot)$. Then, assuming the support of $g_X(\cdot)$ dominates that of $f_X(\cdot)$ and all the integrals below exist, we have:

$$E[Y] = \iint Y(a,x)\, f_A(a)\, f_X(x)\, da\, dx = \iint Y(a,x)\, \frac{f_X(x)}{g_X(x)}\, f_A(a)\, g_X(x)\, da\, dx = E_g\!\left[ Y \cdot \frac{f_X(X)}{g_X(X)} \right].$$

In your case $X_t \sim \text{Bernoulli}(p)$, i.e.

$$f_X(x) = p^x (1-p)^{1-x}, \quad x \in \{0, 1\},$$

and $g_X(\cdot)$ can be defined like this:

$$g_X(x) = 0.5, \quad x \in \{0, 1\}.$$

Therefore, you can simulate $X$ via the distribution $g_X(\cdot)$, but all the observations with $X = 1$ will have the weight $p/0.5 = 2p$ and all the observations with $X = 0$ will have the weight $(1-p)/0.5 = 2(1-p)$. The simulation of the ARIMA process will not be affected.
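A minimal sketch of this weighted simulation, assuming a user-supplied `simulate_path(x, rng)` that returns one length-`h` forecast path given an indicator path (the function names and the self-normalised weighting are illustrative choices, not prescribed by the answer):

```python
import numpy as np

def is_forecast(simulate_path, h, p, n_sims=10000, alpha=0.05, seed=0):
    """Importance-sampling forecast mean and (1 - alpha) intervals.

    X_t is drawn from the proposal g = Bernoulli(0.5) instead of the rare
    Bernoulli(p); each path gets weight prod_t f(x_t)/g(x_t), i.e. a factor
    2p for every x_t = 1 and 2(1 - p) for every x_t = 0.
    """
    rng = np.random.default_rng(seed)
    paths = np.empty((n_sims, h))
    w = np.empty(n_sims)
    for i in range(n_sims):
        x = (rng.random(h) < 0.5).astype(int)               # X_t ~ g = Bernoulli(0.5)
        w[i] = np.prod(np.where(x == 1, 2 * p, 2 * (1 - p)))  # importance weight
        paths[i] = simulate_path(x, rng)                    # ARIMA part unaffected
    w /= w.sum()                                            # self-normalise weights
    mean = w @ paths                                        # weighted mean per step
    lo, hi = np.empty(h), np.empty(h)
    for t in range(h):                                      # weighted quantiles
        order = np.argsort(paths[:, t])
        cdf = np.cumsum(w[order])
        lo[t] = paths[order[np.searchsorted(cdf, alpha / 2)], t]
        hi[t] = paths[order[np.searchsorted(cdf, 1 - alpha / 2)], t]
    return mean, lo, hi
```

The prediction intervals come from weighted empirical quantiles, so the rare $X_t = 1$ paths contribute with their small weights $2p$ rather than being absent from the sample, which is exactly the variance problem plain Monte Carlo has here.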

Source: Link, Question Author: Stat, Answer Author: LeonM
