Dear Prof Johannes,
I would like to ask about stoch_simul, as I am calibrating an RBC model using parameter values estimated from annual data.
I run simulations to see how much of the actual moments the model can explain, and I am confused about the simulation setup.
Should I set periods=xxx in the stoch_simul command to match the number of periods in the observed data?
And do I need to set hp_filter=100 for annual data?
Different period settings produce different moments. Moreover, running a simulation with periods=150 means that, after the default burn-in of 100 periods, only 50 periods are used to compute the simulated moments, which is not enough for a sound result. But my dataset covers only 45 years, so it is not obvious that I should simulate 200-500 periods. I am not sure it is valid to simulate 500 periods and then compare the resulting moments with moments from 45 years of data!
One more thing: I filter the data with a one-sided HP filter, but Dynare only offers the two-sided HP filter for simulations. I tried obtaining the simulated data without the hp_filter option and applying the one-sided filter myself, but the resulting standard deviations are very low. The same thing happens when I use the two-sided HP filter.
How can I apply a filter other than the HP filter with the stoch_simul command?
I hope to have your support. Thanks a lot!
Update: I now see that the unstable version of Dynare supports the one-sided filter. That's good news!
The remaining question is whether we should simulate a thousand periods, or instead set replic=1000 and periods = drop + the number of actual periods to be evaluated.
Hope to get an explanation!
You need to be more specific about what exactly you are trying to do. I infer that you are using annual data. Is the model set up at annual or quarterly frequency? That determines whether you need to aggregate the model simulations from quarterly to annual frequency.
If your simulated moments depend strongly on the sample length, you should either increase the sample length or use various repetitions of the original sample length.
Yes, you should use a burn-in at the beginning of your simulations.
A typical setup would be to use 1000 periods in total, but drop everything at the beginning except for as many periods as the data have. Repeat this 300 times or so with the simul_replic option.
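A minimal sketch of this setup as a stoch_simul call (the numbers are illustrative and assume a 45-year annual dataset, not a recommendation):

```
// Simulate 1000 periods per replication; drop the burn-in so that the
// 45 periods matching the annual dataset remain, and request 300
// replications (the extra series are saved for manual moment computation).
stoch_simul(order=1, periods=1000, drop=955, hp_filter=100, simul_replic=300);
```

Note that periods counts the total simulated length, so the number of periods entering the moment computation is periods minus drop.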
The one-sided HP filter is implemented in the unstable version via the one_sided_hp_filter option of stoch_simul.
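Assuming the unstable version you installed carries this option, the call for annual data would look like (values again illustrative):

```
// One-sided HP filter with smoothing parameter 100, as used for the
// annual data; replaces the two-sided hp_filter option.
stoch_simul(order=1, periods=1000, drop=955, one_sided_hp_filter=100);
```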
If the filtered simulated moments and the data moments do not match, there is still a problem with your model (or with the data treatment).
Dear Prof Johannes,
Thank you for your support.
I have followed your guide and Hansen model for learning code.
I am working with annual data. In estimation, I detrend the data with a one-sided HP filter with lambda=100, as appropriate for annual data.
So I think I have to do the same with the simulated data, i.e. detrend it with the same filter.
The manual says
[quote]simul_replic = INTEGER
Number of series to simulate when empirical moments are requested (i.e. periods > 0). Note that if this option is greater than 1, the additional series will not be used for computing the empirical moments but will simply be saved in binary form to the file ‘FILENAME_simul’. Default: 1.[/quote]
Does this mean that only one simulated series is used to compute the empirical moments? Or do we have to manually extract the simulated data and compute the moments ourselves?
Is it acceptable that the actual standard deviation of output is 2, while the simulated one varies from 1.45 to 1.6 depending on the simulation length?
Thanks for your answer and support,
If your model is also annual, then yes, lambda=100 is OK. That is the value Backus/Kehoe (1992) use. Ravn/Uhlig (2002) recommend 6.25, while Baxter/King (1999) recommend 10. Another common value for annual data is 400.
Yes, data and model output should be filtered in the same way.
Yes, Dynare’s moments are based on one replication. The Hansen_1985.mod shows how to manually compute moments based on more replications.
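A hedged sketch of the loop such a manual computation could use, in the spirit of Hansen_1985.mod (the variable names are illustrative, and the simult_ signature differs across Dynare versions, so check it against your installation; the simulated series should also be filtered like the data before taking moments):

```
% Sketch: average per-variable standard deviations over replications.
n_replic = 300;
n_keep   = 45;        % length of the annual dataset
burnin   = 955;       % periods discarded at the start of each replication
std_mat  = NaN(n_replic, M_.endo_nbr);
for i = 1:n_replic
    % draw shocks consistent with the model's covariance matrix
    shocks = transpose(chol(M_.Sigma_e)) * randn(M_.exo_nbr, burnin + n_keep);
    y_sim  = simult_(M_, options_, oo_.dr.ys, oo_.dr, transpose(shocks), options_.order);
    y_keep = y_sim(:, end - n_keep + 1:end);   % drop the burn-in
    % (apply the same one-sided HP filter as for the data here)
    std_mat(i, :) = std(y_keep, 0, 2);         % per-variable std. dev.
end
mean_std = mean(std_mat, 1);                   % average across replications
```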
Yes, it is normal for variances to fluctuate with the simulation length; they will stabilize for long series. And yes, the model will rarely hit the data moments exactly (unless you targeted those moments; targeted moments should be matched (almost) perfectly if your model is just identified).
Thank you very much professor! Your explanations are so clear.