Shocks in historical decomposition graphs too volatile

Hi there,

I’ve estimated an NK model with housing, and the estimation results seem to be OK. However, I have never managed to understand why the shocks in the historical decomposition graphs look way off the chart. One cannot really expect them to be so volatile in reality, and in my view some of them look serially correlated. Rather, I suspect the model adjusts them in such a way as to fit the smoothed variables. Any idea why this might be happening? One of our co-authors believes it might be due to the lack of capital investment in the model (we’ve only got housing investment, but no capital in the production function), but I’m not quite sure whether that’s really the reason or rather a technical issue.

You can find attached the historical decomposition graphs as well as a pdf file with all possible graphs and tables (mode check plots, convergence plots, variance decomposition table, etc.).

Thank you very much in advance,
Best,
Petar
TABLERUN1.pdf (621 KB)
Historical decomposition graphs.pdf (325 KB)

I would focus on trying to understand what is going on with the mud shock process.

Just look at its mode_check and prior/posterior plots. There must be something wrong, because its autoregressive parameter almost hits a unit root. Did you use

prior_trunc=0?
The large persistence in this shock, visible in the IRFs to e_mud, might also explain why you get such large swings.
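As an aside, the link between near-unit-root persistence and large low-frequency swings is easy to see in a small simulation. This is a standalone Python illustration, not part of the model; all names are mine:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = rng.standard_normal(200)  # i.i.d. innovations

def ar1(rho, eps):
    """Simulate x_t = rho * x_{t-1} + eps_t starting from zero."""
    x = np.zeros(len(eps))
    for t in range(1, len(eps)):
        x[t] = rho * x[t - 1] + eps[t]
    return x

# The unconditional standard deviation is sigma / sqrt(1 - rho^2),
# so it blows up as rho approaches 1: the same innovations produce
# much wider, more slowly mean-reverting swings.
for rho in (0.5, 0.99):
    print(rho, ar1(rho, eps).std())
```

With the same innovation draws, the near-unit-root process wanders far from zero for long stretches, which is exactly what oversized historical-decomposition contributions look like.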

Hi Johannes,
Thank you for the prompt reply. The “muc” and “mud” processes are the cost-push shocks in the two NK Phillips curves, for house prices and for consumption prices, in the spirit of Smets and Wouters (2007). It’s a bit strange that the autoregressive parameter of the shock process in the first Phillips curve is well identified, whereas the one in the second isn’t. I’m not doing anything special with the stochastic processes; they’re all AR(1). I multiply all observables by 100, but that shouldn’t have any impact, I guess.

I haven’t used “prior_trunc=0” so far, nor have I come across it before, but thanks for the suggestion; I will give it a try and see what happens.

Please find attached my Dynare code and an updated table. I have a MATLAB script that generates the LaTeX table, so sometimes the titles don’t quite match; now everything should be all right in the table.

Best,
Petar
modellast1.mod (6.36 KB)
TABLERUN1.pdf (623 KB)

The data is missing.

You should try the LaTeX capabilities of the Dynare unstable version. It automates most of what you are doing.

Sorry, forgot to upload it.

So here’s how I do the variable transformation.

  1. Variables such as output, consumption, real housing investment, and total hours worked: I first take logs, then detrend with the one-sided HP filter, then multiply the cyclical component by 100 and use that as the observable.
  2. The interest rate: I first transform it into a quarterly series, 1 + Rannual/(4*100), then take its log and finally demean it.
  3. Inflation rates: I only demean them.
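Assuming the one-sided HP filter is the standard recursive implementation (the two-sided filter applied to expanding samples, keeping the endpoint each time), the three steps above can be sketched in Python as follows. Function names and the example interface are my own:

```python
import numpy as np

def hp_trend(y, lam=1600.0):
    """Two-sided HP trend: solve (I + lam * D'D) tau = y, D = second differences."""
    T = len(y)
    if T < 3:
        return y.copy()
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(T) + lam * (D.T @ D), y)

def one_sided_hp_cycle(y, lam=1600.0):
    """One-sided HP cycle: filter y[:t+1] for each t and keep the last point."""
    y = np.asarray(y, dtype=float)
    trend = np.array([hp_trend(y[: t + 1], lam)[-1] for t in range(len(y))])
    return y - trend

def make_observables(level_series, r_annual_pct, inflation):
    # 1. log, one-sided HP detrend, multiply the cycle by 100
    obs_cycle = 100.0 * one_sided_hp_cycle(np.log(level_series))
    # 2. quarterly log gross rate, log(1 + Rannual/(4*100)), then demean
    r_q = np.log(1.0 + np.asarray(r_annual_pct, dtype=float) / 400.0)
    obs_r = r_q - r_q.mean()
    # 3. inflation: demean only
    obs_pi = inflation - np.mean(inflation)
    return obs_cycle, obs_r, obs_pi
```

Note that a one-sided cycle computed this way is generally not exactly mean zero in finite samples, which is relevant to the discussion below.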

Hope what I’m doing is OK.

Now I have to figure out which file extensions are allowed. Just a second.

There you go. And again many thanks.
dataset_nz.xls (77 KB)

  1. When I run identification on the model, I get that

is not identified.
  2. You are using a linear model, but the data is not mean zero. Is that intended?

Hi Johannes,

Regarding this issue: I detrend with the one-sided HP filter, so in finite samples the mean won’t be exactly zero, I guess. I raised this with my supervisor, since at first I always demeaned the cyclical component of the series because of this problem. However, he said it shouldn’t be a problem: the shocks then simply won’t be mean-zero, and that’s pretty much it. So I proceeded like that. But I’m still not sure whether that’s OK. What would you recommend?

Best,
Petar

That sounds OK. Given that you multiplied by 100, the mean was small in any case. I was just making sure.