High persistence in estimates

Hello everybody,

I have been estimating my model, and no matter how much I modify the data, I always obtain at least one autoregressive coefficient very close to 1. What I did was truncate the Beta prior at 0.99 so that the coefficient could not exceed this value.
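
In case it is useful for the discussion, a minimal sketch of how such a truncated (generalized) Beta prior can be declared in a Dynare `estimated_params` block; the parameter name `rho_C` and the prior moments are placeholders, not the values from the attached mod file:

```
estimated_params;
  // Generalized Beta prior: the last two entries set the support to [0, 0.99],
  // so draws of the autoregressive coefficient cannot exceed 0.99
  // (prior mean and standard deviation are illustrative placeholders)
  rho_C, beta_pdf, 0.5, 0.2, 0, 0.99;
end;
```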

I reviewed the eigenvalues of the model, and none is exactly equal to 1.

I tried the linear model, the non-linear model, the one-sided HP filter, among other things, and I still get high persistence. What could this problem be due to?

In the last estimation, I used a non-linear model, and as observation equations I used the first differences of several time series. I demeaned the data and ran Dickey-Fuller tests to check for unit roots; none of the series has this problem.
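
A minimal sketch of this kind of stationarity check (assuming MATLAB's Econometrics Toolbox `adftest`; the series name `c_level` is a placeholder):

```
% Sketch: build the growth-rate observable and run an ADF unit-root test
dlc = 100*diff(log(c_level));   % log first difference, in percent
dlc = dlc - mean(dlc);          % demean so the observable has zero mean

% Augmented Dickey-Fuller test: h = 1 means the unit-root null is rejected
[h, pValue] = adftest(dlc);
```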

In this last estimation, three parameters (\rho^C, \rho^L, \rho^P), each the coefficient of an autoregressive process, ended up at the upper bound of the prior, that is, at 0.99.

Searching the forum, I saw that this could be due to errors in the specification of the observation equations.

To make sure this is not the problem, what I did with most of the series was the following:

  1. I divided the time series by the population aged 16 and over.

  2. I adjusted the resulting series in (1) for seasonality.

  3. I took the base-10 logarithm of the series resulting from (2) and multiplied it by 100.

  4. Finally, I subtracted the mean from each observation to obtain zero-mean series.

  5. The specification of the observation equation (for consumption, for example) is as follows (see the sketch after this list):
    C^{obs}_{t} = 100\times \log(C_{t}/C_{t-1})
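
For reference, a minimal sketch of how such an observation equation typically enters a Dynare model block (`C` and `C_obs` are placeholder names, not necessarily those in Estimation3.mod; the data fed to estimation would be the demeaned growth rates described above):

```
varobs C_obs;

model;
  // ... structural equations ...

  // Observation equation: consumption growth in percent; with demeaned data
  // there is no constant term, since log(C/C(-1)) is zero in steady state
  C_obs = 100*log(C/C(-1));
end;
```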

If all these steps were correct, what could be wrong with my model?

I attach the necessary files:

Database.mat (8.7 KB)
Estimation2_mh_mode.mat (20.8 KB)
Estimation3.log (12.0 KB)
Estimation3.mod (24.1 KB)
Estimation3_mode.mat (20.7 KB)

  1. You must use the natural logarithm, not the base 10 one.
  2. Have you tried comparing the means and standard deviations implied by the model, simulated with what you consider a sensible calibration, to the moments of the data you are feeding in? That might give you a clue as to why a unit root is favored by the data (see the sketch below).
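
A minimal sketch of such a comparison, assuming the mod file has been run at the calibration with `stoch_simul` so that the theoretical moments are available in `oo_` (the ordering follows the variable list given to `stoch_simul`), and that `C_obs_data` is the observable series fed to estimation; all names are placeholders:

```
% After running, e.g., stoch_simul(order=1, irf=0) C_obs; at the calibration,
% oo_.mean and oo_.var hold the theoretical mean vector and covariance matrix.
model_mean = oo_.mean(1);
model_std  = sqrt(oo_.var(1,1));

data_mean  = mean(C_obs_data);   % C_obs_data: placeholder for the data series
data_std   = std(C_obs_data);

fprintf('C_obs mean: model %6.3f vs data %6.3f\n', model_mean, data_mean);
fprintf('C_obs std : model %6.3f vs data %6.3f\n', model_std,  data_std);
```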

I did the exercise you recommended, and apparently the public debt is the problem, in particular the parameter associated with this relationship:

\frac{1 + i^*_{t}}{1 + i_{ss}} = (\frac{D_{t}}{D_{ss}})^{\kappa}\times \exp(\epsilon_{t})

Following the SGU (2003) model, this parameter takes very small values (0.0001). However, this does not work with my data, so I moved the mean of the prior distribution closer to 0.01. With this, at least one of the autoregressive coefficients is no longer 0.99 but 0.983.
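
For context, this is roughly how that relationship would look as a Dynare model equation (a sketch; `i_star`, `i_ss`, `D`, `D_ss`, `kappa`, and `eps_i` are placeholder names mirroring the formula above):

```
model;
  // ...
  // Debt-elastic interest rate a la SGU (2003): the premium over the
  // steady-state rate increases with debt relative to its steady state
  (1 + i_star)/(1 + i_ss) = (D/D_ss)^kappa * exp(eps_i);
end;
```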

I appreciate your comment. However, I have another doubt: the AR(1) coefficients of the other two processes have a value of 0.00 at the posterior mode, but at the posterior mean they are approximately 0.985. Is this still a problem?

  1. Why don’t you try to estimate that parameter?
  2. Is there a theoretical justification for that particular equation? SGU (2003) was about technical tricks to get the model stationary.
  3. If the mode and the mean differ so much, there is usually a convergence issue. Have you had a look at the trace plots? (A sketch of points 1 and 3 follows below.)
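
A minimal sketch of points 1 and 3, assuming the premium parameter is called `kappa` in the mod file; the prior below is only a placeholder:

```
estimated_params;
  // Point 1: estimate the debt-elasticity parameter instead of fixing it
  kappa, gamma_pdf, 0.01, 0.005;
end;
```

For point 3, after the MCMC has run, the chains can be inspected with Dynare's `trace_plot` utility from the MATLAB prompt, e.g. `trace_plot(options_, M_, estim_params_, 'DeepParameter', 1, 'kappa')`; the exact signature may differ across Dynare versions.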