Nonlinear ML estimation with Dynare 4.6.1

Dear Dynare team, Dear all,

First thanks for all the work done.

I am trying to simulate a simple one-equation model with a nonlinear (cubic) term, and then use the simulated data to estimate the model at order 3 using ML.
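For readers without the attachment, the set-up is of this kind (a purely illustrative sketch with made-up parameter names and values; the actual files are in the Archive.zip attached below):

var x;
varexo e;
parameters rho theta;

rho = 0.9;      // placeholder values, not the ones in Order3Simulate.mod
theta = -0.1;

model;
x = rho*x(-1) + theta*x(-1)^3 + e;
end;

shocks;
var e; stderr 0.01;
end;

stoch_simul(order=3, periods=120, irf=0);   // the simulated series is then saved to DataOrder3.mat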

Using Dynare 4.6.1 and Matlab R2020a, I am unable to get the nonlinear estimation started. The error message is:

ESTIMATION_CHECKS: There was an error in computing the likelihood for initial parameter values.
ESTIMATION_CHECKS: If this is not a problem with the setting of options (check the error message below),
ESTIMATION_CHECKS: you should try using the calibrated version of the model as starting values. To do
ESTIMATION_CHECKS: this, add an empty estimated_params_init-block with use_calibration option immediately before the estimation
ESTIMATION_CHECKS: command (and after the estimated_params-block so that it does not get overwritten):

I am attaching Order3Simulate.mod that simulates the model and stores the simulation in DataOrder3.mat (attached) and Order3Estimate.mod that estimates the model first as a linear model and then as an order 3 model.

Any hint on what is going wrong (I cannot think of a simpler example)?

Thanks
Franck

Archive.zip (6.4 KB)

Hi, and welcome to the community! ;o)
You need to add a measurement error to your measurement equation, and then an extra parameter to estimate (its variance). This is necessary to compute the likelihood in the nonlinear framework.
You can do that by adding the line
stderr x, 1.0;
in the estimated_params block,
and writing, after the block:
varobs x;
Remember to delete any other varobs statement.
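Concretely, the estimation part of the .mod file would then contain something like this (the structural parameter and shock names are only placeholders for whatever is in Order3Estimate.mod):

estimated_params;
rho, 0.9;          // structural parameters already being estimated (placeholders)
theta, -0.1;
stderr e, 0.01;
stderr x, 1.0;     // the extra measurement-error standard deviation on the observable
end;

varobs x;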
It will work with order=2 at least (I checked).
With order=3, it crashes on my Matlab (2016), so I have to check further.
Best,

The problem with order=3 seems to be related to the dynare++ library used to compute the reduced-form solution at orders higher than two (something is missing in the dr structure). I don’t really understand how the other examples we have can work… I cannot investigate further right now.
Best,
Stéphane

Hi Fred,

thanks a lot.

Indeed, adding a measurement error on x allows the code to go through the order-3 nonlinear estimation (then the Hessian is not positive definite at the mode, but that is another story; I used a short sample (120 observations) to speed up the computation).

But it is weird that one needs to add measurement error: that should be the modeller’s choice. I am guessing that something is off somewhere else.

Thanks again

Franck

Hi Stéphane,

thanks a lot for your quick answer.

I believe that the “model” I have chosen is not pathological, so there must be something fishy somewhere in the algorithm.

Let me know if at some point you go back to that problem.

ciao

Franck

Hi Franck,

Regarding the measurement errors: that’s unfortunate, but it’s a constraint of the nonlinear filters. We need as many measurement errors as observed variables (while in the linear case the condition for a non-singular likelihood is to have at least as many errors, structural or measurement, as observed variables). The literature does not insist much on this; I guess it’s not a problem for statisticians, but it’s quite unpleasing for us (especially if these errors are found to be large).

An alternative would be to estimate the model by NLS, but this is not yet implemented in Dynare. I haven’t managed to deal with the multiplicity of the residuals yet (when order>1)… However, it may be that in your model multiplicity is not an issue, because the innovation enters the model linearly. I am not sure; I would need to look at the reduced form.
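A rough way to see where the constraint comes from (textbook bootstrap particle filter notation, not a description of Dynare’s internal code): with a measurement equation $y_t = g(x_t) + \varepsilon_t$, $\varepsilon_t \sim \mathcal{N}(0, \Sigma_\varepsilon)$, the particle weights are

$$ w_t^{(i)} \propto p(y_t \mid x_t^{(i)}) = \mathcal{N}\!\left(y_t;\; g\big(x_t^{(i)}\big),\; \Sigma_\varepsilon\right), $$

so $\Sigma_\varepsilon$ has to be a non-singular covariance matrix of the same dimension as $y_t$: with fewer measurement errors than observables the density is degenerate and the weights are zero for (almost) every particle. In the linear Gaussian case, by contrast, the Kalman filter only requires the forecast-error covariance of $y_t$ to be non-singular, which holds as long as there are at least as many shocks, structural or measurement, as observables.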

Best,
Stéphane

Ok, so if it works at order 3 for you, that means there is something wrong elsewhere… Thank you.

Concerning the Hessian, it is not surprising: if you run mode_check, you will see the specificity of nonlinear estimation due to the resampling step of the particle filter. It induces a non-smooth likelihood estimator, so you cannot rely on derivative-based optimization algorithms. We added a message for that, but I don’t think anything more has been done there. One piece of advice: increase the number of particles to dampen the non-smoothness and ease convergence (at the cost of extra computational time). Otherwise, use the nonlinear Kalman filter (there is no resampling, and hence smoothness).
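For instance, the two options map into the estimation command roughly as follows (option names are those of the particle-filter options in the Dynare reference manual; the particular values are only illustrative, so please check against your version):

// more particles to dampen the non-smoothness of the simulated likelihood
estimation(datafile=DataOrder3, order=3, mode_compute=8, number_of_particles=20000);

// or a nonlinear Kalman filter instead of the default sequential importance sampling
// (no resampling step, hence a smooth likelihood); mode_compute=8 is a derivative-free simplex
estimation(datafile=DataOrder3, order=3, mode_compute=8, filter_algorithm=nlkf);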
Lastly, concerning the measurement error, I agree with your point, but it is a technical condition for writing the likelihood and being able to perform the estimation through ML. Other solutions exist, but then it is not ML anymore…
Best,

Fred,

Concerning the Hessian, it is not surprising: if you run mode_check, you will see the specificity of nonlinear estimation due to the resampling step of the particle filter.

Okay. Good.

Thanks!

Franck

Stéphane,

Thanks.

Regarding the measurement errors: that’s unfortunate, but it’s a constraint of the nonlinear filters. We need as many measurement errors as observed variables (while in the linear case the condition for a non-singular likelihood is to have at least as many errors, structural or measurement, as observed variables). The literature does not insist much on this; I guess it’s not a problem for statisticians, but it’s quite unpleasing for us (especially if these errors are found to be large).

Not obvious to me but this is my ignorance. I’ll dig deeper into it.

However, it may be that in your model multiplicity is not an issue, because the innovation enters the model linearly

My “model” was just a simple device to understand how things work; there is no deep meaning in it.

Thanks again. You guys are truly doing an amazing job.

Franck

@FpjPortier If you have a look at

You will see that you can do without measurement error (Section 2.2/2.3), but that it prevents the software from being easily generalizable to different models and types of particle filters.

Thanks Johannes for the reference.

You will see that you can do without measurement error (Section 2.2/2.3), but that it prevents the software from being easily generalizable to different models and types of particle filters.

It makes a ton of sense.

Franck