Some questions about dynare++ 1.3.5

Dear all,
I have switched from dynare++ 1.3.3 to 1.3.5 and I have a couple of questions.

  1. When I used dynare++ 1.3.3, I didn’t need to specify initial values for the varexo variables. With the same program under dynare++ 1.3.5, I get the following error message:

dynare++: warning: initval for <u_a_e> is not finite
Caught Dynare exception: nlsolve.cpp:188: Initial guess does not yield finite residual in NLSolver::solve

So, am I right in saying that I now need to specify
u_a = 1; in initval, where u_a is the variable in varexo?

  2. If I run the same program (named for instance my_pgrm.mod) with the two versions of dynare++, the matrix of results my_pgrm.mat does not have the same structure.
    Precisely, I have fewer MULT and AUX variables in dynare++ 1.3.5 than in dynare++ 1.3.3, and thus the size of dyn_vars is not the same. In addition, the order of the state variables in dyn_state_vars differs depending on the version used.
    I don’t understand why I have these differences.

Finally, I have a question which is not related to the version of dynare++:

  3. I would like to simulate my Ramsey model by using the command dynare_simul. Precisely, I’m interested in a stochastic simulation. According to the dynare_simul help, I have to write

shocks = zeros(nbshock,100)./0; % put NaNs everywhere
r = dynare_simul('your_model.mat',shocks);
and it is stated that
% NaNs and Infs in the matrix are substituted by draws from the normal distribution using the covariance matrix given in the model file.
But in my case, I would like to always get the same random draw. Is it possible to specify an option in order to obtain the same draw, in the spirit of the command randn('state',12345)?
When I use
shocks = randn(nbshock,100);
r = dynare_simul('your_model.mat',shocks);
the program runs, but the entries of r are very large values from periods 1 to 9, period 10 is infinite, and the remaining periods are zeros…

Thanks a lot for your help.

Ad 1: Please send me (or better, post to the forum) your model file. I will look at it; it looks like a bug.

Ad 2: If version 1.3.5 generates a different set of auxiliary variables and multipliers, then everything is different (in terms of vectors and their orderings). The question is whether the results are the same. I think I corrected an error between 1.3.3 and 1.3.5 regarding forward-looking variables and the Lagrange multipliers, so things may have changed.

Ad 3: Your way of generating shocks with a given seed is perfectly OK provided that the variance-covariance matrix is the identity matrix. So there are two possible errors: either you forgot to multiply the shocks matrix by a factor of the vcov matrix (for example its Cholesky factor), or your vcov matrix is the identity (in which case your Matlab code is correct) but your model is non-stationary.
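Putting the seed and the Cholesky scaling together, a minimal sketch might look like this (assuming the shock covariance matrix is stored as dyn_vcov in the .mat file, as in the result matrices mentioned above; the file and variable names are illustrative):

```matlab
% Fix the random seed so every run produces the same draw.
randn('state', 12345);

% Load the shock covariance matrix from the model's .mat file.
load('your_model.mat', 'dyn_vcov');

nbshock = size(dyn_vcov, 1);   % number of exogenous shocks
nper    = 100;                 % number of simulation periods

% Scale unit-variance draws by a Cholesky factor of the vcov matrix,
% so the simulated shocks have the covariance given in the model file.
shocks = chol(dyn_vcov)' * randn(nbshock, nper);

r = dynare_simul('your_model.mat', shocks);
```

If dyn_vcov really is the identity, the chol step is a no-op and the plain randn draws are already correct.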

Ondra K.

Thanks Ondra for your reply,

  1. Please find attached a simple example for question #1.
  2. I checked the result matrices (dyn_g_1, dyn_g_2, dyn_ss, dyn_vcov): their values do not change with the dynare++ version, but there are fewer rows in these matrices when you use version 1.3.5. If you use your example file kp1980_2.mod, you will see that there are no AUX variables at all with version 1.3.5.
  3. Thanks for your advice, you are right, I forgot the Cholesky factor; now it’s working.
    NKmodel.mod (6.2 KB)


Ad 1: I confirm this is a bug of dynare++ 1.3.5. I redesigned the automatic substitutions from version 1.3.3 to 1.3.5 because of another bug in optimal policy with forward-looking variables with multiple leads. Unfortunately, I tested only more complex input files that had initval settings for the exogenous variables, so the bug you are pointing out was not revealed.

The obvious workaround for your case is to set the exogenous variables to zero in the initval section. I will correct this in version 1.3.6; setting exogenous variables to zero will no longer be necessary from then on.
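For example, with an exogenous variable u_a as in the original question, the workaround in the .mod file would look like this (a sketch; only the u_a line is taken from the thread, the rest stands in for your existing initval entries):

```
initval;
u_a = 0;  // explicitly initialize each varexo variable to zero
// ... your existing initval entries for the endogenous variables ...
end;
```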

May I include your model file in the tests directory and make it part of the distribution?

Many thanks,

Ondra K.