Preprocessor fails in large system

I am trying to solve a large system of log-linearized equations (essentially the log-linearized FOCs of a large multi-sector RBC model; model and data are here). The model works fine on smaller samples. However, with 1662 equations and ~1,800,000 pre-defined parameters, the preprocessor runs into trouble:

Using 64-bit preprocessor
Starting Dynare (version 4.5.4).
Starting preprocessing of the model file ...
Found 1662 equation(s).
Evaluating expressions...done
Computing static model derivatives:
- order 1
"/usr/lib/dynare/matlab/preprocessor64/dynare_m" KW_replic_reduced.mod minimal_workspace: Killed

Error using dynare (line 225)
DYNARE: preprocessing failed

I am wondering:

  1. Is this simply a problem of computational capacity, or could it also be due to the parameterization?
  2. It seems to me that RAM is the limiting factor. Would rerunning it on a more powerful machine solve the problem (I am running it on Linux with 16 GB of RAM)?
  3. Is there anything else I can do about it, apart from reducing the size of the model?

32 GB of RAM on my machine also does not seem to be sufficient. My guess is that it is indeed a RAM problem.


Hi,

Preprocessing your mod file with the unstable version of Dynare, I had to kill the process, which was still running after more than 15 minutes. The process was not using a lot of memory, so I do not think the computer's specs are the problem. I then tried reducing the dimension of the problem (ny=100 and nx=50) and obtained a segmentation fault… So it is more likely that we have a bug in the preprocessor. We are looking into it.

Best,
Stéphane.

@stepan-a On Windows, I hit the upper memory limit.

I could not reproduce the error found by @stepan-a. On macOS, the small model (ny==100, nx==50) goes through without a problem, creating a 41 MB driver.m file from a 39 MB macro-expanded .mod file.

I am running the original model provided by @lbb on one of our servers (with 24 GB of RAM) to see if it finishes preprocessing there (though I imagine it won't, since it broke on @jpfeifer's machine, which had 32 GB of RAM).

Many thanks for looking into this.

On Linux, a medium-size model (ny==216, nx==108) goes through without problems as well. I am currently running the large model on my school's Linux server; it manages the preprocessing and has been busy with 'MATLAB/Octave computing' for several days now.

Dear @lbb,

I had a look at this issue, and I could not identify a bug in the preprocessor.

The issue is rather that you are trying to solve a class of problems for which the preprocessor is not optimized, and it therefore takes a lot of time.

Essentially, you have a model written in matrix form, in which all variables appear in all equations. Therefore your equations are very large, and the preprocessor has difficulty handling them.

The preprocessor is rather optimized for models in which only a small number of variables enter each equation, as is typically the case with structural economic models (and it can easily handle several thousand such equations).
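To make the contrast concrete (a stylized illustration, not a description of your exact file): a model stacked in matrix form reads A*E_t[x(t+1)] = B*x(t), with x(t) a vector of all 1662 variables, so each scalar equation expands into on the order of 1662 terms, and computing the first-order derivatives means differentiating roughly 1662^2 ≈ 2.8 million symbolic terms. In a typical structural model, each equation involves only a handful of variables, so the expression trees stay small no matter how many equations there are.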

Optimizing the preprocessor for your problem would require a substantial amount of work, redesigning several algorithms, so it is not going to happen soon (if ever).

I would rather encourage you to take another approach that bypasses the preprocessor entirely, since you don't actually need it: your model is already in matrix form!

I think you should instead construct those matrices directly in MATLAB/Octave and then apply a first-order rational expectations solver to them. There are several toolboxes out there that can do this, and it is even relatively easy to write one yourself using the QZ decomposition; see the sketch below.
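For what it's worth, here is a minimal sketch of such a solver in the spirit of Klein (2000), assuming the model has been stacked by hand as A*E_t[x(t+1)] = B*x(t) with the nk predetermined variables ordered first in x. The function name and all variable names are illustrative, not part of Dynare:

```matlab
function [P, F] = solve_lre(A, B, nk)
% Minimal first-order solver for A*E_t[x(t+1)] = B*x(t) (Klein, 2000).
% x(t) = [k(t); u(t)], where the first nk entries are predetermined.
% Returns the transition k(t+1) = P*k(t) and the policy u(t) = F*k(t).
n = size(A, 1);

% QZ (generalized Schur) decomposition: Q*A*Z = S, Q*B*Z = T.
[S, T, Q, Z] = qz(A, B, 'complex');

% Stable generalized eigenvalues satisfy |T(i,i)/S(i,i)| < 1;
% reorder the decomposition so the stable block comes first.
stable = abs(diag(T)) < abs(diag(S));
[S, T, ~, Z] = ordqz(S, T, Q, Z, stable);

% Blanchard-Kahn: # stable roots must match # predetermined variables.
if sum(stable) ~= nk
    error('Blanchard-Kahn conditions are not satisfied.');
end

Z11 = Z(1:nk, 1:nk);      % assumed invertible (rank condition)
Z21 = Z(nk+1:n, 1:nk);
S11 = S(1:nk, 1:nk);
T11 = T(1:nk, 1:nk);

F = real(Z21 / Z11);                % u(t)   = F*k(t)
P = real(Z11 * (S11 \ T11) / Z11);  % k(t+1) = P*k(t)
end
```

The dense QZ step is the only expensive part, and for n = 1662 it is a routine LAPACK call that should take seconds to minutes, with no symbolic differentiation involved. Shock loadings can be added along the same lines, since certainty equivalence holds at first order.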


Many thanks for looking into this, @sebastien! I will revert to an LRE MATLAB implementation in matrix form, as you suggest.