Shock_decomposition with use_shock_groups, scale_file

  1. I want to use shock_decomposition with the use_shock_groups option in order to get graphs with the groups I specify. I am writing code similar to the example in the Dynare 4.5.0 manual:

shock_groups(name=group1);
supply = e_a, e_b;
'aggregate demand' = e_c, e_d;
end;
shock_decomposition(use_shock_groups=group1);

but I get graphs with the individual varexo shocks, while I would expect graphs with my groups. I attach my files.

  2. I want to use the scale_file option with my file named prometheo_optimal_mh_scale. To do this, I put:
posterior_sampler_options = ('scale_file',prometheo_optimal_mh_scale)

inside the estimation command, but I get the error message: syntax error, unexpected NAME

  3. Also, several of my varobs have a trend with zero mean. My question is whether it would be better to use stationary variables, in my case my varobs in first differences?

  4. I attach a .png image of the output with an error after running my .mod file in MATLAB 2016 and MATLAB 2017. Could you please tell me what my error might be?
    imagen.rar (200 KB)
    mymod.rar (145 KB)

  1. I still need to look at this in detail.
  2. It should be

posterior_sampler_options = ('scale_file','prometheo_optimal_mh_scale')

i.e. a quoted string.
  3. Please elaborate. I do not understand the point.
  4. In your steady state file, you already have

mu = 0;    % These initializations are needed because mu, beta and alpha
beta = 0;  % are functions of a Matlab toolbox!
alpha = 0; %
The same applies to sigma as well. Thus, add

sigma=0;
here.

  5. I think data_ldiffU still has a big seasonal pattern.

Thanks a lot dear jpfeifer.

Thanks again, I will wait.

  3. I have some observed variables, specifically the interest rate on household loans, the rate on loans to entrepreneurs, the deposit rate, and the probability of default (delinquency), which have trends. These variables enter the equations of my model correctly, so my question isn't how to relate them, but rather whether it is correct to include data with trends.

What would be the advantage of including these variables in first differences? Should I remove the trends using some filter like the HP filter?

A million thanks

Cheers

Aldo

  1. The first issue is a bug in 4.5. Use
options_.plot_shock_decomp.use_shock_groups='group1';

to manually set the option.

Thanks dear jpfeifer,

Please don’t forget my other question:

[quote]3. I have some observed variables, specifically the interest rate on household loans, the rate on loans to entrepreneurs, the deposit rate, and the probability of default (delinquency), which have trends. These variables enter the equations of my model correctly, so my question isn't how to relate them, but rather whether it is correct to include data with trends.

What would be the advantage of including these variables in first differences? Should I remove the trends using some filter like the HP filter?[/quote]

Thanks

What do you mean by “trend”? Interest rates should usually be stationary.

I am studying the Peruvian economy, where there has been an increase in bank competition, and my data on loan and deposit interest rates have been decreasing since 2002; when I plot these series, the fitted lines have negative slopes. The same goes for the delinquency rate (which I use as a proxy for the probability of default).

I am thinking of removing these slopes (“trends”) using some filter, or of using first differences.
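To make the two options concrete, here is a minimal Python/NumPy sketch (the toy series and variable names are my own illustration, not the attached data): first differencing turns a linear trend into a constant mean shift, while linear detrending removes the fitted slope and keeps only deviations around it.

```python
import numpy as np

# Toy "interest rate" with a deterministic downward trend of -0.1 per
# period (purely illustrative, not the actual Peruvian data).
t = np.arange(20.0)
r = 10.0 - 0.1 * t

# Option 1: first differences. A linear trend becomes a constant, so the
# differenced series is flat at -0.1 rather than mean zero.
dr = np.diff(r)

# Option 2: linear detrending. Fit a line in t and keep the residual;
# for this noiseless toy series the residual is exactly zero.
slope, intercept = np.polyfit(t, r, 1)
r_detrended = r - (slope * t + intercept)
```

Note that the two transformations imply different observation equations in the model, so the choice is not purely a data question.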

I attach these data as a sample.
data_sample.rar (21.8 KB)

There is no good answer here. It very much depends on what you are trying to model. There is indeed a downward trend in these data, while the model tells you they should be stationary. Thus, if you do not remove the trend, you will force the shocks to account for it. That trend may be related, for example, to a decrease in inflation rates. Selectively taking out some trends in the data will purge the underlying effects from these series, but there may be undesirable interactions: the low interest rates may have consequences for output, yet there you do not remove a commensurate trend. Central banks are therefore currently adding features to their models to let the model endogenously account for such downward trends, instead of trying to filter the data.

Thanks a lot dear jpfeifer.

Cheers

Aldo