Posterior predictive checks

Hi Prof. Pfeifer and all,

I am trying to compare two models that are estimated on different datasets (different in the sense that one model uses an extra observable series that is not included in the other. Example: model 1 is estimated on observable series y, plus some other observables, while model 2 is estimated on observables x and z, where x + z = y, and the rest of the observables are the same).

I am interested in some model fit tests and have read the only relevant post I could find on the topic (link below).

My questions are :

  1. Can one run these posterior predictive checks manually by using any of the ‘’ forecast ‘’ commands of dynare ?
  2. Does the new unstable version include any routines that allow to run any posterior predictive checks ?

Many thanks in advance!

Which exact posterior predictive check do you have in mind?

Honestly, I am not quite sure I have it right.
What I have done is:
I have estimated the model.
I have compared the second moments of the data to the second moments in oo_.var (are these moments based on the mode or on the posterior mean?)
What I was thinking of is based on a passage I read in a paper:

It seems the author makes a forecast of the endogenous variables and calculates their confidence intervals as well as drawing simulations.
Initially I thought I could do that with the SMOOTHED variables, but they turn out to be exactly the same as the observed variables.
From the Dynare documentation, none of the forecast commands includes a "burn-in" option.
Is there a way to do a similar exercise, please?

Many thanks for the replies, Professor!

That quote indicates that you have to simulate the model for a given posterior draw and then compute the unconditional second moments. The easiest way to do this is most probably to use the posterior_function command and, within the posterior function, call stoch_simul with the respective options for simulation. An example can be found at tests/prior_posterior_function · master · Dynare / dynare · GitLab
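
For concreteness, a minimal sketch of how the call might look in the .mod file. The function name posterior_function_demo and the number of draws are only illustrative, and the exact option names (function, sampling_draws) should be checked against the posterior_function entry in the manual of your Dynare version:

% after the estimation(...) command has produced the MCMC draws,
% run posterior_function_demo.m once per sampled posterior draw
posterior_function(function='posterior_function_demo', sampling_draws=500);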

Many thanks for the guidance, Professor.

I am trying to figure out where the "stoch_simul" command is placed.
I read the documentation for the posterior_function command in the "9 Dynare misc commands" section of "dynare.pdf", quoted below:

It does not say that "stoch_simul" is a command used WITHIN this command.

Also, from the example you mentioned in the link (on GitLab):

… the "stoch_simul" command is not used WITHIN the "posterior_function" command.

So the way I understand your advice is to use the "stoch_simul" command AFTER the "posterior_function" command (not WITHIN it),

i.e.:

Did I get that right, Professor?
Or should I insert the "stoch_simul" command INSIDE the

?
Many thanks again for your time, Professor!

Inside the posterior_function_demo you would need something along the lines of:

% set the parameter draws in the model structure
M_ = set_all_parameters(xparam1,estim_params_,M_);
% recompute the steady state and decision rules for the parameter draw written to M_
% (without this, simult would use the decision rules from the original estimation)
[oo_.dr,info,M_,options_,oo_] = resol(0,M_,options_,oo_);
options_.noprint=1;    % suppress printed output for each draw
options_.drop=100;     % burn-in periods discarded from the simulation
options_.periods=300;  % total number of simulated periods
var_list=char('gp_obs','gy_obs');
% simulate the model for this draw and compute the simulated moments
[ys, oo_] = simult(oo_.dr.ys,oo_.dr,M_,options_,oo_);
oo_=disp_moments(ys,var_list,M_,options_,oo_);

% store the simulated means in the output cell
output_cell{1,1}=oo_.mean;

where you have to adjust options_.periods to options_.drop plus the number of data points you want to simulate. Moreover, you need to adjust the statistic you want to read out. In the example, it is the mean.
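
For example, a hedged sketch of how one could store additional objects from each draw in posterior_function_demo.m; this assumes that disp_moments also fills oo_.var with the simulated covariances, and you would adjust the stored statistics to whatever you need:

% after the call to disp_moments in posterior_function_demo.m
output_cell{1,1}=oo_.mean;  % simulated means of the variables in var_list
output_cell{1,2}=oo_.var;   % simulated (co)variances, if second moments are needed (assumption: filled by disp_moments)
output_cell{1,3}=ys;        % the simulated series themselves for this draw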

Well,

I would set:

options_.drop = some number like 100
options_.periods = the number of data points I want to simulate (to replicate my observables)

for example:

I hope I got you right.
I still do not see where the "stoch_simul" command is written?

Many thanks again

Yes, you got that right. And no, you don’t need a stoch_simul command here, because I extracted the relevant lines of code doing the simulation and moment computation. Please be aware that this piece of code will not work if you use the loglinear option.

Many thanks for your contribution, Professor.
I am not quite sure I have done it right, though. I basically COPIED the command lines you mentioned INSIDE the file "posterior_function_demo.m" ("posterior_function_demo.m" file ATTACHED).

I gave it a try by loading around 160,000 MH replications that I had run before and then triggering the "posterior_function_demo.m" file
by the command

which comes AFTER the "estimation(…)" command.

Although I get some CIs (confidence intervals), I do NOT get any simulated endogenous variables.
I also get the following error:
Clearly I have not implemented it the right way…
Could I be doing something wrong?
posterior_function_demo.m (4.22 KB)

Please provide all files.

Professor,

the zipped folder is way too big (ca. 700 MB) due to the metropolis folder.
It would not get uploaded;
Gmail does not take anything greater than 25 MB.

I have deleted the "metropolis" folder and the *_results.mat file.

So you should be able to run some 3,000 MH replications, and hopefully it will not take much time. Many thanks again!

I was not able to replicate the crash when I ran 3,000 draws. Rather, everything worked as expected. In oo_.posterior_function_results I get the stored variable means from the simulation for each draw.

Many thanks, Professor.

  1. It works for me as well now. I think it required that I run the MH replications from scratch rather than load the existing ones as I did yesterday.

  2. Regarding the output: I am trying to understand the output I get in

I am getting 500 cells, each with 9 values inside. Probably this means 500 draws times the 9 observables I specified in

Is my reading of the results right, please?

  3. If so, what kind of interpretation can I make? Does it mean that I can draw the confidence intervals myself based on these 500 draws for each observable (so that I get their average, i.e. the mean, and confidence intervals)?

  4. Don’t I get the simulated variables as specified in the option?

  5. Do I get a confidence interval?

I am struggling to understand what output I am getting.

What you get in oo_.posterior_function_results is the output provided by posterior_function_demo.m, i.e.

output_cell{1,1}=oo_.mean;

This is the mean of the variables specified in var_list in that function. If you need other output, you have to adjust that function. Simulated series, for example, are in ys.

  1. Yes, you can compute the credible set yourself using that output. Note that it is not a confidence interval, because this is a Bayesian and not a frequentist exercise.
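
A minimal sketch of how that could be done after the run, assuming oo_.posterior_function_results is a cell array with one row per posterior draw and the stored vector of means in its first column (the variable names below are purely illustrative):

results = oo_.posterior_function_results;
ndraws  = size(results,1);
nvars   = length(results{1,1});
draws   = NaN(ndraws,nvars);
for ii = 1:ndraws
    draws(ii,:) = results{ii,1}(:)';          % one row of stored means per posterior draw
end
post_mean = mean(draws,1);                     % posterior mean of the simulated means
sorted    = sort(draws,1);
lower_90  = sorted(max(floor(0.05*ndraws),1),:);  % 5th percentile across draws
upper_90  = sorted(ceil(0.95*ndraws),:);          % 95th percentile, giving a 90% equal-tailed credible set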

Hi Professor Pfeifer,

Earlier you helped with the file "posterior_function_demo.m"
to simulate the endogenous variables with the estimated parameters.
It was quite helpful, and I could finally use it; thank you for that.

Is there a way to obtain the second moments of HP-FILTERED variables instead, by using the option:

inside the "posterior_function_demo.m" file? It did not work when I tried it.

Many thanks in advance!

Hi Professor Pfeifer!
As a replacement for my last post, I have a slightly different thought.

As an update, I have been thinking of using "posterior_function_demo.m" to combine the posterior distributions from two different estimations, but
(with my not-so-good knowledge) I have not come up with a solution.

What I have in mind is that I draw a sample of, say, 100 simulated series, but I want the posterior for some (one or two) parameters to come from a different estimation
rather than from my baseline estimation; let's say a different "M_ = set_all_parameters(xparam1,estim_params_,M_);" from another estimation.

For example, I would like the parameters "alpha" and "beta" to come from a different estimation_1 via "M_ = set_all_parameters(xparam1,estim_params_,M_);", so that I overwrite
the mean, mode and credible interval that I have in my baseline estimation_0 for these two parameters.

Is this possible by manipulating "M_ = set_all_parameters(xparam1,estim_params_,M_);"?

Many thanks in advance, Professor!

That is not easily possible. The way to go here would be to use the posterior_sampler to return the posterior parameters from both estimations. Then you need to write your own function that loops over these parameters, combines them, and then loops over setting them and running the desired command to compute the posterior objects.

Many thanks for your patience, Professor!
Following your post, I checked the "posterior_sampler.m" function. As you said, it is involved.
It is a good exercise, though.

When/if you get some free time, may I ask:
what would be the "TargetFun" and "ProposalFun" in this case (from the list of inputs of "posterior_sampler" below)?
I am trying to see how I can do the first step only, i.e. recall the posterior parameters from both estimations.

I did not mean to imply changing that function. That is most probably too involved for you. Rather, that function executes the simulation with the posterior_function_demo you wrote. What I meant is returning the posterior draws within posterior_function_demo:

output_cell{1,1}=xparam1;

You can do this for the two estimations you ran separately. That gives you the posterior draws. Afterwards, you can loop over these draws, e.g. as described in the post "Loop over parameters".
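
A rough sketch of what such a loop could look like, under the assumption that the returned draws from the two estimations have been stacked into matrices draws_baseline and draws_alternative (one row per draw, parameters ordered identically in both estimations), and that alpha and beta sit at known positions in xparam1; the names and indices are purely illustrative:

% hypothetical: draws_baseline and draws_alternative are [ndraws x nparam] matrices
% built from output_cell{1,1}=xparam1; returned by posterior_function_demo for each estimation
ndraws   = min(size(draws_baseline,1), size(draws_alternative,1));
idx_swap = [1 2];                          % illustrative positions of alpha and beta in xparam1
sim_means = NaN(ndraws, size(var_list,1));
for ii = 1:ndraws
    xparam1 = draws_baseline(ii,:)';
    xparam1(idx_swap) = draws_alternative(ii,idx_swap)';   % overwrite alpha and beta with the other estimation's draw
    M_ = set_all_parameters(xparam1,estim_params_,M_);
    [oo_.dr,info,M_,options_,oo_] = resol(0,M_,options_,oo_);  % recompute decision rules for the combined draw
    if info(1)==0
        [ys, oo_] = simult(oo_.dr.ys,oo_.dr,M_,options_,oo_);
        oo_ = disp_moments(ys,var_list,M_,options_,oo_);
        sim_means(ii,:) = oo_.mean';       % store whatever statistic you are after
    end
end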

Apologies for the late reply, Professor. I have not been around these days.

Before I try looping over them (I have to go over the details of the link you gave), I just want to make one thing clear to myself.
When I draw a sample of (let's say) 1000 draws using the

(after the estimation command), I basically get 1000 simulated variables Y, C, etc.
Now my question is: am I drawing these draws over the distribution of the estimated parameters or over the distribution of the shocks?
(I wonder if the question is formulated right!)
(To make myself clearer:) how does the "sampler" obtain the 1000 different simulated Y's that I get? By "drawing"
over the distribution of the respective parameters (computed with the MH)?
Or over the distribution of the shocks (the shock distribution is again computed via MH in this case, unlike in a calibrated model)?
Or over both of these (as both the parameters and the sizes of the shocks are treated as parameters during the estimation and their distribution is computed via the MCMC procedure)?

P.S. I would assume the last one! Is that right, please?