I am on a steep learning curve with Bayesian estimation. I am trying to compare the performance of my DSGE model with that of competing BVAR models.
For DSGE I get:
ESTIMATION RESULTS (from DSGE)
Log data density is 399.099665.
For BVAR models up to 8 lags the marginal log density ranges from 875.2799 to 1016.
What can I conclude about the performance of my DSGE vis-à-vis the BVARs? Is my DSGE better or worse than the competing BVARs?
I am confused because I was expecting to see a negative marginal likelihood for both the DSGE and the BVAR, as in the classic Smets and Wouters paper, but I only get a positive log data density. How do I interpret these numbers?
I use this command for BVAR. Am I missing any option?
estimation(datafile = data4_QE, mh_replic=3000, mh_jscale = 1.3,
first_obs=9) x, lg, lh, cy, iy, gy, ygrow,
hgrow, kgrow, infl, ib, kh, rk, ret, dep_y, mb;
Thanks in advance for any help.
They have the same interpretation. There is no reason the numbers should be negative. Assume you have only one observation, which is normally distributed with mean 0 and standard deviation 0.1, and you evaluate the log marginal data density. This normal density has its mode at 0, where the pdf is 3.9894. The log of this value is obviously bigger than 0, and summing the log densities of many such observations can result in a large positive value. If the distribution becomes wider (say, standard normal), the pdf at the mode is smaller than 1 and its log will be smaller than 0.
But all this does not change the monotonicity property of the marginal likelihood. The bigger the better.
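The arithmetic in the normal-density example above can be checked with a few lines of Python (a minimal sketch; `normal_pdf` is a helper written here for illustration, not a function from any econometrics package):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Tight distribution N(0, 0.1^2): the pdf at the mode exceeds 1,
# so the log density there is positive.
tight = normal_pdf(0.0, 0.0, 0.1)
print(tight, math.log(tight))   # ~3.9894, ~1.3836

# Standard normal N(0, 1): the pdf at the mode is below 1,
# so the log density there is negative.
wide = normal_pdf(0.0, 0.0, 1.0)
print(wide, math.log(wide))     # ~0.3989, ~-0.9189
```

Summing many such positive log densities is exactly how a log data density of several hundred, like the 399.1 reported above, can arise.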
Thanks for the reply. Does this mean that my DSGE model is not passing the diagnostics because its likelihood is worse than the BVAR's? My understanding is that a DSGE model usually has a worse likelihood because it imposes too many restrictions on the data. What is then the right diagnostic for a DSGE model? Should I stick to the Brooks-Gelman statistics instead?