# Bayesian estimation results for the Taylor rule parameters

#1

Dear Professor Pfeifer,

I use the Taylor rule below in my model, in which rhoRUU is the monetary policy inertia (interest-rate smoothing) parameter and rhoYUU the interest-rate response to output growth.

RU = rhoRUU*RU(-1) + (1-rhoRUU)*( log(RUU) + rhopiUU*(piU(+1)-log(piUU))
+ rhoYUU*( YU-log(YUU) ) ) + e_R;
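For intuition, the rule above can be sketched in Python as a weighted average of the lagged rate and the systematic response. The parameter values below are purely illustrative placeholders, not the poster's calibration; the only point is that with all gaps closed and no shock, the rate converges to its steady-state value log(RUU), with rhoRUU governing the speed of convergence.

```python
import math

# Illustrative parameter values (hypothetical, not from the original post)
rhoRUU = 0.75      # interest-rate smoothing (inertia)
rhopiUU = 1.5      # response to the inflation gap
rhoYUU = 0.1       # response to the output term
RUU, piUU, YUU = 1.01, 1.005, 1.0   # steady-state gross levels

def taylor_rule(RU_lag, piU_lead, YU, e_R=0.0):
    """One step of the rule: smoothing on the lagged rate plus the
    (1 - rhoRUU)-weighted systematic response, plus a shock."""
    return (rhoRUU * RU_lag
            + (1 - rhoRUU) * (math.log(RUU)
                              + rhopiUU * (piU_lead - math.log(piUU))
                              + rhoYUU * (YU - math.log(YUU)))
            + e_R)

# With inflation and output at steady state and no shocks,
# iterating the rule drives RU to log(RUU):
RU = 0.0
for _ in range(200):
    RU = taylor_rule(RU, math.log(piUU), math.log(YUU))
print(abs(RU - math.log(RUU)) < 1e-10)  # True
```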

But when I run the Bayesian estimation, the posterior mean of rhoRUU comes out at 0.2 (prior mean 0.75) and the posterior mean of rhoYUU at 0.8 (prior mean 0.1). Is this result very strange, or is it acceptable? How does this happen?

Any reply will be appreciated. Thank you very much!

#2

Why are you surprised by these results? Because of the level of the estimates per se or because of the gap between the posterior and prior expectations? The discrepancy between the prior and posterior moments means that you learn something from the data (likelihood) about these parameters. If you really believe that the parameters should be closer to the prior expectations (for whatever reasons) you should reduce the prior variances.
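The mechanism Stéphane describes can be illustrated with the textbook conjugate normal-normal case, which is a stand-in for (not a description of) the actual DSGE posterior: the posterior mean is a precision-weighted average of the prior mean and what the data suggest, so shrinking the prior variance pulls the posterior toward the prior. All numbers below are made up for illustration.

```python
def posterior_mean(prior_mean, prior_var, data_mean, data_var):
    """Normal-normal conjugate posterior mean: a precision-weighted
    average of the prior mean and the data (likelihood) mean."""
    w = (1 / prior_var) / (1 / prior_var + 1 / data_var)
    return w * prior_mean + (1 - w) * data_mean

# Loose prior on rhoRUU (sd 0.2): the data dominate, posterior near 0.2
loose = posterior_mean(0.75, 0.2**2, 0.2, 0.05**2)
# Tight prior (sd 0.02): posterior pulled back toward the prior mean 0.75
tight = posterior_mean(0.75, 0.02**2, 0.2, 0.05**2)
print(round(loose, 3), round(tight, 3))
```

The design point is that the gap between prior and posterior moments is exactly the "learning from the data" Stéphane mentions; tightening the prior variance is the lever that overrides it.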

Best,
Stéphane.

#3

Thank you for your reply! I really appreciate it.

I am surprised because I have never seen such a large value of rhoYUU in any paper (it is typically in [0.1, 0.3]), so I wonder whether I did something wrong.

#4

I agree that this parameter is often found to be smaller, but this depends a lot on the data and the rest of the model, so it is difficult to say. Again, if you have strong evidence that this parameter should be smaller, you should reduce the prior variance of rhoYUU.

Best,
Stéphane.

#5

My hunch would also be that something is wrong, but that usually shows up in more than one parameter. Are there other parameters that look weird? And did you look at, e.g., the IRFs to see whether the model has strange implications? In my experience, things like this can happen when the data treatment in the observation equations is wrong, e.g. mixing up net and gross interest rates, or getting the annualization wrong.
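To make the data-treatment pitfall concrete, here is a hedged sketch (hypothetical numbers, not the poster's data) of how large the mismatch can be if an annualized net rate in percent is fed to a model whose RU is a quarterly log gross rate:

```python
import math

# Suppose the observed series is an annualized net rate in percent:
annualized_net_pct = 4.0          # e.g. 4% per year (illustrative)

# What a quarterly-log-gross-rate model variable would expect:
quarterly_net = annualized_net_pct / 100 / 4        # 0.01 per quarter
quarterly_gross_log = math.log(1 + quarterly_net)   # ~0.00995

# Feeding 4.0 into an equation expecting ~0.01 overstates the observed
# rate by a factor of roughly 400 -- easily enough to distort the
# posterior of the policy-rule parameters.
print(annualized_net_pct / quarterly_gross_log)
```

This is only one of several possible mismatches (net vs gross, percent vs decimal, annualized vs quarterly); the general check is to verify that the observation equation and the data file use the same units and frequency convention.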