I wonder if you could answer the following question about the estimation of parameters in DSGE models.

My colleagues and I are debating the proper standard deviation of the prior for a specific parameter. I claimed that this value is subjective and that, for a pdf such as the Gamma distribution, it does not really matter whether it is 0.1 or 0.01 or 0.001. In other words, I claimed that if I specify a Gamma prior with a standard deviation of 0.001, that is sufficient, and there is no need to also check the case of 0.01.

I would be glad if you could comment on this issue, and I wonder if you could give us an example.

Of course a prior is subjective and you can choose anything you like. But you would still need to be able to defend your prior to other researchers. Thus, your prior should reflect what we know about a particular parameter; in particular, it should roughly reflect the uncertainty we have about that parameter's value. This is where the standard deviation of your Gamma prior, be it 0.1 or 0.01 or 0.001, matters. 0.001 strikes me as far too narrow, because you are effectively saying you know the true value to lie in a tiny interval around the prior mean.
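To see how much these choices differ, here is a small sketch (the prior mean of 0.5 is a hypothetical value, not one from this thread) that matches a Gamma distribution's shape and scale to a given mean and standard deviation and prints the implied 95% prior interval:

```python
# Sketch: compare Gamma priors with the same (hypothetical) mean 0.5
# but standard deviations 0.1, 0.01, and 0.001.
from scipy.stats import gamma

mean = 0.5  # hypothetical prior mean, e.g. for a shock standard deviation

for sd in (0.1, 0.01, 0.001):
    # Match the first two moments: mean = k * theta, variance = k * theta^2,
    # so k = (mean / sd)^2 and theta = sd^2 / mean.
    k = (mean / sd) ** 2
    theta = sd ** 2 / mean
    dist = gamma(a=k, scale=theta)
    lo, hi = dist.ppf(0.025), dist.ppf(0.975)
    print(f"sd = {sd}: 95% prior interval [{lo:.4f}, {hi:.4f}]")
```

With sd = 0.001 the 95% interval has a width of only about 0.004 around the mean, which is what "knowing the true value to lie in a tiny interval" means in practice.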

With sufficient data it is often advocated to err on the uninformative side for your prior in order to let the data speak.

Thank you very much for your answer.

I would like to know whether there is a relation between the standard deviation of the prior and the standard deviation of my data.

I also want to know whether I can take my prior's standard deviation to be small in the case where my data's standard deviation is small.

The standard deviation of a prior is a measure of your uncertainty about this prior. It has (almost) nothing to do with the level of the standard deviation of the data. If you are sure that your shock standard deviation is narrowly centered around a particular value (big or small), choose a small standard deviation for the prior. If you are uncertain about the mean of the shock standard deviation, choose a large standard deviation for the prior.

Does choosing a small standard deviation for the prior distribution cause the prior to dominate the likelihood function, so that the prior and posterior distributions end up being equal?

Yes, this can happen, although there are other reasons that might also give rise to the described behavior.
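A textbook conjugate normal-normal update (not a DSGE estimation, and with made-up numbers) illustrates the mechanism: the posterior mean is a precision-weighted average of the prior mean and the sample mean, so a very tight prior leaves the posterior essentially at the prior, no matter what the data say.

```python
# Illustrative normal-normal conjugate update with hypothetical numbers:
# posterior precision is the sum of prior precision and data precision.
import math

def posterior(prior_mean, prior_sd, data_mean, data_sd, n):
    prior_prec = 1.0 / prior_sd ** 2          # precision of the prior
    data_prec = n / data_sd ** 2              # precision of the sample mean
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * data_mean)
    return post_mean, math.sqrt(post_var)

# Tight prior (sd = 0.001): posterior barely moves from the prior mean 0.5,
# even though the sample mean is 0.8.
print(posterior(0.5, 0.001, 0.8, 1.0, 100))
# Loose prior (sd = 0.5): the data dominate and the posterior sits near 0.8.
print(posterior(0.5, 0.5, 0.8, 1.0, 100))
```

This is only a stylized analogue of the DSGE case, but the intuition carries over: when the prior precision dwarfs the information in the likelihood, prior and posterior plots will look nearly identical.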

How can we tell whether a given choice of standard deviation is big or small? How should we interpret or judge what counts as a big or small range?

Take a look at the prior plots to see how peaked the likelihood is.

From the discussion in this thread, it seems a small standard deviation of the priors raises concerns, but not a big standard deviation. Could there be a problem if you make the standard deviation of the priors big enough to reflect almost zero certainty about the specified prior means?

I read somewhere that you can pluck prior means from other papers. For example, author A estimates an NK model for Canada and obtains \theta = 0.5. You can use \theta = 0.5 as your prior mean if you have a similar NK model. Does that sound right? If so, perhaps it could be extended to prior standard deviations as well?

I typically set prior standard deviations to 0.1 or 0.2… but arbitrarily… no matter the prior mean or shape. My only defense, for now, is that I see other people do it too… :) For prior means, I look at other papers… a fair defense?

- Loose priors are often unproblematic, but somewhat defeat the purpose of Bayesian estimation.
- Yes, you can use posteriors from other countries or sample periods as your prior. That applies to both the mean and the standard deviation. What you are not allowed to do is use other studies' posteriors from the same country and sample, as the prior is supposed to be independent of the data.
- You should always at least plot your prior to see what it looks like. Particularly for beta priors, you can get weird shapes.
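The last point can be shown with a short sketch (the mean/standard-deviation pairs are hypothetical): solving for the Beta shape parameters implied by a given mean and standard deviation shows that a large standard deviation pushes both parameters below 1, which turns the density U-shaped, putting most prior mass near 0 and 1 rather than near the mean.

```python
# Sketch: Beta shape parameters implied by a (hypothetical) mean and sd.
# From mean = a/(a+b) and var = ab / ((a+b)^2 (a+b+1)):
#   nu = mean*(1-mean)/var - 1,  a = mean*nu,  b = (1-mean)*nu.
from scipy.stats import beta

def beta_params(mean, sd):
    nu = mean * (1 - mean) / sd ** 2 - 1
    return mean * nu, (1 - mean) * nu

for mean, sd in [(0.5, 0.1), (0.5, 0.2), (0.5, 0.35)]:
    a, b = beta_params(mean, sd)
    d = beta(a, b)
    # Compare the density at the mean with the density near the boundary:
    print(f"mean={mean}, sd={sd}: a={a:.2f}, b={b:.2f}, "
          f"pdf(0.5)={d.pdf(0.5):.2f}, pdf(0.05)={d.pdf(0.05):.2f}")
```

For sd = 0.35 both shape parameters fall below 1 and the density is higher near the boundaries than at the supposed prior mean, which is exactly the kind of weird shape a quick prior plot would reveal.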