Compare VAR IRFs of original series and of percentage changes of series

Dear friends,

Sorry that this topic might not be directly related to Dynare. I'm confused about the real difference between IRFs of the original series and IRFs of percentage changes/deviations.

Do DSGE models place more emphasis on percentage changes of the original series? Thank you. Best,

What exactly is your question? In most DSGE models the levels of the variables can be scaled arbitrarily, rendering the levels meaningless. Percentage deviations are invariant to such scaling. Think about measuring output in euros or dollars. The levels would differ across currencies, but the IRFs in percentage deviations would be the same.
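The scaling argument can be sketched numerically. This is a minimal illustration with made-up numbers (the steady state, the IRF values, and the exchange rate are all hypothetical), not output from any particular model:

```python
import numpy as np

# Hypothetical level IRF of output after a shock, measured in "euros"
steady_state = 100.0
irf_level = np.array([2.0, 1.5, 1.0, 0.5, 0.2])

# Re-express the same series in "dollars" with a made-up exchange rate
rate = 1.1
ss_usd = steady_state * rate
irf_level_usd = irf_level * rate    # the level IRF now looks different

# Percentage deviations from the steady state are identical: the scaling cancels
pct_eur = irf_level / steady_state
pct_usd = irf_level_usd / ss_usd

print(np.allclose(pct_eur, pct_usd))    # True
```

The level IRFs differ by the factor 1.1, but dividing each by its own steady state removes the units entirely, which is why percentage deviations are the natural object to compare.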

Thank you, Professor Johannes. Well, that is exactly what confuses me. What I should have asked is: is it also meaningful to estimate a VAR using variables in log levels, like the logarithm of the employment rate? Thank you.

For VARs, it is more about fitting empirical relationships. Employment rates are already rates, i.e. in percent. For that reason, we usually do not take the log again. Other variables that show exponential growth need to be logged so that they become linear in logs.
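The point about trending variables can be checked directly. A small sketch with an artificial series (the 2% growth rate and the initial level are made up for illustration):

```python
import numpy as np

# A series growing at a constant 2% per period is exponential in levels
t = np.arange(50)
y = 100.0 * 1.02 ** t

# In logs it is an exact straight line: log(y) = log(100) + t * log(1.02)
log_y = np.log(y)
slope = np.diff(log_y)

print(np.allclose(slope, np.log(1.02)))    # True: constant slope, i.e. linear in logs
```

A rate such as the employment rate has no such trend and is already bounded in percentage units, so taking its log would not serve the same purpose.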

Thank you so much for your kind reply. It can be hard for juniors to pick up this information in class. In this case, IRFs from models are usually in percentage deviations, while the empirical relationships are in log levels. Is it reasonable to compare such an empirical VAR with the theoretical one when their IRFs are in different forms? Thank you again.

I don't get the last question. If the variable in the VAR is in log levels, the IRF will show the deviation of the log level from its mean, which is a percentage deviation, just as in the model.

Thank you for your reply.