Standard error in level model and linear model

Hi,
a quick question.

I calibrate a model in levels, and then I log-linearize it.

As I use the same set of parameters in the log-linearized model,
shouldn't I get the same (or similar) standard deviations for the variables as in the model calibrated in levels?

If that is true, I assume I must have done something wrong during the log-linearization. Or is it normal to get different moments?

Thanks in advance.

No, in levels you get absolute deviations. E.g., output deviations are then measured in output units. When you do log-linearization (which is presumably what you mean), output will be in percentage deviations and therefore unit-less.
The standard deviation of a variable in the log-linearized model should be 1/(steady state) times the standard deviation of its level counterpart.
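To see why, take a first-order approximation around the steady state $\bar{X}$:

$$\hat{x}_t \equiv \log X_t - \log \bar{X} \approx \frac{X_t - \bar{X}}{\bar{X}} \qquad\Rightarrow\qquad \sigma(\hat{x}) \approx \frac{\sigma(X)}{\bar{X}}$$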

Thanks for the reply.

In my level model I was calculating the standard deviation of a variable 'X' as follows.
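A minimal Dynare sketch of such a definition (assuming xx is meant to be the log deviation of X from its steady state):

```
// Sketch: log deviation of X from its steady state, declared as an
// extra model variable in the level model so Dynare reports its moments.
xx = log(X) - log(STEADY_STATE(X));
```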

So when I get the standard deviation of 'xx' in my level model, it should correspond to the standard deviation of x_hat in my log-linearized model.

Shouldn't it?

Yes, it should.

Thanks again, Prof. Pfeifer.

I traced back the source of the problem.
I simplified the model down to an RBC (both linear and level versions) and got the same standard deviations.

Then I added blocks to get a NK model, and I realized that the difference comes from the way I specify the Taylor rule (TR).

So I am posting both my TRs (level & linear model):

Level TR (following SW2007):
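A generic SW2007-type rule in levels, as a placeholder sketch (rho, r_pi, r_y, r_dy, the flexible-price output Yf, and the shock eps_r are illustrative names, not the actual code from this model):

```
// Sketch of an SW2007-style Taylor rule in levels: interest-rate
// smoothing, responses to inflation and the output gap, a response to
// the change in the gap, and a multiplicative MP shock.
R/STEADY_STATE(R) = (R(-1)/STEADY_STATE(R))^rho
    * ( (Pi/STEADY_STATE(Pi))^r_pi * (Y/Yf)^r_y )^(1-rho)
    * ( (Y/Yf) / (Y(-1)/Yf(-1)) )^r_dy
    * exp(eps_r);
```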

In my linear model I wrote the TR as:
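And a placeholder sketch of its log-linear translation, with hats denoting log deviations from steady state and the shock now entering additively:

```
// Log-linear counterpart of the level rule above.
r_hat = rho*r_hat(-1)
    + (1-rho)*( r_pi*pi_hat + r_y*(y_hat - yf_hat) )
    + r_dy*( (y_hat - yf_hat) - (y_hat(-1) - yf_hat(-1)) )
    + eps_r;
```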

So I could boil the problem down to the way I translated the TR (and the MP shock) from the level model into the linear model.
I tried different ways of specifying it, but I could not get the same standard deviations.

Any chance you could spot the problem at a glance, please?

Thanks a lot for your time.

Where does the [quoted term] come from?

Sorry for the typo.
That term is not active, as it is set equal to '0'
in both TR rules.

How big are the differences? And are you sure the differences come from the Taylor rule itself, rather than from the way the interest rate enters the model's other equations?
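One way to check this, as a sketch with illustrative names: activate only the monetary policy shock in both .mod files and compare the reported theoretical moments and IRFs. If they already differ, the problem sits in the rule or in the shock translation; if not, it sits in the other equations.

```
// Sketch: isolate the MP shock and compare the two models' output.
shocks;
var eps_r; stderr 0.01;   // the only active shock; size is a placeholder
end;
stoch_simul(order=1, irf=20) R Pi Y;
```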