I calibrate a model in levels, and then I linearize it.

Since I use the same set of parameters in the linearized model,
shouldn't I get the same (or similar) standard errors for the variables as I get in the model calibrated in levels?

If that is true, I assume I must have done something wrong during the log-linearization. Or is it normal to get different moments?

No, in levels you get absolute deviations; e.g., output deviations are then measured in output units. When you do log-linearization (which is presumably what you mean), output will be in percentage deviations and therefore unit-less.
The standard deviation in the log-linearized model should be 1/(steady state) times the standard deviation of the level.
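A quick numeric sketch of this relationship: for small fluctuations around a steady state, the log deviation ln(y/ȳ) is approximately (y − ȳ)/ȳ, so its standard deviation is roughly the level standard deviation divided by the steady state. The AR(1) series and numbers below are hypothetical, not taken from any particular model.

```python
import numpy as np

rng = np.random.default_rng(0)
y_bar = 5.0            # assumed steady-state level of output
rho, sigma = 0.9, 0.01 # assumed persistence and shock size

# simulate output in levels around its steady state
y = np.empty(100_000)
y[0] = y_bar
for t in range(1, len(y)):
    y[t] = y_bar + rho * (y[t - 1] - y_bar) + sigma * rng.standard_normal()

sd_level = y.std()                  # absolute deviations (output units)
sd_log = np.log(y / y_bar).std()    # percentage deviations (unit-less)

# sd_log should be close to sd_level / y_bar
print(sd_level, sd_log, sd_level / y_bar)
```

So matching moments across the two versions requires scaling by the steady state; identical standard errors are only expected for variables with a steady state of one.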

I traced back the source of the problem.
When I simplified the model to an RBC (both linear and in levels), I got the same standard errors.

Then I added blocks to get a NK model, and I realized that the difference comes from the way I specify the Taylor rule (TR).

So I am posting both my TRs (from the level and the linear model).

Level TR (following SW2007):

In my linear model I wrote the TR as:

So I could boil the problem down to the way I translated the TR (and the MP shock) from the level model into the linear model.
I tried different ways of specifying it, but I could not get the same standard errors.
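Since the rules themselves are not shown above, here is a generic sketch of how a multiplicative SW2007-style level rule maps into its log-linear counterpart; the coefficient names are placeholders, not the poster's actual parameters. A level rule of the form

$$R_t = R_{t-1}^{\rho}\left[\bar R\left(\frac{\Pi_t}{\bar\Pi}\right)^{\phi_\pi}\left(\frac{Y_t}{\bar Y}\right)^{\phi_y}\right]^{1-\rho} e^{\varepsilon^R_t}$$

log-linearizes, with $\hat x_t = \ln(x_t/\bar x)$, to

$$\hat R_t = \rho \hat R_{t-1} + (1-\rho)\left(\phi_\pi \hat\pi_t + \phi_y \hat y_t\right) + \varepsilon^R_t.$$

One common source of mismatched standard errors: if the shock sits inside the bracket in the level version, it gets multiplied by $(1-\rho)$ after log-linearization, so the same shock standard deviation implies different interest-rate volatility in the two specifications. It may be worth checking where $\varepsilon^R_t$ enters in each version.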

Any chance you could spot the problem at a glance, please?

How big are the differences? And are you sure they come from the Taylor rule itself rather than from the way the interest rate enters the other equations of the model?