Hello, I am trying to check the fit of my model to the data. I calibrated a DSGE model and ran it without log-linearizing the equations. As for the real data, I took logs of the series, quadratically detrended them, and then computed the cycle as the percent deviation from the trend. I would like to know how to compare the moments of these series with the theoretical moments reported by the model, particularly the standard deviation.
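For concreteness, the preprocessing described above can be sketched as follows. This is a minimal illustration, not the original poster's code: the function name and the synthetic series are made up for the example.

```python
import numpy as np

def cycle_percent_deviation(y):
    """Log a level series, remove a quadratic trend by OLS,
    and return the cycle scaled to percent.

    y : array of strictly positive level data (e.g. real GDP).
    """
    log_y = np.log(y)
    t = np.arange(len(y))
    # Fit a quadratic time trend to the logged series
    coeffs = np.polyfit(t, log_y, deg=2)
    trend = np.polyval(coeffs, t)
    # Log deviations from trend, times 100, approximate percent deviations
    return 100 * (log_y - trend)

# Synthetic example: exponential trend plus a small cyclical component
t = np.arange(200)
y = np.exp(0.005 * t + 1e-5 * t**2 + 0.02 * np.sin(0.3 * t))
cycle = cycle_percent_deviation(y)
```

Because the detrending is done in logs, `100 * (log_y - trend)` is (to first order) the percent deviation from the trend level.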

That is a matter of taste. Some people compare the model objects directly to the preprocessed data, thereby taking the position that their stationary model has to explain all the frequency components left in the filtered data.

I personally prefer to treat the model variables exactly like the data when making comparisons. Since you are working with theoretical moments, you do not need to worry about deterministic trends, as they are not present in the model. You only need to make sure you are comparing logged model variables to logged data.
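Concretely, "treating model variables exactly like the data" means running any simulated model series through the identical filter before comparing moments. A hypothetical sketch with synthetic series (the numbers are made up; a stationary model simulation has no trend, so the filter leaves it essentially unchanged):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(300)

def log_quad_cycle(series):
    # The identical filter for data and model output:
    # log, then remove a quadratic trend, then scale to percent
    logs = np.log(series)
    trend = np.polyval(np.polyfit(t, logs, 2), t)
    return 100 * (logs - trend)

# Hypothetical observed data (trending) and model simulation (stationary)
data = np.exp(0.004 * t + 0.015 * rng.standard_normal(300))
sim = np.exp(1.0 + 0.012 * rng.standard_normal(300))

# Standard deviations of the cycles are now in comparable units (percent)
sd_data = np.std(log_quad_cycle(data))
sd_model = np.std(log_quad_cycle(sim))
```

With theoretical (rather than simulated) model moments, the same logic applies: the model has no deterministic trend to remove, so the only remaining requirement is that both sides are in logs.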

I still have two more questions (I am a beginner with DSGE models):

Since I did not log-linearize my model, the means of the variables are not zero (as expected). Should I divide the standard deviation by the mean to express it in percent, in order to compare it with the standard deviation of the data?

Should I interpret the IRFs as percent deviations from steady-state values?

No. Variables that are not in logs are usually not in percent and thus cannot be interpreted as percentage deviations. There are several ways out. I would suggest option 3 in
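Whichever option is chosen, the underlying point is that only logged variables have deviations that read as percentages: for a variable $x$ near its steady state $\bar{x}$, $\log x - \log \bar{x} \approx (x - \bar{x})/\bar{x}$. A small numeric sketch with hypothetical numbers:

```python
import numpy as np

x_bar = 5.0                 # hypothetical steady state in levels
x = x_bar * 1.01            # a 1 percent deviation in levels

level_dev = x - x_bar       # 0.05, in model units, NOT percent
log_dev = np.log(x) - np.log(x_bar)   # ~0.00995, i.e. roughly 1 percent

# Dividing the level deviation by the steady state recovers the
# exact percent deviation that the log difference approximates:
percent_dev = level_dev / x_bar       # exactly 0.01
```

So an IRF of a level variable is in model units, while the IRF of the same variable defined in logs is (approximately) in percent deviations from steady state.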