Linearly detrended data or demeaned growth rate data?

Dear Johannes,

If I would like to estimate a DSGE model with no trend (all shocks are transitory) using U.S. data, it seems that I can use the following ways to detrend the data (leaving aside the one-sided HP filter):

  1. Linear/quadratic/cubic detrending
  2. Demeaned growth rate data.

Could I ask:
a. Are both 1 and 2 OK?

b. If 1 is OK, how do I know which is the best detrending method among linear, quadratic, and cubic?

c. If I use 1, can I directly compare model moments (theoretical moments) with data moments by adding the stoch_simul command after the estimation command? I mean that this is unlike HP-filtered data, where you have to set hp_filter=1600 in stoch_simul.

d. Can 1 and 2 be used jointly to detrend the data? For example, the y, c, and I data are cubically detrended while some other observables use demeaned growth rate data?

Many thanks in advance,
Kind regards,
Huan

a.) Yes, both are OK.
b.) This is a matter of taste. Most people check whether the data after detrending look OK.
c.) Yes, this is the implicit assumption you are making: the linearly detrended data are treated as equivalent to the stationary data coming from your model. Of course, you could linearly detrend your model variables as well, but most of the time this will not do anything, because they are not really trending (unless there is really a lot of persistence, so that the data can drift away from the mean for a long time).
d.) Conceptually, this is not a problem; you can do this. But you might have to defend it to your referees, who might be skeptical that you have two different concepts of the trend (of course, demeaned first differences will also take out a purely linear trend).
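
For concreteness, here is a minimal MATLAB sketch of the two approaches. All names are placeholders, and `y_raw` is assumed to be a column vector of log levels (e.g. log real GDP per capita):

```matlab
% Minimal sketch, assuming y_raw is a T-by-1 vector of log levels.
T = numel(y_raw);
t = (1:T)';

% 1. Polynomial detrending: regress the log level on a time polynomial of
%    order p (1 = linear, 2 = quadratic, 3 = cubic) and keep the residual.
p = 1;                               % inspect the residuals for p = 1, 2, 3
X = t.^(0:p);                        % regressor matrix [1, t, ..., t^p]
y_detrended = y_raw - X*(X\y_raw);   % OLS residual, mean zero by construction

% 2. Demeaned growth rates: log first difference minus its sample mean
%    (costs one observation at the start of the sample).
dy_demeaned = diff(y_raw) - mean(diff(y_raw));
```

Regarding c.): because neither series requires further filtering inside Dynare, adding e.g. `stoch_simul(order=1) y c;` after the `estimation` command, without the `hp_filter` option, yields theoretical moments at the estimated parameters that are directly comparable to the moments of data treated this way.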

[quote=“jpfeifer”]a.) Yes, both are OK. […][/quote]

Dear Johannes,

Generally, people use “credit spread” data in levels during estimation. Could I ask whether it is right to use the linear detrending method to deal with credit spread data?

Thanks in advance,
Huan

That depends on your goal. In general, your observable is what your model needs to explain. For stationary variables, we usually force the model to account for all movements. If you think that for some reason there is a trend in the credit spread that should not be explained by the model and its shocks, you can also detrend it. But you need to be able to justify this choice.
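
In terms of the observation equation, a minimal Dynare-style fragment might look as follows. The names `spread_obs` and `spread` are placeholders, and this is only a sketch of the relevant lines, not a complete mod-file:

```
varobs spread_obs;

model;
// (spread_obs and spread are assumed to be declared in the var block, not shown)
// If the spread data enter in levels, the model's steady state
// must account for the data mean:
spread_obs = spread;
// If the spread data were demeaned or linearly detrended beforehand,
// match deviations from steady state instead:
// spread_obs = spread - steady_state(spread);
end;
```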

Hi Johannes,
I have more questions about data processing:

  1. First difference filter: If we use demeaned growth rate data for a stationary model, as mentioned in the post, what about interest rate series? Although we expect them to be stationary, they often appear non-stationary in the data. Can we still take a first difference of the interest rate, just like output etc.? Or, to make such rate variables stationary, can we use linear detrending or the one-sided HP filter?
    For moment matching: do we need to calculate the growth series of the Kalman-filtered estimates and then calculate the second moments to match with the data moments?

  2. One-sided HP filter: After applying the one-sided HP filter, we know the series becomes mean zero. I have 91 observations and, say, the mean of the HP-filtered log(Y per capita) is -1.33E-03. Now, if I scale the data by multiplying by 100, the mean becomes -1.33E-01, which cannot be considered mean zero any more. Is this because of the small number of observations? That means I have to demean the HP-filtered series to make it truly mean-zero stationary. Is that wrong?

Thanks,

Sadia

  1. This is a matter of taste and theory. If you believe your model, then the interest rate is stationary, even if in the short sample you are using you cannot reject the null of a unit root. In that case, from a model perspective, the problem is with econometric testing in the data. However, if you are unhappy with using the level, you can still use first differences for the interest rate as long as you are using a correct observation equation, i.e. you are matching the first differences in the data to first differences in the model (see the sketch after this list). A different, but related, issue is the presence of a deterministic trend. Such a trend may be due to e.g. an unmodeled drift in the inflation target. In that case, suitably detrending the data before feeding it into the model may be required to make the data consistent with the model.
    The point about moment matching I do not understand. The Kalman filter is about unobserved variables, not observed ones. How would you match the moments of an unobserved state to data moments when the corresponding data are, by definition, unobserved?

  2. The one-sided HP filter is based on a state space filtering problem. The cyclical component asymptotically has mean 0, but in short samples it can have a mean different from 0. What you report is no reason to worry: on average, your data were about 0.1 percent below steady state, which is negligible. Of course, you are right that the shocks have to account for the mean not being 0 over the sample. So if you strongly dislike the small non-zero mean, you can demean the result. I doubt that it will make a big difference.
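
On the first point, a minimal Dynare-style sketch of a matching observation equation for a first-differenced interest rate (again, `dR_obs` and `R` are placeholder names, and this is only a fragment of a mod-file):

```
varobs dR_obs;

model;
// (dR_obs and R are assumed to be declared in the var block, not shown)
// First differences in the data are matched to first differences in the model:
dR_obs = R - R(-1);
end;
```

On the second point, demeaning is a one-liner in MATLAB, e.g. `y_cycle = y_cycle - mean(y_cycle);`, and as argued above it should barely matter.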

Dear Johannes,
Thanks, I take this point.

Thank you very much for all the other clarifications too.