Question regarding the measurement equation in Smets and Wouters (2007)

Hi all,

I am puzzled by the measurement equation associated with labor supply in Smets and Wouters (2007). The measurement equation (in Dynare) is specified as:
labobs = lab + constelab
On the left-hand side of this measurement equation, the data for hours worked “labobs” has been demeaned. On the right-hand side, there is a term “constelab” for the steady-state value of labor supply.

I have two questions:
1, I thought we should either demean the data or add the steady-state value of the model variable, but not do both simultaneously. Am I right?
2, How should the term “constelab” be interpreted?

Thanks in advance!

Best wishes!

You can verify in their data description that the hours data has not been demeaned.

Hi jpfeifer,

Thanks a lot for your help. It is so kind of you! But I checked the data description in SW (2007) again, and it seems the hours data has indeed been demeaned.

I am looking at the Excel file “usmodel_data.xls” provided in their online data appendix. Column “z”, headed “labobs”, is the actual data used for their estimation, i.e. the term specified in their measurement equation for labor supply. Yet column “z” is generated by subtracting “Q$238” from column “Q”, and “Q$238” is simply the average of column “Q”, which should be the mean of hours. For that reason, I thought the hours data had been demeaned. Maybe I am wrong?

Thanks again!

No, you are right that hours was demeaned, but before restricting the sample. That may explain why they use


i.e. they center the “steady state” coefficient around 0 instead of at the usual value.
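As a sketch of why that matters: if the 100*log series is demeaned over a full sample and the estimation sample is then restricted, the restricted sample generally no longer has mean exactly zero, and a constant centered at 0 can absorb the leftover. The series and the sample split below are purely illustrative, not SW’s actual data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hours series in levels (illustrative numbers only).
hours = 100 * np.exp(0.02 * rng.standard_normal(300))

# Demean 100*log(hours) over the FULL sample...
full_logh = 100 * np.log(hours)
demeaned_full = full_logh - full_logh.mean()

# ...then restrict to the estimation sample (say, the last 230 observations).
restricted = demeaned_full[70:]

# The restricted sample generally no longer has mean zero;
# a constant like constelab, with a prior centered at 0, can pick this up.
print(demeaned_full.mean())  # zero up to floating point
print(restricted.mean())     # small but nonzero
```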

Hi jpfeifer,

Thank you so much for your help!

You are totally right. The steady state of hours worked, “constelab”, is specified as an estimated parameter with a normal prior distribution centered around 0. That might already take into account the fact that hours worked are demeaned.

I have three follow up questions:

1, Regarding the “demeaning” strategy, take hours worked in SW (2007) as an example. My thought is: to map the data on hours worked into the hours variable (in log deviations) in the model, we may want to first take the mean of the data series and then take the log difference, as in the following equation:

[ 100*log(h_t^data) − 100*log( MEAN(h_t^data) ) ] = h_t^{hat}

where “h_t^data” refers to the data series, “MEAN(h_t^data)” is the mean of the raw hours-worked series taken directly in levels, and “h_t^{hat}” denotes the variable (in log deviations) in the linearized model.

However, I saw that many studies, including SW (2007), demean in a different way:

[ 100*log(h_t^data) − MEAN( 100*log(h_t^data) ) ] = h_t^{hat}

that is, to take the log and scale by 100 first, and then demean. This approach seems more appealing, but I have a hard time understanding why.

Could you help me understand why the second approach makes more sense?

2, In SW (2007), hours worked is the only variable whose steady state is estimated, i.e. “constelab”. Why is hours worked so special? Why do they want to make the steady state of hours worked, “constelab”, stochastic?

3, In SW (2007), I checked that the steady-state value of hours worked, “constelab”, affects the model dynamics only through the measurement equation “labobs = lab + constelab”. How important is it to estimate “constelab”? How much does this estimate affect the estimation results and the implied model dynamics?

Thanks for your time!


  1. The reason is Jensen’s inequality. The mean of a nonlinear function is not equal to the nonlinear function applied to the mean. If you demean first and then take the log, the resulting series will not have mean 0.
  2. That is not true. All steady-state values for the observables are estimated, e.g. \bar \pi and \bar \gamma. The general problem is that hours worked in the model are not as easily mapped back to the data as other variables. In the model, we usually think of hours as a share of an endowment, i.e. something unitless, but in the data we only see hours worked, i.e. measured in units of time. To bring the two together, SW go for a percentage deviation from the mean, where the mean is supposed to be the steady state.
  3. You would have to test this. My hunch is that it is pretty important. The issue is that SW use a non-separable utility function and estimate things like fixed costs, which affect the steady state. For that reason, there will be an interaction effect between, e.g., the fixed costs and the estimated steady state for labor. Fixing it to a particular value may lead to very different and potentially wrong results.
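The point in (1) can be illustrated with a quick numerical check. The hours series here is made up; only the order of operations matters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hours-worked series in levels (illustrative numbers only).
h = 100 * np.exp(0.05 * rng.standard_normal(200))

# Approach 1: demean in levels first, then take logs.
# This computes 100*log(h_t / mean(h_t)); by Jensen's inequality the
# result is generally NOT mean zero, since log(mean(h)) != mean(log(h)).
approach1 = 100 * (np.log(h) - np.log(h.mean()))

# Approach 2 (as in SW 2007): take 100*log first, then demean.
# This series is mean zero by construction.
approach2 = 100 * np.log(h) - np.mean(100 * np.log(h))

print(approach1.mean())  # strictly negative, since log is concave
print(approach2.mean())  # zero up to floating point
```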

Hi jpfeifer,

Thank you so much for your help! Now I have a better understanding of these issues thanks to your comprehensive explanations.

1, Now I understand why the second strategy is more appealing. I will demean after taking the log and scaling by 100, as you suggested.

2, Sorry for my mistake. You are totally right. In fact, all steady-state values of the seven observables are estimated. I get your point: introducing the term “constelab” in the measurement equation for hours worked is meant to capture the mismatch between the units in which hours worked are measured in the model and in the data. Can we think of this as a type of measurement error, even though hours worked in the data are correctly measured? I mean, instead of pursuing a percentage deviation from the mean as in SW, can we alternatively introduce measurement error into the measurement equation for hours worked? Are these two specifications equivalent?

3, Following your suggestion, I have tested the estimation with and without the steady state of hours worked “constelab” in the SW model. The “.log” files are enclosed. “usmodel.log” usmodel.log (35.4 KB)
documents the replication of the mode estimation in SW (2007), while “usmodel_noconstelab.log” usmodel_noconstlab.log (36.0 KB) documents the mode estimation in the absence of “constelab”. It turns out that the estimated modes of the other parameters do vary across the two cases, though not significantly for most of them. Maybe the variation is strong enough to affect the model dynamics. But I think I understand the message you are trying to convey: the interplay between “constelab” and other features of the model, such as fixed production costs or the non-separable utility function, might notably influence the model’s steady state and thus its dynamics.

Many thanks!


  1. I would not call this a measurement error, and using one is clearly not equivalent: a measurement error is something that varies over time.

Hi Johannes,

I totally agree with you: “constelab” is time-invariant and thus not equivalent to a measurement error.

Many thanks for all your help!