Unstable 4.3 identification output contradictory

I have been using the identification command in the unstable 4.3 version of Dynare (most recently 2011-10-04; the identification output seems stable across the various unstable updates I have tried). I am confused that my model is reported as being identified:

==== Identification analysis ====

Testing prior mean
Evaluating simulated moment uncertainty … please wait
Doing 483 replicas of length 300 periods.
Simulated moment uncertainty … done!

All parameters are identified in the model (rank of H).

All parameters are identified by J moments (rank of J)

but that the subsequent report about the collinearity patterns seems to show clearly that this cannot be the case. Here is the output for 1 explanatory parameter (which I understand to be the correlations between the columns of J):

Press ENTER to display advanced diagnostics

Collinearity patterns with 1 parameter(s)

Parameter  [ Expl. params ]   cosn
e          [ eah          ]  0.994
ealp       [ rxss         ]  1.000
eeta       [ cp           ]  0.999
eah        [ rho_ah       ]  1.000
ep         [ phistar      ]  0.992
efups      [ rho_ah       ]  0.991
efinflf    [ a122         ]  1.000
efi        [ a133         ]  1.000
einv       [ rho_inv      ]  1.000
eg         [ ealp         ]  0.989
e_l        [ philab       ]  0.992
d          [ alpss        ]  0.997
del        [ cp           ]  1.000
phistar    [ eah          ]  0.999
rho_phi    [ d            ]  0.982
alpss      [ gam          ]  1.000
rho_alp    [ einv         ]  0.844
etass      [ g0           ]  0.914
rho_eta    [ cp           ]  0.999
rho_ah     [ eah          ]  1.000
thep       [ a111         ]  0.898
cp         [ del          ]  1.000
rho_inv    [ einv         ]  1.000
cpf        [ a133         ]  0.999
rxss       [ gam          ]  1.000
usta       [ rho_ah       ]  0.076
pif        [ rxss         ]  0.078
rho_g      [ rxss         ]  1.000
g0         [ etass        ]  0.914
philab     [ e_l          ]  0.992
a111       [ cp           ]  0.995
a122       [ efinflf      ]  1.000
a133       [ efi          ]  1.000
gam        [ rxss         ]  1.000

I see many pairs of parameters with (near?) perfect collinearity. Surely this implies that the matrix J is not of full column rank? When I look at the smallest eigenvalues, they are also suspect. The smallest one is 2.65e-7, which looks a lot like zero to me. [The largest eigenvalue is 2756, so the implied condition number is roughly 10^10.]

Can I trust the output from identification? Or perhaps I do not understand something and there is no inconsistency. I suppose the collinearities of 1.000 above are not perfect collinearities but rounded values, but still…

I also have a related question: in Dynare 4.1.3 there were box-and-whisker plots for these collinearities, but in the 4.3 unstable version I do not know what value is being reported. Is it the mean of the box-and-whisker distribution, the maximum value, or some other value? Or is it not based on a Monte Carlo sample at all, but on the prior means of the parameters?

Any help would be greatly appreciated.

Thank you.

Sincerely,
Rob Luginbuhl

Hi,

the rank test checks the "numerical" singularity of the Jacobian. I agree that an eigenvalue of 1e-7 is 'suspect', as you say, but a matrix with such an eigenvalue can still be inverted. So this may point to weak identification, but not to rank deficiency, which would otherwise be flagged by the rank test.
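To make this concrete, here is a minimal sketch (in Python/NumPy, not Dynare's actual code) of a generic tolerance-based numerical rank test, using the largest and smallest eigenvalues quoted above as stand-ins for singular values. With a tolerance scaled by the largest value, 2.65e-7 still counts as nonzero, so the matrix passes the rank test even though its condition number of roughly 10^10 signals weak identification:

import numpy as np

# Hypothetical singular values of the Jacobian J; only the largest (2756)
# and smallest (2.65e-7) come from the thread, the rest are made up.
s = np.array([2756.0, 5.0, 0.3, 2.65e-7])

# Generic numerical-rank test: count values above a tolerance proportional
# to the largest one (this mirrors numpy.linalg.matrix_rank's default rule,
# not necessarily the rule Dynare uses internally).
m, n = 300, 4                                   # assumed dimensions of J
tol = s.max() * max(m, n) * np.finfo(float).eps
print("tolerance        :", tol)                # ~1.8e-10
print("numerical rank   :", int((s > tol).sum()), "of", n)   # full rank
print("condition number :", s.max() / s.min())  # ~1e10: weak identification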

Another point is that the collinearity measure is only one element of the picture for identification strength: the latter combines BOTH sensitivity and the measure of collinearity. Even a collinearity of 0.99999 need not imply rank deficiency if, at the same time, the model has a very large derivative with respect to that parameter.
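As a toy illustration of that last point (again a Python sketch with invented numbers, not the actual cosn computation in Dynare): two columns of J can have a pairwise cosine very close to one and yet the matrix remains of full column rank, with a smallest singular value that stays well above any rank tolerance when the columns themselves are large:

import numpy as np

# Two nearly collinear directions (cosine ~ 0.9999) scaled by a large
# derivative; the numbers are purely illustrative.
a = np.array([1.0, 0.0])
b = np.array([1.0, 0.014])                      # ~0.8 degrees away from a
J = np.column_stack([100.0 * a, 100.0 * b])     # large sensitivities

cosn = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
svals = np.linalg.svd(J, compute_uv=False)
print("pairwise cosine  :", round(cosn, 4))     # ~0.9999
print("singular values  :", svals)              # ~[141.4, 0.99], both > 0
print("column rank of J :", np.linalg.matrix_rank(J))   # 2, not deficient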

The new feature of the 4.3 (unstable) release is the addition of the identification strength measure: this is very important and should help clarify the meaning of the various tests performed. If this measure is larger than one, especially relative to the prior uncertainty, then an estimation would still provide some information about the value of such parameters. Of course, a large collinearity measure still points to two parameters that will be highly correlated in the posterior, but not to a true rank deficiency.

Thanks a lot for raising this question. I will try to explain this more clearly in the documentation.

best
Marco

Hi Marco,
I have a suggestion for you. As it is now, one part of the identification routine produces two bar charts: the overall identification strength and the sensitivity of the model to the parameters. As I understand it, the overall strength is a combination of sensitivity and collinearity. We have often had problems with non-convergence of estimates, which appears to be driven by weak identification due to collinearity rather than by low sensitivity. For the parameters that the model is not sensitive to, we simply get our priors back, and we get good convergence diagnostics. So from a technical point of view, that is, getting reliable, converged posteriors, the most important component of identification seems to be the collinearity component.

Therefore, for increased user friendliness, would it be a good idea to present the output in three figures: one main figure with the overall identification strength and two subfigures showing the two components? A user who is not getting convergence could first look at which parameters are unidentified or weakly identified, then take a quick look at the third figure to see which of them are most affected by collinearity, and then use the rest of the output to uncover the source of that collinearity. I know that what I am suggesting is, in a sense, already displayed, but you have to eyeball the two charts and subtract one from the other, which is not always easy to do with a big model.

Anyway, it is just a suggestion. The identification package is really useful, so thanks for making this additional component of Dynare available.

Adam

Thanks for this suggestion. It seems reasonable, and I will consider it in future developments.
Marco