I am trying to perform an exercise testing the effect of a government subsidy during a recession. I want to see what happens as I increase the TFP shock size while keeping the government subsidy shock size fixed. For example, with a government subsidy shock size of 0.01 and
a) TFP shock size = - 0.01
b) TFP shock size = - 0.03 or larger in absolute value
The problem is that whenever I use a TFP shock size larger than 0.011 (in absolute value), the solver fails and a perfect foresight solution cannot be found.
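For reference, a minimal sketch of the shock setup I have in mind (the variable names `z` for TFP and `g_sub` for the subsidy are placeholders, not my actual model's names):

```
// Hypothetical names: z = TFP, g_sub = government subsidy
shocks;
var z;
periods 1;
values -0.03;   // solver fails once this exceeds roughly 0.011 in absolute value
var g_sub;
periods 1;
values 0.01;    // subsidy shock size held fixed across experiments
end;

perfect_foresight_setup(periods=200);
perfect_foresight_solver;
```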
I am using a modified New Keynesian model, i.e., a labor selection model with endogenous hiring and firing, and I am solving it as a deterministic (perfect foresight) model. Workers face heterogeneous operating costs that follow a logistic distribution over the interval (−∞, +∞).
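For context, the selection margin is governed by the standard logistic CDF, with location $\mu$ and scale $s$ (so the standard deviation is $s\pi/\sqrt{3}$):

$$F(\varepsilon) = \frac{1}{1 + e^{-(\varepsilon - \mu)/s}}$$

A small $s$ makes $F$ very steep around $\mu$, so the selection rate responds sharply to shocks.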
My guess is that the solver fails because of the value of the scale (standard deviation) parameter of the logistic distribution.
Any suggestions on how I can fix this?