Is there a way to model a shock to the standard deviation?

Like, what are the responses of each variable, when standard deviation of a technology process moved from 0.01 to 0.02?

Maybe just change the standard deviation from 0.01 to 0.02 if you are interested only in the responses of the variables (i.e., IRFs) after a one-period shock.
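For a one-off experiment like that, the change goes directly into the shocks block of the .mod file (a minimal sketch; the shock name `e_z` and the rest of the model are assumed):

```
shocks;
var e_z; stderr 0.02; // raised from 0.01
end;

stoch_simul(order=1, irf=40);
```

This does not model uncertainty as a shock process; it simply compares IRFs under two calibrations of the standard deviation.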

There is a huge literature on uncertainty shocks, see e.g. https://github.com/JohannesPfeifer/DSGE_mod/tree/master/Born_Pfeifer_2014

So, in the code, you had a shock on the level and also a shock on the volatility.

Is it possible to have a shock only on the volatility? That is:

```
var u_tb; stderr 0;
var u_sigma_tb; stderr 1;
```

Not really. In that case, the shock whose standard deviation you are increasing would not exist.

So, if I want to study the effect of a volatility shock to TFP, I would do the following.

Am I getting this right?

```
y = exp(z) * k^theta * n^(1-theta);   // production with TFP level exp(z)
z = rho * z(-1) + e * sigma;          // technology process with time-varying volatility
sigma = eta * sigma(-1) + eps * size; // volatility process; eps is the volatility shock

shocks;
var e; stderr 1;
var eps; stderr 1;
end;

stoch_simul(order=3, periods=500, irf=300, pruning, replic=50000);
```

Mostly yes. But:

- You are using a log-log specification, whose moments do not exist. See Andreasen (2010).
- I would recommend against GIRFs. See https://github.com/JohannesPfeifer/DSGE_mod/blob/master/Basu_Bundick_2017/Basu_Bundick_2017.mod for an alternative.
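On the first point, one alternative sometimes used in the uncertainty-shocks literature is a level specification in which volatility fluctuates around a positive mean, so that no exponential of a persistent process enters the TFP equation. A sketch only, with illustrative parameter names (`sigma_bar`, `rho_sigma`); whether it is appropriate depends on the calibration:

```
// Level specification for stochastic volatility (sketch; names illustrative)
z     = rho * z(-1) + sigma(-1) * e;
sigma = (1 - rho_sigma) * sigma_bar + rho_sigma * sigma(-1) + size * eps;
```

Here `sigma` has steady state `sigma_bar > 0`, so the level shock `e` remains active even when `eps` is the shock of interest.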