In my model, I have an AR(1) exogenous process:

log(x) = (1-rho)*x_bar + rho*log(x(-1)) + sigma*e;

where x_bar is the steady state of log(x).
sigma is set to 0.01, and the shock e is declared with stderr 1.
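For reference, a stripped-down version of the relevant part of my .mod file (the persistence and steady-state values below are placeholders, not my actual calibration):

    var x;
    varexo e;
    parameters rho sigma x_bar;

    rho   = 0.9;    // placeholder persistence
    sigma = 0.01;
    x_bar = -3;     // placeholder steady state of log(x)

    model;
      log(x) = (1-rho)*x_bar + rho*log(x(-1)) + sigma*e;
    end;

    initval;
      x = exp(x_bar);
    end;

    shocks;
      var e; stderr 1;
    end;

    steady;
    stoch_simul(order=1, irf=0) x;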
Now, when Dynare gives the simulated moments, I get the moments for x, but not for log(x). The STD.DEV. of the simulated series should be close to 0.01. However, I get a different value, because Dynare is reporting the STD.DEV. of x, which in my case is something like 0.0005.
How can I see the standard deviation of log(x)? Thank you.
PS: Because of the nature of the model, I cannot rewrite the AR(1) process in levels, so that straightforward route is not available.
Hi,
You can simply add a variable, say logX, together with the equation logX = log(x); in the model block. Dynare will then report the simulated moments of logX along with those of the other variables.
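A minimal sketch of that change, reusing the names from your post (only the lines that change are shown):

    var x logX;      // declare the auxiliary variable next to x

    model;
      log(x) = (1-rho)*x_bar + rho*log(x(-1)) + sigma*e;
      logX = log(x); // stoch_simul will now report moments of logX
    end;

If you use an initval block, also initialise logX there (logX = x_bar;), or simply let steady solve for it. The STD.DEV. reported for logX is the standard deviation of log(x) you are after.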
Alternatively, knowing that x is a log-normal random variable, you can deduce the moments of log(x) from the moments of x (the formulas for the moments of the log-normal distribution are given on Wikipedia).
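For the exact process (where log(x) is Gaussian, so x is log-normal), the standard log-normal identities invert as follows, with m and v denoting the reported mean and variance of x:

    \sigma_{\log x}^2 = \log(1 + v/m^2), \qquad \mu_{\log x} = \log(m) - \sigma_{\log x}^2/2

so the STD.DEV. of log(x) is \sqrt{\log(1 + v/m^2)}.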
Best,
Stéphane.
Stéphane, thank you.