Colleagues,
I have written an MLE routine that is quite time intensive. I've improved it as best I can, but I recently realized that estimating the variance-covariance matrix consumes about half of the total run time. I'm wondering if this is unusual.
Here's a little more detail: The model being estimated has N parameters, one for each of a sample of N observations. (It's an obscure model, whose details I will omit here.) What is relevant, I think, is that I am estimating it in Mata with -optimize()-, with a d1 LL evaluator written completely in Mata. I obtain the variance-covariance matrix in Mata with

Code:
V = optimize_result_V_oim(S)

Now, I understand that computation time for V must be at least O(N^2), so I'm not surprised that this routine gets quite slow as N gets large. But there are calculations in the LL evaluator that involve multiplication of N x N matrices, so I'm surprised that calculating V takes as much time as (say) 3-4 calls to the LL evaluator. Is this indicative of something weird, or is it just what I should expect from the expense of calculating and inverting the Hessian?

Regards, Mike
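For readers wondering why forming V can rival several evaluator calls: when only first derivatives are supplied (as with a d1 evaluator), an optimizer commonly builds the Hessian by numerically differentiating the gradient, which costs roughly one extra gradient evaluation per parameter, and then inverts the negative Hessian for the OIM variance (an O(N^3) step). Below is a minimal sketch of that idea in generic Python/NumPy, under stated assumptions; it is not Stata's actual -optimize()- implementation, and the quadratic log likelihood and the helper names `gradient` and `hessian_by_fd` are hypothetical stand-ins.

```python
import numpy as np

# Stand-in for a d1 evaluator: returns the gradient of the log
# likelihood at parameter vector b. Here LL(b) = -0.5 * b' A b,
# a quadratic whose exact Hessian is -A, so we can check the result.
rng = np.random.default_rng(0)
N = 5
M = rng.standard_normal((N, N))
A = M @ M.T + N * np.eye(N)        # symmetric positive definite

def gradient(b):
    # gradient of LL(b) = -0.5 * b' A b  is  -A b
    return -A @ b

def hessian_by_fd(grad, b, h=1e-6):
    # Finite-difference the gradient column by column: one extra
    # gradient call per parameter, i.e. ~N evaluator calls just to
    # form the N x N Hessian.
    n = b.size
    H = np.empty((n, n))
    g0 = grad(b)
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        H[:, j] = (grad(b + e) - g0) / h
    return 0.5 * (H + H.T)          # symmetrize

b = rng.standard_normal(N)
H = hessian_by_fd(gradient, b)
V = np.linalg.inv(-H)               # OIM variance: inverse of -Hessian
print(np.allclose(V, np.linalg.inv(A), rtol=1e-4))
```

So even before the O(N^3) inversion, the Hessian step alone costs on the order of N gradient evaluations, which may explain why computing V takes as long as a handful of LL evaluator calls.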