  • Why does _optimize() issue an error message and abort?

    Dear Statalist,

    I am writing an estimation command and would like to modify the error message that optimize() issues when the numerical derivative cannot be calculated. Specifically, I would like to add a recommendation for the user on how to resolve the error.

    I've tried to solve my problem by using _optimize(), which is supposed to return a nonzero error code in case of an error rather than abort. If there is an error, I could then print the error text along with a recommendation for the user. However, _optimize() aborts when it "could not calculate numerical derivatives -- discontinuous region with missing values encountered".

    A minimal working example is:

    Code:
    mata:
    mata clear
    
    // deliberately nonsensical evaluator: the constant objective provokes
    // the "could not calculate numerical derivatives" error
    void myFunction(real scalar todo, real scalar p, score, g, H) {
        score = 2
    }
    
    S  = optimize_init()
    optimize_init_params(S, 0)
    optimize_init_evaluator(S, &myFunction())
    _optimize(S)
    
    if (ec = optimize_result_errorcode(S)) {
        errprintf("{p}\n")
        errprintf("%s\n", optimize_result_errortext(S))
        errprintf("\n Print some recommendation")
        errprintf("{p_end}\n")
        exit(optimize_result_returncode(S))
        /*NOTREACHED*/
    }
    
    result = optimize_result_params(S)
    
    printf("Result is: %5.2f", result)
    
    end
    Why does _optimize() abort? What am I doing wrong?

    I'm using Stata 14 on Windows 7.

    Thank you very much for your help!

    Christoph
    Last edited by Christoph Halbmeier; 16 Jul 2018, 06:39.

  • #2
    Christoph Halbmeier --

    When I try to run your code, I find that it blows up because Stata "cannot calculate numerical derivatives." But if you change the optimization technique to one that doesn't require derivatives, I think it works. For example, in the following I switch the method to Nelder-Mead, which is derivative-free:

    Code:
    mata:
    mata clear
    
    void myFunction(real scalar todo, real scalar p, score, g, H) {
        score = 2
    }
    
    S  = optimize_init()
    optimize_init_params(S, 0)
    optimize_init_evaluator(S, &myFunction())
    optimize_init_technique(S, "nm")         // switch to derivative-free Nelder-Mead
    optimize_init_nmsimplexdeltas(S, .1)     // initial simplex deltas required by "nm"
    _optimize(S)
    
    if (ec = optimize_result_errorcode(S)) {
        errprintf("{p}\n")
        errprintf("%s\n", optimize_result_errortext(S))
        errprintf("\n Print some recommendation")
        errprintf("{p_end}\n")
        exit(optimize_result_returncode(S))
        /*NOTREACHED*/
    }
    
    result = optimize_result_params(S)
    
    printf("Result is: %5.2f", result)
    
    end
    And the code runs on my machine (giving, of course, a nonsensical answer). Hope that helps!

    Matthew J. Baker



    • #3
      Hello Matthew,

      Thank you for your help! Yes, the intention of this nonsensical optimization problem was to produce the "cannot calculate numerical derivatives" error. My real optimization problem also produces this error in rare cases, and I thought it would be good to display an error message telling the user how to work around it. For the real problem, choosing a different initial value for the optimization helps (see the sketch below).

      But yes, maybe it's better if I use a different optimization algorithm and/or formulate the optimization problem more precisely.
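
      Concretely, here is roughly what I have in mind for the starting-value approach. This is only a rough sketch: the quadratic evaluator and the candidate starting values below are placeholders for my real problem.

      Code:
      mata:
      mata clear
      
      // placeholder objective with its maximum at p = 1
      void myEval(real scalar todo, real scalar p, v, g, H) {
          v = -((p - 1)^2)
      }
      
      starts = (0, .5, 2)        // candidate initial values, tried in turn
      ec = 1
      for (i = 1; i <= cols(starts) & ec != 0; i++) {
          S = optimize_init()
          optimize_init_evaluator(S, &myEval())
          optimize_init_params(S, starts[i])
          _optimize(S)
          ec = optimize_result_errorcode(S)
      }
      
      if (ec) {
          errprintf("%s\n", optimize_result_errortext(S))
          errprintf("try different starting values for the optimization\n")
          exit(optimize_result_returncode(S))
      }
      
      printf("Result is: %5.2f\n", optimize_result_params(S))
      
      end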



      • #4
        Hi Christoph Halbmeier,

        Elaborating on Matthew's suggestion, in many cases a derivative-free optimization routine would be an advisable solution. However, I think Nelder-Mead is the only derivative-free optimizer currently supported by optimize(), and it tends to perform quite badly on one-dimensional problems, so whether it is a wise choice will depend heavily on the particular problem you're considering.

        Nelder-Mead can also in certain cases be quite a bit slower than the other optimizers available via optimize(), so I wouldn't rule out your initial solution as a good one, particularly if a change in the starting point can improve convergence.

        For your code, I don't actually find that _optimize() aborts (on Stata 15) - it may look like it does because of the printed warning "could not calculate numerical derivatives -- discontinuous region with missing values encountered", but it should run to completion. So the code you've suggested is perfectly placed to recommend how to proceed. Unfortunately, if you're looking to suppress the warning, I'm not aware of a way to do this.
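
        If it helps, one option would be to keep your error handling but, when the error code is nonzero, retry before giving up - for instance with Nelder-Mead as Matthew suggested, or with a fresh starting value. A rough sketch, reusing your toy evaluator:

        Code:
        mata:
        mata clear
        
        void myFunction(real scalar todo, real scalar p, score, g, H) {
            score = 2
        }
        
        S = optimize_init()
        optimize_init_params(S, 0)
        optimize_init_evaluator(S, &myFunction())
        _optimize(S)
        
        // first attempt failed: print the error text, recommend a fix,
        // and retry with the derivative-free technique
        if (optimize_result_errorcode(S)) {
            errprintf("%s\n", optimize_result_errortext(S))
            errprintf("retrying with derivative-free Nelder-Mead\n")
            optimize_init_technique(S, "nm")
            optimize_init_nmsimplexdeltas(S, .1)
            _optimize(S)
        }
        
        // still failing after the retry: give up
        if (optimize_result_errorcode(S)) {
            errprintf("%s\n", optimize_result_errortext(S))
            exit(optimize_result_returncode(S))
        }
        
        printf("Result is: %5.2f\n", optimize_result_params(S))
        
        end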
