  • Specificity/Sensitivity Tables after using Firth's method for Penalized Logistic Regression

    I have a dataset with 1,190 participants and am trying to predict a low-frequency event (n = 91) at follow-up using 8 predictors that showed significant differences/correlations at baseline. Using logistic regression, two of my predictors are significant and the overall model is significant, but the model is 0% sensitive: it couldn't predict a single person with the event at follow-up. It is 100% specific, but since so many people did not experience the event, I'm not all that excited. I tried using firthlogit and firthfit to account for the rarity of the event, but am not sure how to get these classification tables (estat classification is not the correct way to obtain them after firthlogit). Is there any way to get the post-estimation indices after firthlogit (including the classification statistics, ROC curve, and specificity/sensitivity plots)? Thank you!
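
    For concreteness, the workflow described above amounts to something like the sketch below, where outcome_fu and x1-x8 are placeholder names rather than the study's actual variables:

    * standard logistic regression and the usual postestimation indices
    logit outcome_fu x1 x2 x3 x4 x5 x6 x7 x8
    estat classification   // classification table (sensitivity, specificity, etc.)
    lroc                   // ROC curve
    lsens                  // sensitivity and specificity across probability cutoffs

    * Firth's penalized logistic regression (user-written -firthlogit- from SSC)
    firthlogit outcome_fu x1 x2 x3 x4 x5 x6 x7 x8
    * question: how to obtain the same postestimation indices after -firthlogit-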

  • #2
    Originally posted by Lourah Kelly:
    Is there any way to get the post-estimation indices after firthlogit (including the classification statistics, ROC curve, and specificity/sensitivity plots)?
    Take a look at the ancillary do-file named "SEMatch.do" that accompanies the user-written command -firthlogit-, which you can download from SSC. Its purpose is to show how to match the regression coefficient standard errors reported by other software packages' Firth logistic regression commands. But you can use the same tactic to get anything (any postestimation command, including -margins-) that is available after the official Stata -logit- or -logistic-.

    Using a toy example, I illustrate the tactic below for all of your requested postestimation indexes and graphs.

    .
    . version 16.0

    .
    . clear *

    .
    . quietly sysuse auto

    .
    . firthlogit foreign c.mpg, nolog

                                                    Number of obs     =         74
                                                    Wald chi2(1)      =       8.81
    Penalized log likelihood = -35.042846           Prob > chi2       =     0.0030

    ------------------------------------------------------------------------------
         foreign |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
             mpg |   .1507509    .050779     2.97   0.003     .0512259    .2502759
           _cons |  -4.160968   1.169985    -3.56   0.000    -6.454097   -1.867839
    ------------------------------------------------------------------------------

    .
    . *
    . * Begin here
    . *
    . tempname B

    . matrix define `B' = e(b)

    .
    . quietly logit foreign c.mpg, asis iterate(0) from(`B', copy) nolog

    .
    . lroc

    Logistic model for foreign

    number of observations =       74
    area under ROC curve   =   0.7286

    .
    . estat classification

    Logistic model for foreign

                  -------- True --------
    Classified |         D            ~D  |      Total
    -----------+--------------------------+-----------
         +     |         6             5  |         11
         -     |        16            47  |         63
    -----------+--------------------------+-----------
       Total   |        22            52  |         74

    Classified + if predicted Pr(D) >= .5
    True D defined as foreign != 0
    --------------------------------------------------
    Sensitivity                     Pr( +| D)   27.27%
    Specificity                     Pr( -|~D)   90.38%
    Positive predictive value       Pr( D| +)   54.55%
    Negative predictive value       Pr(~D| -)   74.60%
    --------------------------------------------------
    False + rate for true ~D        Pr( +|~D)    9.62%
    False - rate for true D         Pr( -| D)   72.73%
    False + rate for classified +   Pr(~D| +)   45.45%
    False - rate for classified -   Pr( D| -)   25.40%
    --------------------------------------------------
    Correctly classified                        71.62%
    --------------------------------------------------

    .
    . lsens

    .
    . exit

    end of do-file
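
    Applied to your own outcome and eight predictors, the same tactic would look something like the fragment below (again just a sketch, with outcome_fu and x1-x8 as placeholder names). One further note: -estat classification- classifies at a 0.5 probability cutoff by default, which with an event rate of 91/1,190 is likely why sensitivity comes out at 0%; its cutoff() option, and the -lsens- graph, let you examine other thresholds.

    firthlogit outcome_fu x1 x2 x3 x4 x5 x6 x7 x8, nolog

    * carry the penalized-likelihood coefficients over to official -logit-
    tempname B
    matrix define `B' = e(b)
    quietly logit outcome_fu x1 x2 x3 x4 x5 x6 x7 x8, asis iterate(0) from(`B', copy)

    * any -logit-/-logistic- postestimation command is now available
    estat classification                  // default 0.5 cutoff
    estat classification, cutoff(.08)     // e.g., a cutoff near the observed event rate
    lroc                                  // ROC curve and area under it
    lsens                                 // sensitivity and specificity across cutoffs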

    • #3
      Thank you! This is incredibly helpful. Unfortunately, I still have 0% sensitivity with my models, but I can see the tables. I appreciate your help!
