
  • firthlogit


    If I have a small and highly unbalanced sample, is it correct to use firthlogit instead of simple logit?
    Thanks to everybody

  • #2
    Well, ordinary (maximum likelihood) logistic regression works poorly with small samples and unbalanced outcome distributions. -firthlogit- is one alternative, and is probably the best in most circumstances. If your data set is extreme in its smallness and imbalance, you might also consider exact logistic regression (help -exlogistic-). While -exlogistic- is, in principle, suitable for any logistic model, its use is usually restricted to small data sets, because in even moderately sized ones the memory and computing-time burdens make it impractical or infeasible.
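
    As a sketch, assuming a binary outcome y and predictors x1 and x2 (hypothetical variable names), the two alternatives would look like this:

    ```stata
    * -firthlogit- is community-contributed; install it once from SSC
    ssc install firthlogit

    * Firth's penalized-likelihood logistic regression
    firthlogit y x1 x2

    * Exact logistic regression (built in); feasible only for small data sets
    exlogistic y x1 x2
    ```

    Both commands report results on the log-odds scale, like -logit-.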

    Comment


    • #3
      I also tried the bootstrap with 5,000 replications,
      but the proportion of valid replications did not exceed 80%. Is that acceptable? Thanks a million

      Comment


      • #4
        I would not use the bootstrap in this case. In your other questions you said you had 54 observations with only four 1s and fifty 0s. That is a prime case for either -firthlogit- or -exlogistic-. Either solves the actual problem that you have: it is hard to even get point estimates from a regular maximum-likelihood logit model with such a small and unbalanced sample. You saw this in all those invalid replications when you ran the bootstrap. The bootstrap assumes you can get point estimates but have trouble estimating the sampling distribution of those point estimates (and thus confidence intervals). So the bootstrap does not solve the problem you have.
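        To see the scale of the problem: with only four 1s among 54 observations, the probability that a given bootstrap resample contains no 1s at all is (50/54)^54, roughly 1.6 percent, and many more resamples will contain so few 1s that the logit is separated or does not converge. A quick check in Stata:

        ```stata
        * probability that a bootstrap resample of 54 draws contains zero 1s
        display (50/54)^54
        ```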
        ---------------------------------
        Maarten L. Buis
        University of Konstanz
        Department of history and sociology
        box 40
        78457 Konstanz
        Germany
        http://www.maartenbuis.nl
        ---------------------------------

        Comment


        • #5
          thank you

          Comment
