
  • Why do a negative binomial regression model and its marginal effects produce the same output?

    Hi members,

    I'm currently running negative binomial regression models with three years of data, aiming to explore the association between "number of hospital visits" and "health insurance status". However, the -margins- command produces the same results as -xtnbreg- in both the fixed-effects and random-effects models. In addition, although the -outreg2- command runs in Stata, it does not export the results to Word correctly (only the results of two RE models and one FE model were recorded).

    admission_n is a count variable that records the number of hospital visits; it has excess zeros.

    . tab admission_n

    admission_n |      Freq.     Percent        Cum.
    ------------+-----------------------------------
              0 |     15,100       88.13       88.13
              1 |      1,592        9.29       97.42
              2 |        273        1.59       99.01
              3 |         91        0.53       99.54
              4 |         30        0.18       99.72
              5 |         25        0.15       99.87
              6 |         11        0.06       99.93
              7 |          1        0.01       99.94
              8 |          1        0.01       99.94
              9 |          2        0.01       99.95
             10 |          1        0.01       99.96
             14 |          1        0.01       99.96
             16 |          2        0.01       99.98
             20 |          1        0.01       99.98
             50 |          1        0.01       99.99
            120 |          2        0.01      100.00
    ------------+-----------------------------------
          Total |     17,134      100.00

    hos_status is a binary variable that indicates individuals' hospital insurance status: 0 = without cover, 1 = with cover.

     hos_status |      Freq.     Percent        Cum.
    ------------+-----------------------------------
              0 |      8,912       52.21       52.21
              1 |      8,156       47.79      100.00
    ------------+-----------------------------------
          Total |     17,068      100.00

    The other variables are socioeconomic confounders that need to be controlled for.
    My code is as follows:

    xtnbreg admission_n hos_status all_expenditure $xvars, re i(waveid)
    margins, dydx(*) post
    outreg2 using table8.doc, replace dec(3) ctitle(RE model) label
    xtnbreg admission_n hos_status all_expenditure $xvars_fe, fe i(waveid)
    margins, dydx(*)
    outreg2 using table8.doc, append dec(3) ctitle(FE model) label

    I wonder whether I used the wrong count-data model or the wrong commands. Any help would be appreciated. Thanks in advance!
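
    A side note on the export problem, offered as a guess rather than a diagnosis: -outreg2- exports whatever results are currently held in e(), so in the FE block above, where -margins- is not followed by -post-, it exports the -xtnbreg- results rather than the marginal effects. A sketch of a consistent FE block, untested on these data:

    xtnbreg admission_n hos_status all_expenditure $xvars_fe, fe i(waveid)
    margins, dydx(*) post
    outreg2 using table8.doc, append dec(3) ctitle(FE model) label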


    Here is one example of the FE model:

    . xtnbreg admission_n hos_status all_expenditure $xvars_fe, fe i(waveid)
    note: 4441 groups (4441 obs) dropped because of only one obs per group
    note: 3966 groups (9378 obs) dropped because of all zero outcomes
    note: edu_missing omitted because of collinearity

    Iteration 0: log likelihood = -1483.4742
    Iteration 1: log likelihood = -1419.8772
    Iteration 2: log likelihood = -1419.5117
    Iteration 3: log likelihood = -1419.483
    Iteration 4: log likelihood = -1419.4773
    Iteration 5: log likelihood = -1419.4761
    Iteration 6: log likelihood = -1419.4758
    Iteration 7: log likelihood = -1419.4757
    Iteration 8: log likelihood = -1419.4757

    Conditional FE negative binomial regression Number of obs = 3,246
    Group variable: waveid Number of groups = 1,309

    Obs per group:
    min = 2
    avg = 2.5
    max = 3

    Wald chi2(15) = 164.57
    Log likelihood = -1419.4757 Prob > chi2 = 0.0000

    -------------------------------------------------------------------------------------
    admission_n | Coef. Std. Err. z P>|z| [95% Conf. Interval]
    --------------------+----------------------------------------------------------------
    hos_status | .3485832 .1225072 2.85 0.004 .1084735 .5886928
    all_expenditure | -2.13e-07 1.48e-06 -0.14 0.886 -3.11e-06 2.69e-06
    hgage | .0415983 .0760072 0.55 0.584 -.107373 .1905696
    age_squared | -.0012184 .0010812 -1.13 0.260 -.0033376 .0009007
    total_hinc | 1.24e-07 6.59e-07 0.19 0.850 -1.17e-06 1.42e-06
    Defacto | -.5740833 .1318192 -4.36 0.000 -.8324442 -.3157225
    SeparatedDivorced | -.3016688 .1823967 -1.65 0.098 -.6591598 .0558221
    Widowed | -.2946478 .5885062 -0.50 0.617 -1.448099 .8588031
    NeverMarriedDefacto | -.9767331 .1707227 -5.72 0.000 -1.311343 -.6421227
    marital_missing | 13.00722 578.841 0.02 0.982 -1121.5 1147.515
    Certificate | .272721 .1993311 1.37 0.171 -.1179609 .6634028
    Dipl | .0580765 .2651413 0.22 0.827 -.4615908 .5777438
    Bach | .2827035 .2770802 1.02 0.308 -.2603637 .8257707
    edu_missing | 0 (omitted)
    EmployerSelf | .1366814 .1764405 0.77 0.439 -.2091356 .4824985
    employment_missing | .9640473 .0974532 9.89 0.000 .7730426 1.155052
    _cons | -.5128373 1.345382 -0.38 0.703 -3.149737 2.124062
    -------------------------------------------------------------------------------------

    . margins, dydx(*)

    Average marginal effects Number of obs = 3,246
    Model VCE : OIM

    Expression : Linear prediction, predict()
    dy/dx w.r.t. : hos_status all_expenditure hgage age_squared total_hinc Defacto
    SeparatedDivorced Widowed NeverMarriedDefacto marital_missing
    Certificate Dipl Bach edu_missing EmployerSelf employment_missing

    -------------------------------------------------------------------------------------
    | Delta-method
    | dy/dx Std. Err. z P>|z| [95% Conf. Interval]
    --------------------+----------------------------------------------------------------
    hos_status | .3485832 .1225072 2.85 0.004 .1084735 .5886928
    all_expenditure | -2.13e-07 1.48e-06 -0.14 0.886 -3.11e-06 2.69e-06
    hgage | .0415983 .0760072 0.55 0.584 -.107373 .1905696
    age_squared | -.0012184 .0010812 -1.13 0.260 -.0033376 .0009007
    total_hinc | 1.24e-07 6.59e-07 0.19 0.850 -1.17e-06 1.42e-06
    Defacto | -.5740833 .1318192 -4.36 0.000 -.8324442 -.3157225
    SeparatedDivorced | -.3016688 .1823967 -1.65 0.098 -.6591598 .0558221
    Widowed | -.2946478 .5885062 -0.50 0.617 -1.448099 .8588031
    NeverMarriedDefacto | -.9767331 .1707227 -5.72 0.000 -1.311343 -.6421227
    marital_missing | 13.00722 578.841 0.02 0.982 -1121.5 1147.515
    Certificate | .272721 .1993311 1.37 0.171 -.1179609 .6634028
    Dipl | .0580765 .2651413 0.22 0.827 -.4615908 .5777438
    Bach | .2827035 .2770802 1.02 0.308 -.2603637 .8257707
    edu_missing | 0 (omitted)
    EmployerSelf | .1366814 .1764405 0.77 0.439 -.2091356 .4824985
    employment_missing | .9640473 .0974532 9.89 0.000 .7730426 1.155052
    -------------------------------------------------------------------------------------
    Last edited by Geralt Ji; 16 Apr 2022, 04:45.

  • #2
    If you look at -help xtnbreg postestimation##margins-, you will see that the default statistic calculated by -margins- after -xtnbreg- is xb. And in any regression model, the marginal effect on xb is exactly the same thing as the regression coefficient. In the same help file you will also see what alternative statistics are available. Probably you were hoping for a marginal effect on the number of events or on the incidence rate. Because you are using a conditional fixed-effects estimator, those are not possible. You can get marginal effects on nu0 or iru0, which are the number of events or the incidence rate conditional on the fixed effect being zero. But that is probably a meaningless statistic in itself, and at best one would not be able to say to what entities it would apply. These conditional fixed-effects estimators have a lot of serious limitations, notwithstanding their popularity.
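
    For concreteness, a minimal sketch of those alternatives, reusing the model from the original post (with the caveat above that the resulting statistics are of doubtful use):

    xtnbreg admission_n hos_status all_expenditure $xvars_fe, fe i(waveid)
    * marginal effect on the number of events, conditional on the fixed effect being zero
    margins, dydx(hos_status) predict(nu0)
    * marginal effect on the incidence rate, conditional on the fixed effect being zero
    margins, dydx(hos_status) predict(iru0)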



    • #3
      Dear Geralt Ji,

      Clyde has already provided excellent advice, and I agree that estimating meaningful marginal effects in this context is impossible; that is illustrated here.

      Additionally, I would point out that the FE NB model is not a real fixed effects model and should be avoided. See, for example:

      Guimarães, P. (2008), “The fixed effects negative binomial model revisited,” Economics Letters, 99: 63–66.

      Best wishes,

      Joao



      • #4
        Thanks to Clyde Schechter and Joao Santos Silva for the detailed explanations! This is my first time dealing with over-dispersed count variables. I still wonder whether it is feasible to run a negative binomial regression with the random-effects option (since marginal effects on nu0 and iru0 are meaningless, does this mean that I can directly interpret the coefficients without the -margins- command?), or whether I should use other econometric models, because I still want to interpret the impact of insurance status (binary, 0/1) on the number of hospital visits (with excess zeros).



        • #5
          Dear Geralt Ji,

          If that is what you want to do, I would use Poisson with FE, because that is very robust. Note that you do not even know whether the data are conditionally over-dispersed, which is what matters, so the arguments in favour of the NB model are really weak.

          Best wishes,

          Joao
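
          A minimal sketch of that suggestion, reusing the variable names from the original post; -vce(robust)- requests standard errors that do not rely on the Poisson mean = variance assumption:

          xtset waveid
          xtpoisson admission_n hos_status all_expenditure $xvars_fe, fe vce(robust)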



          • #6
            Originally posted by Joao Santos Silva
            Dear Geralt Ji,

            If that is what you want to do, I would use Poisson with FE, because that is very robust. Note that you do not even know whether the data are conditionally over-dispersed, which is what matters, so the arguments in favour of the NB model are really weak.

            Best wishes,

            Joao
            Thanks, Joao Santos Silva! I'm confused about the choice of model, as the Poisson model is often considered restrictive: I'm afraid it may not handle the excess zeros well, and it insists that mean = variance = λ. That's why I chose the NB2 model at the beginning.
            Given these restrictive assumptions, would you mind telling me in more detail why Poisson with FE is more suitable in this example?
            I also wonder whether it is feasible to run Poisson with RE, because I want to compare the results of the RE and FE models.
            I would appreciate your help.
            Last edited by Geralt Ji; 19 Apr 2022, 10:18.



            • #7
              Dear Geralt Ji

              Please check:

              Wooldridge, J. M. (1999), “Distribution-Free Estimation of Some Nonlinear Panel Data Models,” Journal of Econometrics, 90: 77–97.

              The assumption that the mean is equal to the variance has little importance in this context. Also, forget about RE models; they rely on very strong assumptions.

              Best wishes,

              Joao
              Last edited by Joao Santos Silva; 19 Apr 2022, 15:16.
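
              To make the interpretation concrete, a sketch under the same assumptions as above: in an FE Poisson model the coefficient on a binary regressor is a semi-elasticity, so the proportional effect of insurance cover on expected visits is exp(b) - 1, and -nlcom- provides a delta-method standard error for it:

              xtpoisson admission_n hos_status all_expenditure $xvars_fe, fe vce(robust)
              * percentage change in expected visits associated with hos_status = 1
              nlcom 100*(exp(_b[hos_status]) - 1)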

