  • vce(robust) in mixed model (MLM)

    Hello everyone,

    I hope you are well.

    I have a question about mixed models.

    Here is my command:

    Code:
    mixed Qtobin_w mills centBWOMEN  c.centBWOMEN#i.quota BoardSize  indep CEOChairmanDuality SIZE_w lnAGE leverage_w  ESGScore uncemployement popgrwoth corruption GDPgrowth Dummylegalsystem quota i.Year i.NAICS2digit || CountryofHeadquarters: || entFE:, mle
    Here are the results:

    Code:
    mixed Qtobin_w  centBWOMEN  c.centBWOMEN#i.quota BoardSize  indep CEOChairmanDuality SIZE_w lnAGE leverage_w  ESGScore uncemployement popgrwoth corrup
    > tion GDPgrowth Dummylegalsystem quota i.Year i.NAICS2digit || CountryofHeadquarters: || entFE:, mle
    
    Performing EM optimization ...
    
    Performing gradient-based optimization: 
    Iteration 0:   log likelihood = -52779.705  
    Iteration 1:   log likelihood = -52779.705  
    
    Computing standard errors ...
    
    Mixed-effects ML regression                     Number of obs     =     48,134
    
            Grouping information
            -------------------------------------------------------------
                            |     No. of       Observations per group
             Group variable |     groups    Minimum    Average    Maximum
            ----------------+--------------------------------------------
               CountryofH~s |         40         10    1,203.3     17,582
                      entFE |      7,080          2        6.8         19
            -------------------------------------------------------------
    
                                                    Wald chi2(53)     =    4631.52
    Log likelihood = -52779.705                     Prob > chi2       =     0.0000
    
    ------------------------------------------------------------------------------------
              Qtobin_w | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
    -------------------+----------------------------------------------------------------
            centBWOMEN |   .1165522    .042965     2.71   0.007     .0323423    .2007621
                       |
    quota#c.centBWOMEN |
                    1  |   .2555075    .111079     2.30   0.021     .0377966    .4732183
                       |
             BoardSize |   .0097224   .0016394     5.93   0.000     .0065092    .0129356
                 indep |   .0178116    .027422     0.65   0.516    -.0359345    .0715577
    CEOChairmanDuality |   .0260314   .0101111     2.57   0.010      .006214    .0458488
                SIZE_w |   -.277437   .0065622   -42.28   0.000    -.2902987   -.2645752
                 lnAGE |  -.0843648    .008927    -9.45   0.000    -.1018613   -.0668682
            leverage_w |  -.0647433   .0052194   -12.40   0.000    -.0749731   -.0545135
              ESGScore |   .0028652   .0003365     8.52   0.000     .0022057    .0035246
        uncemployement |  -2.323016   .2430006    -9.56   0.000    -2.799288   -1.846744
             popgrwoth |   4.925222   .9433319     5.22   0.000     3.076325    6.774118
            corruption |  -.1158873    .025226    -4.59   0.000    -.1653292   -.0664453
             GDPgrowth |   .0258892   .0020749    12.48   0.000     .0218224    .0299559
      Dummylegalsystem |   -.091007    .218237    -0.42   0.677    -.5187437    .3367297
                 quota |  -.0210635   .0241421    -0.87   0.383    -.0683812    .0262542
    I specify || CountryofHeadquarters: first and then || entFE:, that is, the country level first and then the company level. Is that correct?

    Indeed, if I do the opposite, the number of groups is reported as the same at both levels (country and company):

    Code:
    . mixed Qtobin_w  centBWOMEN  c.centBWOMEN#i.quota BoardSize  indep CEOChairmanDuality SIZE_w lnAGE leverage_w  ESGScore uncemployement popgrwoth corrup
    > tion GDPgrowth Dummylegalsystem quota i.Year i.NAICS2digit  || entFE: || CountryofHeadquarters:, mle
    
    Performing EM optimization ...
    
    Performing gradient-based optimization: 
    Iteration 0:   log likelihood = -53063.497  (not concave)
    Iteration 1:   log likelihood = -53063.497  (backed up)
    
    Computing standard errors ...
    
    Mixed-effects ML regression                     Number of obs     =     48,134
    
            Grouping information
            -------------------------------------------------------------
                            |     No. of       Observations per group
             Group variable |     groups    Minimum    Average    Maximum
            ----------------+--------------------------------------------
                      entFE |      7,080          2        6.8         19
               CountryofH~s |      7,080          2        6.8         19
            -------------------------------------------------------------
    
                                                    Wald chi2(53)     =    4487.85
    Log likelihood = -53063.497                     Prob > chi2       =     0.0000
    
    ------------------------------------------------------------------------------------
              Qtobin_w | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
    -------------------+----------------------------------------------------------------
            centBWOMEN |   .0492723   .0427491     1.15   0.249    -.0345144    .1330589
                       |
    quota#c.centBWOMEN |
                    1  |   .0275504   .1095757     0.25   0.801    -.1872141    .2423148
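
    As a sanity check on the nesting, I suppose I could verify that each company (entFE) belongs to a single CountryofHeadquarters with something like the code below; same_country is just a helper variable I would create for this check:

    Code:
    bysort entFE (CountryofHeadquarters): gen byte same_country = CountryofHeadquarters[1] == CountryofHeadquarters[_N]
    tab same_country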

    Secondly, regarding vce(robust): is it really worth adding robust standard errors, given that they do not take the intra-group (intra-cluster) correlation structure into account?

    Is it better to use the vce(cluster) option? If so, which cluster variable should I specify: entFE or CountryofHeadquarters?
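
    In case it helps to see exactly what I have in mind, I assume the clustered version would be written roughly like this (my first command above, with the vce() option added; clustering on CountryofHeadquarters is only an example, since that choice is precisely what I am unsure about):

    Code:
    mixed Qtobin_w mills centBWOMEN c.centBWOMEN#i.quota BoardSize indep CEOChairmanDuality SIZE_w lnAGE leverage_w ESGScore uncemployement popgrwoth corruption GDPgrowth Dummylegalsystem quota i.Year i.NAICS2digit || CountryofHeadquarters: || entFE:, mle vce(cluster CountryofHeadquarters)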

    Finally, if I include the vce(robust) option in my analysis, the Prob > chi2 statistic does not appear (it does appear if I include only i.Year or only i.NAICS2digit, but not when both are in the model).
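
    For reference, the robust version I am referring to is simply my first command with vce(robust) added; the display lines are just a way to confirm from the stored results that the Wald statistic and its p-value really are missing:

    Code:
    mixed Qtobin_w mills centBWOMEN c.centBWOMEN#i.quota BoardSize indep CEOChairmanDuality SIZE_w lnAGE leverage_w ESGScore uncemployement popgrwoth corruption GDPgrowth Dummylegalsystem quota i.Year i.NAICS2digit || CountryofHeadquarters: || entFE:, mle vce(robust)
    display e(chi2)   // model Wald chi-squared
    display e(p)      // corresponding Prob > chi2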

    Thank you very much in advance,

    Loïc Dubois