
  • Seemingly Unrelated Poisson Regression with Fixed Effect

    Dear All,

    As the title says, I want to estimate a seemingly unrelated regression (SUR) model for Poisson regressions with fixed effects.

    I've been studying this for a while, but could not find a commonly accepted way to do this.

    Any suggestions on whether this is possible at all? If so, where should I start?

    Thanks for your time and knowledge in advance.

  • #2
    Follow the link in https://www.statalist.org/forums/for...after-ppmlhdfe



    • #3
      Originally posted by Andrew Musau View Post
      Hi Andrew, thanks for the reply. I checked the link you posted, but it seems that question was never answered. Also, my question is different from the OP's.

      I'm really new to this stuff, could you please be more specific?

      Thank you so much for your time and knowledge.



      • #4
        Originally posted by 高佳 View Post
        Also, my question is different from the OP's.
        Look at the datasets in

        Code:
        help xtpoisson
        and then frame your question in terms of a data example.
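
        For instance, a minimal starting point (a sketch, assuming the ships panel dataset used in the xtpoisson examples) would be:

        Code:
        webuse ships, clear   // panel dataset used in the xtpoisson documentation
        xtset ship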



        • #5
          Originally posted by Andrew Musau View Post

          Look at the datasets in

          Code:
          help xtpoisson
          and then frame your question in terms of a data example.
          Hi Andrew, thanks for the reply. I checked the datasets in xtpoisson but could not find a setting (with two dependent variables) that could be used for a seemingly unrelated regression. Any suggestions?



          • #6
            Originally posted by 高佳 View Post
            I checked the datasets in xtpoisson but could not find a setting (with two dependent variables) that could be used for a seemingly unrelated regression. Any suggestions?
            You can generate a second dependent variable using Stata's random-number functions. If the desired dependent variable is a count variable taking values between 1 and 30, e.g.,

            Code:
            set seed 02132023
            gen outcome2= runiformint(1, 30)
            Then present an example based on this.



            • #7
              Code:
              webuse ships, clear
              
              set seed 02132023
              gen outcome2= runiformint(1, 30)
              
              xtset ship
              
              // poisson regression of the first dependent variable
              xtpoisson accident op_75_79 co_65_69 co_70_74 co_75_79, exposure(service) fe
              
              // poisson regression of the second dependent variable
              xtpoisson outcome2 op_75_79 co_65_69 co_70_74 co_75_79, exposure(service) fe
              Question: how can I estimate a seemingly unrelated regression for the above two xtpoisson regressions?
              Last edited by 高佳; 14 Feb 2023, 00:00.



              • #8
                Note that ppmlhdfe is from SSC.

                Code:
                webuse ships, clear
                
                set seed 02132023
                gen outcome2= runiformint(1, 30)
                
                *poisson regression of the first dependent variable
                ppmlhdfe accident op_75_79 co_65_69 co_70_74 co_75_79, absorb(ship) 
                
                *poisson regression of the second dependent variable
                ppmlhdfe outcome2 op_75_79 co_65_69 co_70_74 co_75_79, absorb(ship) 
                
                rename (accident outcome2) depvar=
                gen long obs=_n
                reshape long depvar, i(obs) j(which) string
                encode which, gen(Which)
                gen ship1= 1.Which#c.ship
                gen ship2= 2.Which#c.ship
                ppmlhdfe depvar i.Which#(c.op_75_79 c.co_65_69 c.co_70_74 c.co_75_79), absorb(ship1 ship2)
                Res.:

                Code:
                .
                . ppmlhdfe accident op_75_79 co_65_69 co_70_74 co_75_79, absorb(ship) 
                Iteration 1:   deviance = 1.6167e+02  eps = .         iters = 1    tol = 1.0e-04  min(eta) =  -2.14  P  
                Iteration 2:   deviance = 1.3971e+02  eps = 1.57e-01  iters = 1    tol = 1.0e-04  min(eta) =  -2.56      
                Iteration 3:   deviance = 1.3909e+02  eps = 4.46e-03  iters = 1    tol = 1.0e-04  min(eta) =  -2.69      
                Iteration 4:   deviance = 1.3909e+02  eps = 1.15e-05  iters = 1    tol = 1.0e-04  min(eta) =  -2.70      
                Iteration 5:   deviance = 1.3909e+02  eps = 2.66e-10  iters = 1    tol = 1.0e-05  min(eta) =  -2.70   S O
                ------------------------------------------------------------------------------------------------------------
                (legend: p: exact partial-out   s: exact solver   h: step-halving   o: epsilon below tolerance)
                Converged in 5 iterations and 5 HDFE sub-iterations (tol = 1.0e-08)
                
                HDFE PPML regression                              No. of obs      =         34
                Absorbing 1 HDFE group                            Residual df     =         25
                                                                  Wald chi2(4)    =       8.14
                Deviance             =  139.0852637               Prob > chi2     =     0.0866
                Log pseudolikelihood = -118.4758775               Pseudo R2       =     0.6674
                ------------------------------------------------------------------------------
                             |               Robust
                    accident |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                -------------+----------------------------------------------------------------
                    op_75_79 |   .2928003   .2736869     1.07   0.285    -.2436162    .8292168
                    co_65_69 |   .5824489   .3063304     1.90   0.057    -.0179477    1.182846
                    co_70_74 |   .4627844   .3591039     1.29   0.197    -.2410462    1.166615
                    co_75_79 |  -.1951267   .3849763    -0.51   0.612    -.9496664     .559413
                       _cons |    2.48605   .3500144     7.10   0.000     1.800034    3.172066
                ------------------------------------------------------------------------------
                
                Absorbed degrees of freedom:
                -----------------------------------------------------+
                 Absorbed FE | Categories  - Redundant  = Num. Coefs |
                -------------+---------------------------------------|
                        ship |         5           0           5     |
                -----------------------------------------------------+
                
                .
                .
                .
                . *poisson regression of the second dependent variable
                
                .
                . ppmlhdfe outcome2 op_75_79 co_65_69 co_70_74 co_75_79, absorb(ship) 
                Iteration 1:   deviance = 1.1280e+02  eps = .         iters = 1    tol = 1.0e-04  min(eta) =   0.23  P  
                Iteration 2:   deviance = 1.1229e+02  eps = 4.48e-03  iters = 1    tol = 1.0e-04  min(eta) =   0.19      
                Iteration 3:   deviance = 1.1229e+02  eps = 1.12e-06  iters = 1    tol = 1.0e-04  min(eta) =   0.19      
                Iteration 4:   deviance = 1.1229e+02  eps = 8.50e-14  iters = 1    tol = 1.0e-05  min(eta) =   0.19      
                Iteration 5:   deviance = 1.1229e+02  eps = 1.14e-16  iters = 1    tol = 1.0e-07  min(eta) =   0.19   S O
                ------------------------------------------------------------------------------------------------------------
                (legend: p: exact partial-out   s: exact solver   h: step-halving   o: epsilon below tolerance)
                Converged in 5 iterations and 5 HDFE sub-iterations (tol = 1.0e-08)
                
                HDFE PPML regression                              No. of obs      =         40
                Absorbing 1 HDFE group                            Residual df     =         31
                                                                  Wald chi2(4)    =      10.65
                Deviance             =  112.2943475               Prob > chi2     =     0.0308
                Log pseudolikelihood = -145.4744978               Pseudo R2       =     0.0997
                ------------------------------------------------------------------------------
                             |               Robust
                    outcome2 |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                -------------+----------------------------------------------------------------
                    op_75_79 |   .0759859    .127497     0.60   0.551    -.1739037    .3258755
                    co_65_69 |  -.2307098   .1882286    -1.23   0.220     -.599631    .1382115
                    co_70_74 |  -.5483727   .1701134    -3.22   0.001    -.8817889   -.2149565
                    co_75_79 |   -.218131   .1553536    -1.40   0.160    -.5226185    .0863565
                       _cons |   2.958756   .1406329    21.04   0.000     2.683121    3.234391
                ------------------------------------------------------------------------------
                
                Absorbed degrees of freedom:
                -----------------------------------------------------+
                 Absorbed FE | Categories  - Redundant  = Num. Coefs |
                -------------+---------------------------------------|
                        ship |         5           0           5     |
                -----------------------------------------------------+
                
                .
                .
                
                
                .
                . ppmlhdfe depvar i.Which#(c.op_75_79 c.co_65_69 c.co_70_74 c.co_75_79), absorb(ship1 ship2)
                Iteration 1:   deviance = 2.7726e+02  eps = .         iters = 2    tol = 1.0e-04  min(eta) =  -1.68  P  
                Iteration 2:   deviance = 2.5243e+02  eps = 9.83e-02  iters = 2    tol = 1.0e-04  min(eta) =  -2.22      
                Iteration 3:   deviance = 2.5139e+02  eps = 4.17e-03  iters = 2    tol = 1.0e-04  min(eta) =  -2.42      
                Iteration 4:   deviance = 2.5138e+02  eps = 2.86e-05  iters = 2    tol = 1.0e-04  min(eta) =  -2.44      
                Iteration 5:   deviance = 2.5138e+02  eps = 3.23e-09  iters = 2    tol = 1.0e-05  min(eta) =  -2.44      
                Iteration 6:   deviance = 2.5138e+02  eps = 0.00e+00  iters = 1    tol = 1.0e-06  min(eta) =  -2.44   S  
                Iteration 7:   deviance = 2.5138e+02  eps = 0.00e+00  iters = 1    tol = 1.0e-09  min(eta) =  -2.44   S O
                ------------------------------------------------------------------------------------------------------------
                (legend: p: exact partial-out   s: exact solver   h: step-halving   o: epsilon below tolerance)
                Converged in 7 iterations and 12 HDFE sub-iterations (tol = 1.0e-08)
                
                HDFE PPML regression                              No. of obs      =         74
                Absorbing 2 HDFE groups                           Residual df     =         56
                                                                  Wald chi2(8)    =      19.04
                Deviance             =  251.3796112               Prob > chi2     =     0.0146
                Log pseudolikelihood = -263.9503753               Pseudo R2       =     0.5091
                ----------------------------------------------------------------------------------
                                 |               Robust
                          depvar |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
                -----------------+----------------------------------------------------------------
                Which#c.op_75_79 |
                       accident  |   .2928003   .2714732     1.08   0.281    -.2392774     .824878
                       outcome2  |   .0759859   .1267526     0.60   0.549    -.1724446    .3244164
                                 |
                Which#c.co_65_69 |
                       accident  |   .5824489   .3038526     1.92   0.055    -.0130912    1.177989
                       outcome2  |  -.2307098   .1871295    -1.23   0.218    -.5974769    .1360573
                                 |
                Which#c.co_70_74 |
                       accident  |   .4627844   .3561992     1.30   0.194    -.2353533    1.160922
                       outcome2  |  -.5483727   .1691201    -3.24   0.001    -.8798421   -.2169033
                                 |
                Which#c.co_75_79 |
                       accident  |  -.1951267   .3818626    -0.51   0.609    -.9435637    .5533103
                       outcome2  |   -.218131   .1544465    -1.41   0.158    -.5208406    .0845786
                                 |
                           _cons |    2.78843   .1537793    18.13   0.000     2.487028    3.089832
                ----------------------------------------------------------------------------------
                
                Absorbed degrees of freedom:
                -----------------------------------------------------+
                 Absorbed FE | Categories  - Redundant  = Num. Coefs |
                -------------+---------------------------------------|
                       ship1 |         6           0           6     |
                       ship2 |         6           2           4     |
                -----------------------------------------------------+
                
                .
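
                Since both equations are estimated jointly, cross-equation hypotheses can then be tested directly with -test-. A sketch (given the encoding above, where Which equals 1 for accident and 2 for outcome2):

                Code:
                * test equality of the op_75_79 coefficient across the two equations
                test 1.Which#c.op_75_79 = 2.Which#c.op_75_79
                
                * joint test of equal coefficients across equations for all four regressors
                test (1.Which#c.op_75_79 = 2.Which#c.op_75_79) (1.Which#c.co_65_69 = 2.Which#c.co_65_69) ///
                     (1.Which#c.co_70_74 = 2.Which#c.co_70_74) (1.Which#c.co_75_79 = 2.Which#c.co_75_79)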



                • #9
                  Hi Andrew, thanks so much. Could you please help me to understand how the procedure you presented is a seemingly unrelated regression? I'm new to this material.



                  • #10
                    The Stata manual entries for the sureg and suest commands have good discussions of this topic. See https://www.stata.com/manuals/rsureg.pdf and https://www.stata.com/manuals/rsuest.pdf, respectively.
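
                    With only a few fixed effects, the general idea can also be illustrated with suest itself (a rough sketch based on the example in #7; this route is not practical when many fixed effects must be absorbed, which is where the stacked ppmlhdfe approach above comes in):

                    Code:
                    webuse ships, clear
                    set seed 02132023
                    gen outcome2 = runiformint(1, 30)
                    
                    * separate Poisson fits, entering the ship fixed effects as indicators
                    poisson accident op_75_79 co_65_69 co_70_74 co_75_79 i.ship, exposure(service)
                    estimates store eq1
                    poisson outcome2 op_75_79 co_65_69 co_70_74 co_75_79 i.ship, exposure(service)
                    estimates store eq2
                    
                    * combine the fits, allowing for cross-equation covariance of the estimates
                    suest eq1 eq2
                    * cross-equation restrictions can then be tested with -test-,
                    * using the equation names that suest reports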
                    Last edited by Andrew Musau; 15 Feb 2023, 00:33.



                    • #11
                      Thanks. I'll check these and your code above and get back to you soon.



                      • #12
                        Dear Andrew Musau, do you have a moment to recast your example above to fit the case in which the stacking is required in order to test the joint significance of one key independent variable across a sequence of models? I'm trying to replicate it, but the estimates are not the same, so I'm probably making some elementary mistake there. Many thanks.

                        [edit]

                        I'm afraid I might have a collinearity problem. As I need to test the joint significance of not two but six coefficients (from six slightly different discretizations of an underlying continuous treatment variable), I have to stack six models. While all six treatment effects are estimated, some of the interaction terms with the covariates are dropped because of collinearity, which I suppose contaminates all the other estimates. Any suggestion on how best to proceed in this case? I'm also using ppmlhdfe, with a substantial number of fixed effects to absorb, and not much time to devote to this problem, so using suest or switching commands is not really a feasible option.
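
                        For concreteness, the kind of joint test I am after would look something like this (a sketch with hypothetical names, where stack indexes the six stacked models and treat is the key regressor):

                        Code:
                        * joint significance of the key regressor across the six stacked equations
                        test 1.stack#c.treat 2.stack#c.treat 3.stack#c.treat 4.stack#c.treat 5.stack#c.treat 6.stack#c.treat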
                        Last edited by Matteo Pinna Pintor; 09 Apr 2025, 07:42.
                        I'm using StataNow/MP 18.5



                        • #13
                          OK, perhaps solved as far as the practical purpose of replicating the results is concerned: I must have made some factor-variable notation mistakes, since the reported estimates are now identical. However, a strange behavior persists. The stacked regression fails to exclude the base categories of my categorical covariates and instead omits the coefficients for the highest-valued categories because of collinearity. It also happens if I explicitly specify the base level using

                          Code:
                          ib0.sex#i.Stack
                          (where Stack is the stacking variable). I guess it has to do with this issue.
                          Last edited by Matteo Pinna Pintor; 09 Apr 2025, 10:00.
                          I'm using StataNow/MP 18.5



                          • #14
                            With factor variables, it is problematic to specify a base level for an interaction that is not accompanied by the corresponding main effects. This issue has been discussed here: https://www.statalist.org/forums/for...nuous-variable. A workaround is to use the omission operator to omit the relevant interaction term. Using the example from your linked thread, compare:

                            Code:
                            use http://www.stata-press.com/data/r15/nlswork, clear
                            bys idcode: egen any_union = max(union)
                            replace any_union = 0 if missing(any_union)
                            * Reduce number of unique idcodes so areg can actually run
                            keep if idcode <= 100
                            
                            areg ln_wage i.idcode ib68.year#c.any_union, a(year) noomitted
                            areg ln_wage i.idcode i.year#c.any_union o68.year#o.any_union , a(year) noomitted

                            Res.:

                            Code:
                            . areg ln_wage i.idcode ib68.year#c.any_union, a(year) noomitted
                            note: 88.year#c.any_union omitted because of collinearity.
                            
                            Linear regression, absorbing indicators             Number of obs     =    578
                            Absorbed variable: year                             No. of categories =     15
                                                                                F(103, 460)       =   9.63
                                                                                Prob > F          = 0.0000
                                                                                R-squared         = 0.7160
                                                                                Adj R-squared     = 0.6438
                                                                                Root MSE          = 0.2677
                            
                            ----------------------------------------------------------------------------------
                                     ln_wage | Coefficient  Std. err.      t    P>|t|     [95% conf. interval]
                            -----------------+----------------------------------------------------------------
                                      idcode |
                                          2  |  -.3700416    .109583    -3.38   0.001     -.585387   -.1546961
                                          3  |  -.5378929   .1338261    -4.02   0.000    -.8008791   -.2749066
                                          4  |  -.0406853   .1122069    -0.36   0.717    -.2611869    .1798164
                                          5  |   -.251427   .1433988    -1.75   0.080    -.5332248    .0303708
                                          6  |  -.2122527   .1041979    -2.04   0.042    -.4170155   -.0074899
                                          7  |  -.7857883   .1480545    -5.31   0.000    -1.076735   -.4948412
                                          9  |   .0222342   .1073136     0.21   0.836    -.1886514    .2331197
                                         10  |  -.7786264   .1485312    -5.24   0.000     -1.07051   -.4867427
                                         12  |   .6357693   .2162199     2.94   0.003     .2108681    1.060671
                                         13  |   .1835459   .1121797     1.64   0.102    -.0369022     .403994
                                         14  |  -.1919685   .2162199    -0.89   0.375    -.6168697    .2329327
                                         15  |   .3399096   .1233621     2.76   0.006     .0974866    .5823327
                                         16  |    .007062   .1361566     0.05   0.959    -.2605041    .2746281
                                         17  |     .03596   .1609364     0.22   0.823    -.2803017    .3522218
                                         18  |  -.6834794   .1731004    -3.95   0.000    -1.023645   -.3433138
                                         19  |  -.0525687   .1095709    -0.48   0.632    -.2678902    .1627528
                                         20  |   .0452353   .1073136     0.42   0.674    -.1656503    .2561208
                                         21  |  -.2659091   .1691299    -1.57   0.117    -.5982721    .0664538
                                         22  |   .0130551   .1363353     0.10   0.924    -.2548621    .2809723
                                         23  |  -.2644427   .1564131    -1.69   0.092    -.5718155    .0429301
                                         24  |   .0290883   .1041979     0.28   0.780    -.1756745    .2338511
                                         25  |   .0095992   .1362608     0.07   0.944    -.2581717      .27737
                                         26  |  -.2298446    .161151    -1.43   0.154     -.546528    .0868389
                                         27  |  -.0255112   .1811877    -0.14   0.888    -.3815694     .330547
                                         29  |  -.2578997   .1807832    -1.43   0.154    -.6131629    .0973636
                                         30  |  -.5022057   .1286937    -3.90   0.000    -.7551061   -.2493054
                                         33  |  -.4472461   .2998662    -1.49   0.137    -1.036523    .1420312
                                         35  |  -1.009763   .2988907    -3.38   0.001    -1.597123   -.4224026
                                         36  |  -.1828299   .1229328    -1.49   0.138    -.4244094    .0587496
                                         37  |  -.7279486   .1979115    -3.68   0.000    -1.116871   -.3390258
                                         38  |  -.5092014   .1568003    -3.25   0.001    -.8173351   -.2010677
                                         39  |   .1209112    .163291     0.74   0.459    -.1999776    .4418001
                                         40  |     -.4278   .2974751    -1.44   0.151    -1.012379    .1567785
                                         41  |  -.4659988   .1523977    -3.06   0.002    -.7654807   -.1665169
                                         43  |  -.3938707   .2271041    -1.73   0.084    -.8401608    .0524194
                                         44  |   .7924911   .1287705     6.15   0.000     .5394396    1.045543
                                         45  |   .0768195   .1122114     0.68   0.494    -.1436909    .2973299
                                         46  |  -1.036681   .2162199    -4.79   0.000    -1.461582   -.6117799
                                         47  |  -.8106589   .2162199    -3.75   0.000     -1.23556   -.3857577
                                         48  |   -.683635   .1891362    -3.61   0.000    -1.055313    -.311957
                                         49  |  -.4115353   .1620469    -2.54   0.011    -.7299793   -.0930913
                                         50  |   .3465596    .189952     1.82   0.069    -.0267216    .7198408
                                         51  |  -.7558032   .1524402    -4.96   0.000    -1.055369   -.4562377
                                         53  |  -.7471401   .1618003    -4.62   0.000    -1.065099   -.4291807
                                         54  |  -.7229533   .2846661    -2.54   0.011    -1.282361   -.1635461
                                         55  |  -.4989077   .1525549    -3.27   0.001    -.7986985   -.1991169
                                         56  |   .1871993   .1806712     1.04   0.301    -.1678439    .5422425
                                         57  |  -.7015344    .139742    -5.02   0.000    -.9761462   -.4269227
                                         58  |  -.1638478    .176584    -0.93   0.354    -.5108591    .1831635
                                         59  |  -.7608437   .1616794    -4.71   0.000    -1.078565   -.4431219
                                         60  |  -.5382108   .1696545    -3.17   0.002    -.8716047   -.2048168
                                         61  |  -.7623903   .1696949    -4.49   0.000    -1.095864    -.428917
                                         62  |  -.7835276   .1442491    -5.43   0.000    -1.066997   -.5000587
                                         63  |   .5175648   .1128684     4.59   0.000     .2957632    .7393664
                                         64  |  -.4680247   .1359682    -3.44   0.001    -.7352205   -.2008288
                                         65  |   .2934283   .1298615     2.26   0.024      .038233    .5486235
                                         66  |  -.6485326   .1161549    -5.58   0.000    -.8767927   -.4202725
                                         67  |    .024605   .1515661     0.16   0.871    -.2732428    .3224527
                                         68  |  -.0674516   .1812773    -0.37   0.710    -.4236858    .2887826
                                         69  |  -.5045478   .1566088    -3.22   0.001    -.8123051   -.1967905
                                         70  |   .4834114   .2162199     2.24   0.026     .0585101    .9083126
                                         71  |   -.624879   .1440632    -4.34   0.000    -.9079826   -.3417754
                                         72  |  -.4698405   .1229463    -3.82   0.000    -.7114465   -.2282344
                                         73  |   -.309495   .1394909    -2.22   0.027    -.5836134   -.0353766
                                         75  |  -.0388518   .1060218    -0.37   0.714    -.2471988    .1694953
                                         76  |  -.3555371   .2998957    -1.19   0.236    -.9448725    .2337983
                                         77  |  -.7232812   .1806165    -4.00   0.000    -1.078217   -.3683454
                                         78  |   .2397953   .1099522     2.18   0.030     .0237244    .4558662
                                         79  |  -.5189018   .2998662    -1.73   0.084    -1.108179    .0703755
                                         80  |   .0633665   .1887347     0.34   0.737    -.3075225    .4342556
                                         81  |   -.780879   .2090741    -3.73   0.000    -1.191738   -.3700203
                                         82  |   .2654059   .1453692     1.83   0.069    -.0202642    .5510759
                                         83  |  -.4322357   .1551492    -2.79   0.006    -.7371247   -.1273468
                                         84  |  -.0311991   .2998662    -0.10   0.917    -.6204764    .5580783
                                         85  |   -.173676   .1729535    -1.00   0.316    -.5135528    .1662009
                                         86  |   .1510527   .1080122     1.40   0.163    -.0612057    .3633111
                                         87  |   .1865182    .226173     0.82   0.410    -.2579422    .6309785
                                         88  |   -.269514   .2283172    -1.18   0.238    -.7181881      .17916
                                         89  |    .202736    .226173     0.90   0.371    -.2417244    .6471963
                                         91  |   .1610576   .2283172     0.71   0.481    -.2876164    .6097317
                                         92  |  -.1645888   .2850205    -0.58   0.564    -.7246925    .3955148
                                         93  |  -.1890283   .2983347    -0.63   0.527     -.775296    .3972395
                                         94  |  -.6225067   .1229725    -5.06   0.000    -.8641643   -.3808492
                                         95  |  -.8092458    .169853    -4.76   0.000     -1.14303   -.4754618
                                         96  |  -.3884847   .2271702    -1.71   0.088    -.8349047    .0579353
                                         97  |  -.3313162   .1776281    -1.87   0.063    -.6803794    .0177469
                                         98  |  -.0827248   .1242979    -0.67   0.506    -.3269868    .1615373
                                         99  |  -.2651777    .169946    -1.56   0.119    -.5991444    .0687891
                                        100  |   .0715955   .1981293     0.36   0.718    -.3177552    .4609462
                                             |
                            year#c.any_union |
                                         68  |  -.3262181    .143902    -2.27   0.024    -.6090049   -.0434312
                                         69  |  -.2765695    .141243    -1.96   0.051    -.5541309    .0009919
                                         70  |  -.2521625   .1226235    -2.06   0.040    -.4931342   -.0111907
                                         71  |  -.3653427   .1229402    -2.97   0.003    -.6069367   -.1237487
                                         72  |  -.3290641   .1246493    -2.64   0.009    -.5740166   -.0841115
                                         73  |  -.2549638   .1242343    -2.05   0.041    -.4991009   -.0108267
                                         75  |  -.0469036    .124001    -0.38   0.705    -.2905822     .196775
                                         77  |  -.0034648   .1246678    -0.03   0.978    -.2484537    .2415241
                                         78  |  -.0916416   .1355933    -0.68   0.499    -.3581006    .1748173
                                         80  |  -.0130233   .1354706    -0.10   0.923    -.2792413    .2531946
                                         82  |   -.088694   .1278745    -0.69   0.488    -.3399845    .1625966
                                         83  |   .0944071   .1285028     0.73   0.463    -.1581181    .3469324
                                         85  |   .1516057   .1273315     1.19   0.234    -.0986178    .4018292
                                         87  |  -.0011315   .1154813    -0.01   0.992    -.2280677    .2258047
                                             |
                                       _cons |   2.129733   .1141763    18.65   0.000     1.905362    2.354105
                            ----------------------------------------------------------------------------------
                            F test of absorbed indicators: F(14, 460) = 1.537             Prob > F = 0.094
                            
                            .
                            . areg ln_wage i.idcode i.year#c.any_union o68.year#o.any_union , a(year) noomitted
                            
                            Linear regression, absorbing indicators             Number of obs     =    578
                            Absorbed variable: year                             No. of categories =     15
                                                                                F(103, 460)       =   9.63
                                                                                Prob > F          = 0.0000
                                                                                R-squared         = 0.7160
                                                                                Adj R-squared     = 0.6438
                                                                                Root MSE          = 0.2677
                            
                            ----------------------------------------------------------------------------------
                                     ln_wage | Coefficient  Std. err.      t    P>|t|     [95% conf. interval]
                            -----------------+----------------------------------------------------------------
                                      idcode |
                                          2  |  -.3700416    .109583    -3.38   0.001     -.585387   -.1546961
                                          3  |  -.2116748   .1521673    -1.39   0.165     -.510704    .0873544
                                          4  |  -.0406853   .1122069    -0.36   0.717    -.2611869    .1798164
                                          5  |   .0747911   .1571062     0.48   0.634    -.2339438    .3835259
                                          6  |  -.2122527   .1041979    -2.04   0.042    -.4170155   -.0074899
                                          7  |  -.4595702    .164011    -2.80   0.005     -.781874   -.1372665
                                          9  |   .0222342   .1073136     0.21   0.836    -.1886514    .2331197
                                         10  |  -.4524084   .1642162    -2.75   0.006    -.7751154   -.1297014
                                         12  |   .9619874   .2393679     4.02   0.000     .4915973    1.432377
                                         13  |   .1835459   .1121797     1.64   0.102    -.0369022     .403994
                                         14  |   .1342495   .2393679     0.56   0.575    -.3361405    .6046396
                                         15  |   .3399096   .1233621     2.76   0.006     .0974866    .5823327
                                         16  |   .3332801   .1569116     2.12   0.034     .0249277    .6416325
                                         17  |   .3621781   .1766203     2.05   0.041     .0150954    .7092608
                                         18  |  -.3572614   .1954827    -1.83   0.068    -.7414112    .0268884
                                         19  |  -.0525687   .1095709    -0.48   0.632    -.2678902    .1627528
                                         20  |   .0452353   .1073136     0.42   0.674    -.1656503    .2561208
                                         21  |   .0603089    .183961     0.33   0.743    -.3011992    .4218171
                                         22  |   .3392731   .1565301     2.17   0.031     .0316705    .6468757
                                         23  |   .0617753   .1776744     0.35   0.728    -.2873788    .4109294
                                         24  |   .0290883   .1041979     0.28   0.780    -.1756745    .2338511
                                         25  |   .3358172   .1544281     2.17   0.030     .0323453    .6392892
                                         26  |   .0963735   .1720149     0.56   0.576     -.241659    .4344059
                                         27  |   .3007068   .1871037     1.61   0.109    -.0669771    .6683907
                                         29  |   .0683184   .1925569     0.35   0.723    -.3100817    .4467185
                                         30  |  -.5022057   .1286937    -3.90   0.000    -.7551061   -.2493054
                                         33  |  -.1210281   .2932195    -0.41   0.680    -.6972438    .4551877
                                         35  |  -.6835449   .3079363    -2.22   0.027    -1.288681   -.0784087
                                         36  |  -.1828299   .1229328    -1.49   0.138    -.4244094    .0587496
                                         37  |  -.4017305   .2014887    -1.99   0.047    -.7976829   -.0057782
                                         38  |  -.1829833   .1673512    -1.09   0.275    -.5118509    .1458843
                                         39  |   .4471293   .1855695     2.41   0.016     .0824603    .8117983
                                         40  |  -.1015819   .3048287    -0.33   0.739    -.7006113    .4974474
                                         41  |  -.1397807   .1638767    -0.85   0.394    -.4618205    .1822591
                                         43  |  -.0676527   .2402415    -0.28   0.778    -.5397595    .4044542
                                         44  |   .7924911   .1287705     6.15   0.000     .5394396    1.045543
                                         45  |   .0768195   .1122114     0.68   0.494    -.1436909    .2973299
                                         46  |   -.710463   .2393679    -2.97   0.003    -1.180853   -.2400729
                                         47  |  -.4844409   .2393679    -2.02   0.044    -.9548309   -.0140508
                                         48  |  -.3574169   .2114911    -1.69   0.092    -.7730254    .0581915
                                         49  |  -.0853172   .1719967    -0.50   0.620    -.4233139    .2526794
                                         50  |   .6727777   .2116227     3.18   0.002     .2569106    1.088645
                                         51  |  -.4295851   .1715977    -2.50   0.013    -.7667976   -.0923726
                                         53  |   -.420922   .1775024    -2.37   0.018    -.7697381   -.0721059
                                         54  |  -.3967353   .3079977    -1.29   0.198    -1.001992    .2085217
                                         55  |  -.1726896   .1638842    -1.05   0.293     -.494744    .1493648
                                         56  |   .5134173   .1887339     2.72   0.007     .1425299    .8843048
                                         57  |  -.3753163   .1576438    -2.38   0.018    -.6851077    -.065525
                                         58  |  -.1638478    .176584    -0.93   0.354    -.5108591    .1831635
                                         59  |  -.4346256   .1733103    -2.51   0.012    -.7752036   -.0940476
                                         60  |  -.2119927   .1834786    -1.16   0.249    -.5725528    .1485675
                                         61  |  -.4361722   .1837062    -2.37   0.018    -.7971797   -.0751648
                                         62  |  -.4573096   .1615363    -2.83   0.005    -.7747502   -.1398689
                                         63  |   .5175648   .1128684     4.59   0.000     .2957632    .7393664
                                         64  |  -.4680247   .1359682    -3.44   0.001    -.7352205   -.2008288
                                         65  |   .2934283   .1298615     2.26   0.024      .038233    .5486235
                                         66  |  -.6485326   .1161549    -5.58   0.000    -.8767927   -.4202725
                                         67  |    .350823   .1717137     2.04   0.042     .0133826    .6882635
                                         68  |   .2587665   .1873553     1.38   0.168    -.1094118    .6269448
                                         69  |  -.1783297   .1709931    -1.04   0.298    -.5143541    .1576947
                                         70  |   .8096294   .2393679     3.38   0.001     .3392394    1.280019
                                         71  |  -.2986609   .1656061    -1.80   0.072    -.6240991    .0267773
                                         72  |  -.4698405   .1229463    -3.82   0.000    -.7114465   -.2282344
                                         73  |    .016723   .1606808     0.10   0.917    -.2990364    .3324825
                                         75  |  -.0388518   .1060218    -0.37   0.714    -.2471988    .1694953
                                         76  |  -.0293191   .3106111    -0.09   0.925    -.6397117    .5810736
                                         77  |  -.3970632   .1945726    -2.04   0.042    -.7794245   -.0147018
                                         78  |   .2397953   .1099522     2.18   0.030     .0237244    .4558662
                                         79  |  -.1926837   .2932195    -0.66   0.511    -.7688995    .3835321
                                         80  |   .3895846   .2117357     1.84   0.066    -.0265045    .8056737
                                         81  |   -.780879   .2090741    -3.73   0.000    -1.191738   -.3700203
                                         82  |   .2654059   .1453692     1.83   0.069    -.0202642    .5510759
                                         83  |  -.1060177   .1724445    -0.61   0.539    -.4448943    .2328589
                                         84  |    .295019   .2932195     1.01   0.315    -.2811967    .8712348
                                         85  |   .1525421   .1949125     0.78   0.434    -.2304871    .5355713
                                         86  |   .1510527   .1080122     1.40   0.163    -.0612057    .3633111
                                         87  |   .5127362   .2360251     2.17   0.030     .0489151    .9765573
                                         88  |    .056704   .2278876     0.25   0.804    -.3911257    .5045338
                                         89  |    .528954   .2360251     2.24   0.025      .065133    .9927751
                                         91  |   .4872757   .2278876     2.14   0.033     .0394459    .9351055
                                         92  |  -.1645888   .2850205    -0.58   0.564    -.7246925    .3955148
                                         93  |   .1371898   .3071951     0.45   0.655    -.4664898    .7408694
                                         94  |  -.6225067   .1229725    -5.06   0.000    -.8641643   -.3808492
                                         95  |  -.4830277     .17858    -2.70   0.007    -.8339615    -.132094
                                         96  |  -.0622666   .2363236    -0.26   0.792    -.5266742    .4021409
                                         97  |  -.3313162   .1776281    -1.87   0.063    -.6803794    .0177469
                                         98  |  -.0827248   .1242979    -0.67   0.506    -.3269868    .1615373
                                         99  |   .0610404   .1830848     0.33   0.739    -.2987458    .4208266
                                        100  |   .3978135   .2014967     1.97   0.049     .0018455    .7937816
                                             |
                            year#c.any_union |
                                         69  |   .0496486   .1487627     0.33   0.739    -.2426902    .3419873
                                         70  |   .0740556   .1368997     0.54   0.589    -.1949707    .3430819
                                         71  |  -.0391246   .1367974    -0.29   0.775    -.3079498    .2297006
                                         72  |   -.002846   .1406131    -0.02   0.984    -.2791696    .2734776
                                         73  |   .0712543   .1403962     0.51   0.612    -.2046432    .3471518
                                         75  |   .2793145   .1391808     2.01   0.045     .0058056    .5528233
                                         77  |   .3227533   .1423174     2.27   0.024     .0430804    .6024261
                                         78  |   .2345764   .1521309     1.54   0.124    -.0643813    .5335342
                                         80  |   .3131947   .1541035     2.03   0.043     .0103605    .6160289
                                         82  |   .2375241   .1465622     1.62   0.106    -.0504903    .5255385
                                         83  |   .4206252   .1480166     2.84   0.005     .1297526    .7114978
                                         85  |   .4778237   .1486281     3.21   0.001     .1857496    .7698978
                                         87  |   .3250866   .1423488     2.28   0.023     .0453522     .604821
                                         88  |   .3262181    .143902     2.27   0.024     .0434312    .6090049
                                             |
                                       _cons |   1.803515   .1358473    13.28   0.000     1.536557    2.070473
                            ----------------------------------------------------------------------------------
                            F test of absorbed indicators: F(14, 460) = 1.537             Prob > F = 0.094
                            Last edited by Andrew Musau; 09 Apr 2025, 17:50.



                            • #15
                              Yes Andrew Musau, I did try that shortly after. I have to exclude many categories because most of my covariates are discrete, and I couldn't find a compact way of doing it, e.g., using parentheses. For the model-id variable stack, continuous age (and the treatment variable, omitted here), and the categorical covariates sex, education, ghs, and METclass, the code that works for me is the longish

                              Code:
                              i.stack#c.age##c.age i.stack#i.sex o1.stack#o0.sex o2.stack#o0.sex o3.stack#o0.sex o4.stack#o0.sex o5.stack#o0.sex o6.stack#o0.sex i.stack#i.education o1.stack#o0.education o2.stack#o0.education o3.stack#o0.education o4.stack#o0.education o5.stack#o0.education o6.stack#o0.education i.stack#i.ghs o1.stack#o1.ghs o2.stack#o1.ghs o3.stack#o1.ghs o4.stack#o1.ghs o5.stack#o1.ghs o6.stack#o1.ghs i.stack#i.METclass o1.stack#o1.METclass o2.stack#o1.METclass o3.stack#o1.METclass o4.stack#o1.METclass o5.stack#o1.METclass o6.stack#o1.METclass
                              I tried to find something better, but fvexpand always showed me that it didn't work. Any suggestion on how to condense this further?
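
                              Perhaps the omission terms could instead be built in a loop (a sketch using the same names and base levels as above; I have not verified it):

                              Code:
                              * assemble the omission terms for all six stacked models
                              * (base levels: 0 for sex and education, 1 for ghs and METclass)
                              local omit
                              forvalues s = 1/6 {
                                  local omit `omit' o`s'.stack#o0.sex o`s'.stack#o0.education o`s'.stack#o1.ghs o`s'.stack#o1.METclass
                              }
                              * then append `omit' to the stacked varlist:
                              * i.stack#c.age##c.age i.stack#(i.sex i.education i.ghs i.METclass) `omit'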

                              Moreover, alas, now that I enforce these base levels, the model runs into sparsity problems ("warning: variance matrix is nonsymmetric or highly singular."). Given that I'm actually interested in testing hypotheses concerning the treatment variable only, which is continuous and correctly estimated irrespective of the base levels of the covariates, I'm tempted to give up on this problem.
                              Last edited by Matteo Pinna Pintor; 10 Apr 2025, 03:02.
                              I'm using StataNow/MP 18.5

