  • Interpret results

    Hi, I am quite new to regression and still learning. Initially I used five independent variables, and after I showed the results to my supervisor, he asked me to add more variables, but only as a sensitivity check so that I would not need to redo the whole analysis.

    I am wondering whether it makes sense to add my three additional variables as a sensitivity check, since the papers I read normally change only one variable or so when doing a sensitivity check.

    If yes, how do I interpret the outcome of the new addition? Does it mean my model is robust? Thank you.



    Code:
     . reg mtd prof size tang growth liq dc1 dc2 dc3 dc4 i.industry i.year, vce(robust)
    
    Linear regression                               Number of obs     =      4,820
                                                    F(27, 4792)       =      80.96
                                                    Prob > F          =     0.0000
                                                    R-squared         =     0.3569
                                                    Root MSE          =     .18635
    
    ------------------------------------------------------------------------------
                 |               Robust
             mtd |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
            prof |  -.3832766   .0686802    -5.58   0.000    -.5179214   -.2486318
            size |   .0361722    .002791    12.96   0.000     .0307006    .0416438
            tang |   .1248675   .0183476     6.81   0.000     .0888977    .1608372
          growth |  -.0083867    .002782    -3.01   0.003    -.0138407   -.0029326
             liq |  -.0102962   .0036118    -2.85   0.004     -.017377   -.0032155
             dc1 |   .0615475   .0120496     5.11   0.000     .0379248    .0851702
             dc2 |   .0420181   .0074354     5.65   0.000     .0274413    .0565949
             dc3 |   .0213742    .008977     2.38   0.017     .0037751    .0389732
             dc4 |   .0781161   .0124258     6.29   0.000     .0537559    .1024763
                 |
        industry |
           9991  |  -.0240652   .0212285    -1.13   0.257    -.0656827    .0175523
           9992  |    .044358   .0242483     1.83   0.067    -.0031799    .0918958
           9993  |   -.189325   .0237744    -7.96   0.000    -.2359337   -.1427163
           9994  |   .0256848   .0208052     1.23   0.217     -.015103    .0664726
           9995  |  -.1267229   .0254384    -4.98   0.000    -.1765938    -.076852
           9996  |    .069327   .0208692     3.32   0.001     .0284139    .1102402
           9997  |  -.0659017   .0222793    -2.96   0.003    -.1095794    -.022224
           9998  |  -.0668792   .0243165    -2.75   0.006    -.1145507   -.0192076
           9999  |  -.0184298   .0227901    -0.81   0.419    -.0631089    .0262493
                 |
            year |
           2009  |  -.0585273   .0127497    -4.59   0.000    -.0835225   -.0335321
           2010  |  -.0834637   .0124864    -6.68   0.000    -.1079427   -.0589847
           2011  |  -.0741264   .0125177    -5.92   0.000    -.0986667    -.049586
           2012  |  -.0744085    .012312    -6.04   0.000    -.0985456   -.0502714
           2013  |  -.1149802   .0122686    -9.37   0.000    -.1390323   -.0909281
           2014  |  -.1233646   .0124957    -9.87   0.000    -.1478619   -.0988673
           2015  |  -.1275209   .0127533   -10.00   0.000    -.1525233   -.1025186
           2016  |  -.1244362   .0129507    -9.61   0.000    -.1498255   -.0990469
           2017  |   -.132976    .012769   -10.41   0.000    -.1580091   -.1079429
                 |
           _cons |   .0279881   .0510031     0.55   0.583    -.0720014    .1279777
    ------------------------------------------------------------------------------
    Code:
     . reg mtd prof size tang growth liq dividendpayout ntds intcoverage dc1 dc2 dc3 dc4 i.industry i.year, vce(robust)
    
    Linear regression                               Number of obs     =      4,820
                                                    F(30, 4789)       =      75.58
                                                    Prob > F          =     0.0000
                                                    R-squared         =     0.3601
                                                    Root MSE          =     .18593
    
    --------------------------------------------------------------------------------
                   |               Robust
               mtd |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
    ---------------+----------------------------------------------------------------
              prof |  -.3903113   .0725805    -5.38   0.000    -.5326024   -.2480202
              size |   .0354715   .0028602    12.40   0.000     .0298643    .0410787
              tang |   .1397064   .0171498     8.15   0.000     .1060849     .173328
            growth |  -.0078057   .0028427    -2.75   0.006    -.0133787   -.0022327
               liq |  -.0104448   .0036926    -2.83   0.005     -.017684   -.0032056
    dividendpayout |  -.0018741   .0010861    -1.73   0.085    -.0040034    .0002552
              ntds |  -.3816906   .1609167    -2.37   0.018    -.6971612     -.06622
       intcoverage |  -1.42e-06   1.22e-06    -1.16   0.246    -3.81e-06    9.78e-07
               dc1 |   .0564218   .0129878     4.34   0.000     .0309596    .0818839
               dc2 |   .0397018   .0076943     5.16   0.000     .0246175    .0547861
               dc3 |    .019007   .0089529     2.12   0.034     .0014553    .0365588
               dc4 |    .074056    .012708     5.83   0.000     .0491424    .0989696
                   |
          industry |
             9991  |  -.0253018   .0210576    -1.20   0.230    -.0665843    .0159807
             9992  |   .0452068   .0240588     1.88   0.060    -.0019594     .092373
             9993  |  -.1899307   .0236659    -8.03   0.000    -.2363267   -.1435348
             9994  |   .0250902   .0206563     1.21   0.225    -.0154057    .0655861
             9995  |  -.1287157   .0254462    -5.06   0.000    -.1786019   -.0788295
             9996  |   .0628662    .020813     3.02   0.003     .0220633    .1036692
             9997  |  -.0576092    .022123    -2.60   0.009    -.1009805   -.0142379
             9998  |  -.0559507    .024707    -2.26   0.024    -.1043879   -.0075136
             9999  |  -.0166227   .0226275    -0.73   0.463    -.0609829    .0277375
                   |
              year |
             2009  |  -.0591305   .0127057    -4.65   0.000    -.0840396   -.0342214
             2010  |  -.0841003   .0124556    -6.75   0.000     -.108519   -.0596816
             2011  |  -.0743277    .012466    -5.96   0.000    -.0987668   -.0498887
             2012  |  -.0741434    .012242    -6.06   0.000    -.0981434   -.0501435
             2013  |  -.1163413   .0121812    -9.55   0.000    -.1402221   -.0924605
             2014  |  -.1243697   .0123403   -10.08   0.000    -.1485625    -.100177
             2015  |  -.1280807   .0126523   -10.12   0.000     -.152885   -.1032764
             2016  |  -.1253472   .0128311    -9.77   0.000    -.1505021   -.1001923
             2017  |  -.1342228   .0126918   -10.58   0.000    -.1591047    -.109341
                   |
             _cons |   .0459086   .0528146     0.87   0.385    -.0576323    .1494496

  • #2
    Larissa:
    your results are basically similar.
    The main issue is that you have seemingly run a fixed-effects panel regression via -regress-, which rarely outperforms -xtreg, fe- when it comes to panel data regression.
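    Something along these lines could be a starting point (just a sketch: -firm_id- stands for your panel identifier, which is not shown in your post, so replace it with your own firm variable; note that the firm fixed effects absorb the time-invariant industry dummies):
    Code:
     * declare the panel structure (firm_id is a placeholder for your firm identifier)
     xtset firm_id year
     
     * within (fixed-effects) estimator with cluster-robust standard errors;
     * i.industry is omitted because it is absorbed by the firm fixed effects
     xtreg mtd prof size tang growth liq dc1 dc2 dc3 dc4 i.year, fe vce(cluster firm_id)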
    Kind regards,
    Carlo
    (StataNow 18.5)



    • #3
      Originally posted by Carlo Lazzaro View Post
      Larissa:
      your results are basically similar.
      The main issue is that you have seemingly run a fixed-effects panel regression via -regress-, which rarely outperforms -xtreg, fe- when it comes to panel data regression.
      Hi Carlo, noted on -xtreg, fe-; I will rerun the regression. Meanwhile, if the results are similar, what can I say about my model? Can I say that it is robust?



      • #4
        Larissa:
        if you mean that the results basically do not change when you add other predictors, your original model seems robust.
        One of the usual ways to compare two (or more) OLS models is to look at their adjusted R-squared (which gives an idea of the efficiency of the model).
        The adjusted R-squared is not reported by default when you run an OLS regression with non-default standard errors, but it can be retrieved via -ereturn list- after -regress-, as you can see from the following toy example:
        Code:
        use "http://www.stata-press.com/data/r15/union.dta"
        . reg age black, cluster( idcode)
        
        Linear regression                               Number of obs     =     26,200
                                                        F(1, 4433)        =       3.36
                                                        Prob > F          =     0.0670
                                                        R-squared         =     0.0002
                                                        Root MSE          =     6.4884
        
                                     (Std. Err. adjusted for 4,434 clusters in idcode)
        ------------------------------------------------------------------------------
                     |               Robust
                 age |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
        -------------+----------------------------------------------------------------
               black |   -.227264   .1240422    -1.83   0.067    -.4704486    .0159206
               _cons |   30.49461   .0686782   444.02   0.000     30.35996    30.62925
        ------------------------------------------------------------------------------
        
        . ereturn list
        
        scalars:
                    e(N_clust) =  4434
                          e(N) =  26200
                       e(df_m) =  1
                       e(df_r) =  4433
                          e(F) =  3.356776197760231
                         e(r2) =  .000244306956287
                       e(rmse) =  6.488387438062084
                        e(mss) =  269.5154303763993
                        e(rss) =  1102914.096172636
                       e(r2_a) =  .000206145428955
                         e(ll) =  -86169.55714742451
                       e(ll_0) =  -86172.75795955812
                       e(rank) =  2
        
              *<snip>*
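        Applied to your two specifications in #1, a minimal sketch of such a comparison could look like this (your own -regress- command lines; the scalar names are just illustrative):
        Code:
        * baseline specification: keep its adjusted R-squared
        quietly reg mtd prof size tang growth liq dc1 dc2 dc3 dc4 i.industry i.year, vce(robust)
        scalar r2a_base = e(r2_a)
        
        * extended specification with the three additional predictors
        quietly reg mtd prof size tang growth liq dividendpayout ntds intcoverage dc1 dc2 dc3 dc4 i.industry i.year, vce(robust)
        scalar r2a_ext = e(r2_a)
        
        display "Adj. R-squared, baseline model: " r2a_base
        display "Adj. R-squared, extended model: " r2a_ext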
        That said, as robustness can have different meanings, I think that in your research report you should be explicit about that.
        As an aside, I would take the -xtreg,fe- issue up with your supervisor.
        Kind regards,
        Carlo
        (StataNow 18.5)



        • #5
          Originally posted by Carlo Lazzaro View Post
          Larissa:
          if you mean that the results basically do not change when you add other predictors, your original model seems robust.
          One of the usual ways to compare two (or more) OLS models is to look at their adjusted R-squared (which gives an idea of the efficiency of the model).
          The adjusted R-squared is not reported by default when you run an OLS regression with non-default standard errors, but it can be retrieved via -ereturn list- after -regress-, as you can see from the following toy example:
          *<snip>*
          That said, as robustness can have different meanings, I think that in your research report you should be explicit about that.
          As an aside, I would take the -xtreg,fe- issue up with your supervisor.
          You explained the interpretation clearly. I will ask him. Thank you so much, Mr. Carlo.



          • #6
            Larissa:
            "Carlo" is more than enough!
            Kind regards,
            Carlo
            (StataNow 18.5)
