  • Twofold Oaxaca-Blinder Decomposition Interpretation

    Hello, everyone. Could you help me with interpreting the results of an Oaxaca-Blinder twofold decomposition?

    I have been reading several threads here and the related literature, but I am still not sure whether my interpretation is correct (especially when I use the option weight(0)).

    Code:
    oaxaca ln_ahw male age_in_syear age_in_syear_square education_dummy exp occupation_coded2-occupation_coded3 region1, by(migrant) weight(0) categorical(occupation_coded?, region?)
    oaxaca, eform
    My result:
    Group 1: migrant = 0
    Group 2: migrant = 1
          ln_ahw |    exp(b)   Std. err.      z    P>|z|     [95% conf. interval]
    -------------+----------------------------------------------------------------
         overall |
         group_1 |  15.97559   .0975334   453.89   0.000     15.78556      16.1679
         group_2 |  12.50604   .1236698   255.46   0.000     12.26599     12.75079
      difference |   1.27743   .0148458    21.07   0.000     1.248661     1.306861
       explained |  1.159331   .0166044    10.32   0.000     1.127239     1.192336
     unexplained |  1.101868     .01621     6.59   0.000     1.070551     1.134102
    -------------+----------------------------------------------------------------

    I would like to interpret it as follows:
    The endowment (explained) effect says that if migrants had the same characteristics as natives, they would be expected to earn about 15.93% higher wages on average (calculated as (1.159331 - 1) * 100%).
    Applying the wage coefficients of migrants to the characteristics of natives suggests an expected average wage decrease of about 10.19% for natives (the unexplained part).
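
    For reference, the twofold decomposition that oaxaca reports with weight(0), i.e. with group 2's coefficients as the reference, is, in log wages,

    $$
    \overline{\ln W}_1 - \overline{\ln W}_2
    = \underbrace{(\bar{X}_1 - \bar{X}_2)'\beta_2}_{\text{explained}}
    + \underbrace{\bar{X}_1'(\beta_1 - \beta_2)}_{\text{unexplained}},
    $$

    and with eform each term is exponentiated, so the components multiply rather than add: 1.159331 × 1.101868 ≈ 1.27743, the reported difference.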

    Could you please tell me whether I have understood it correctly?

    Thanks a lot in advance!

  • #2
    Something funky is going on. The difference between 15.97559 and 12.50604 is not 1.27743, nor is 1.27743 the sum 1.159331 + 1.101868. We should have group_1 - group_2 = explained + unexplained = difference, which does not hold here.

    and exp(15.97559) = 8,671,827. These cats are making the big bucks.
    Last edited by George Ford; 15 Jan 2024, 18:21.



    • #3
      Originally posted by George Ford View Post
      Something funky is going on. The difference between 15.97559 and 12.50604 is not 1.27743, nor is 1.27743 the sum 1.159331 + 1.101868. We should have group_1 - group_2 = explained + unexplained = difference, which does not hold here.

      and exp(15.97559) = 8,671,827. These cats are making the big bucks.
      Thank you so much for your reply! I think it is because I used oaxaca, eform to transform the results into exponential form, so 1.27743 is the ratio 15.97559/12.50604 (equivalently, (15.97559 - 12.50604)/12.50604 + 1).
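
      A quick numeric check of that, using only the figures reported above:

      Code:
      di (15.97559 - 12.50604)/12.50604 + 1   // 1.27743
      di 15.97559/12.50604                    // same ratio: eform turns the log difference into a ratio
      di exp(ln(15.97559) - ln(12.50604))     // exponentiated log difference equals that ratio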

      Best regards



      • #4
        Ahh, I see now.



        • #5
          Here are the calculations by hand.

          I think your interpretation is correct, as the exponentiated log difference is the ratio of the values, so subtracting 1 gives you the percentage.

          I'd note, however, that you would normally adjust the exponentiated predictions by exp(RMSE^2/2), which oaxaca is not doing (a sketch of that adjustment follows the code below).

          Code:
          clear all
          sysuse auto, clear
          
          * levels model; weight(0) makes group 2's coefficients the reference
          oaxaca mpg weight displacement, by(foreign) weight(0) noisily
          qui reg mpg weight displacement if foreign        // group 2 (foreign) coefficients
          matrix P = e(b)
          qui reg mpg weight displacement if ~foreign       // group 1 (domestic) coefficients
          matrix Z = e(b)
          
          * covariate means: Ax = group 1 (foreign==0), Bx = group 2 (foreign==1)
          tabstat weight displacement foreign, by(foreign) save
          matrix Ax = r(Stat1)
          matrix Bx = r(Stat2)
          
          di "group_1" _col(20) Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3]
          di "group_2" _col(20) P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3]
          di "difference" _col(20) (Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3]) - (P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3])
          di "explained" _col(20) P[1,1]*(Ax[1,1]-Bx[1,1]) + P[1,2]*(Ax[1,2]-Bx[1,2])
          di "unexplained" _col(20) (Z[1,1] - P[1,1])*Ax[1,1]+ (Z[1,2] - P[1,2])*Ax[1,2] + (Z[1,3] - P[1,3])
          
          foreach var in mpg weight displacement {
              g l`var' = ln(`var')
          }
          
          * same decomposition in logs
          eststo e1: oaxaca lmpg lweight ldisplacement, by(foreign) weight(0) noisily
          qui reg lmpg lweight ldisplacement if foreign
          matrix P = e(b)
          qui reg lmpg lweight ldisplacement if ~foreign    // re-estimate Z for the log model
          matrix Z = e(b)
          
          qui tabstat lweight ldisplacement foreign, by(foreign) save
          matrix Ax = r(Stat1)
          matrix Bx = r(Stat2)
          
          di "group_1" _col(20) Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3]
          di "group_2" _col(20) P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3]
          di "difference" _col(20) (Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3]) - (P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3])
          di "explained" _col(20) P[1,1]*(Ax[1,1]-Bx[1,1]) + P[1,2]*(Ax[1,2]-Bx[1,2])
          di "unexplained" _col(20) (Z[1,1] - P[1,1])*Ax[1,1]+ (Z[1,2] - P[1,2])*Ax[1,2] + (Z[1,3] - P[1,3])
          
          
          estimates restore e1
          oaxaca, eform
          
          di "group_1" _col(20) exp(Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3])
          di "group_2" _col(20) exp(P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3])
          di "difference" _col(20) exp((Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3]) - (P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3]))
          di "difference(alt)" _col(20) exp(Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3])/exp(P[1,1]*Bx[1,1] + P[1,2]*Bx[1,2] + P[1,3])
          di "explained" _col(20) exp(P[1,1]*(Ax[1,1]-Bx[1,1]) + P[1,2]*(Ax[1,2]-Bx[1,2]))
          di "explained(alt)" _col(20) exp(P[1,1]*(Ax[1,1]) + P[1,2]*(Ax[1,2]))/exp(P[1,1]*(Bx[1,1]) + P[1,2]*(Bx[1,2]))
          di "unexplained" _col(20) exp((Z[1,1] - P[1,1])*Ax[1,1] + (Z[1,2] - P[1,2])*Ax[1,2] + (Z[1,3] - P[1,3]))
          di "unexplained(alt)" _col(20) exp(Z[1,1]*Ax[1,1] + Z[1,2]*Ax[1,2] + Z[1,3])/exp(P[1,1]*Ax[1,1] + P[1,2]*Ax[1,2] + P[1,3])



          • #6
            Respected Professor George Ford, is there a way to get a detailed result for the fairlie decomposition?

            With fairlie, the option detail is not allowed.
            Best regards,
            Mukesh

            (Stata 15.1 SE)



            • #7
              This gets you the main results. I'm not sure what he's doing to get the coefficient effects.

              Code:
              clear all
              
              use http://fmwww.bc.edu/RePEc/bocode/h/homecomp.dta
              
              keep if white | black
              
              * predictor means by group
              tabstat female age , by(white) save
              matrix M = r(Stat1) \ r(Stat2)
              matrix list M
              
              fairlie homecomp female age , by(black) 
              
              * total gap: fully interacted logit, then the difference in average predictions
              eststo e1: logit homecomp i.black#(c.female c.age) black
              margins, asobserved over(black) post
              lincom 0.black - 1.black
              di "Difference = " %10.0g r(estimate) " Z-stat = " %5.2f r(z) " Prob = " %5.3f r(p)
              
              * explained part: predict for everyone with the white-group coefficients;
              * the coefficient on black is then the mean difference in those predictions
              eststo e2: logit homecomp female age  if white
              capture drop yfit1
              predict yfit1, pr
              reg yfit1 black
              local exp = r(table)[1,1]
              local zexp = r(table)[3,1]
              local zprob = r(table)[4,1]
              di "Total Explained = "%10.0g `exp' " t-stat = " %5.2f `zexp' " Prob = " %5.3f `zprob'
              Last edited by George Ford; 05 Sep 2024, 19:37.



              • #8
                Thank you George Ford for your response. In my case, the last four lines are not working: when I follow your code local exp = r(table)[1,1], it shows "invalid syntax".


                I am sharing real data here, where the outcome variable is

                Code:
                cl_cookfuel

                the group variable is

                Code:
                residence_d1

                and the predictors are

                Code:
                caste3_d1 caste3_d2 caste3_d3 q_mpce_d1 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5 sex_hh_d1 sex_hh_d2




                Code:
                * Example generated by -dataex-. To install: ssc install dataex
                clear
                input float cl_cookfuel byte(residence_d1 residence_d2 caste3_d1 caste3_d2 caste3_d3 q_mpce_d1 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5) float age_hh
                1 1 0 0 0 1 0 0 0 0 1 51
                1 1 0 0 1 0 1 0 0 0 0 52
                1 1 0 0 1 0 0 0 0 0 1 35
                1 1 0 1 0 0 1 0 0 0 0 38
                1 1 0 0 1 0 0 0 0 0 1 56
                1 1 0 0 1 0 0 0 0 0 1 55
                1 1 0 0 0 1 0 0 0 0 1 65
                0 1 0 0 1 0 0 1 0 0 0 58
                1 1 0 0 0 1 0 0 0 0 1 21
                1 1 0 0 1 0 0 0 0 1 0 62
                1 1 0 0 0 1 0 0 0 1 0 38
                1 1 0 1 0 0 0 0 0 0 1 46
                1 1 0 1 0 0 0 0 0 1 0 72
                1 1 0 0 1 0 0 0 0 0 1 32
                1 1 0 0 0 1 0 0 0 0 1 65
                1 1 0 0 1 0 0 0 0 1 0 70
                1 1 0 0 0 1 0 0 1 0 0 50
                1 1 0 0 1 0 0 0 0 0 1 55
                1 1 0 0 1 0 0 0 0 1 0 52
                1 1 0 1 0 0 0 0 0 0 1 37
                1 1 0 0 1 0 0 0 0 0 1 51
                1 1 0 1 0 0 0 0 0 1 0 80
                1 1 0 0 1 0 0 0 1 0 0 80
                1 1 0 0 1 0 0 0 1 0 0 38
                0 1 0 0 1 0 0 0 0 1 0 50
                1 1 0 0 1 0 0 0 0 1 0 73
                1 1 0 0 0 1 0 0 0 0 1 41
                1 1 0 1 0 0 0 1 0 0 0 30
                1 1 0 0 0 1 0 0 0 0 1 40
                1 1 0 1 0 0 0 0 0 1 0 29
                1 1 0 0 0 1 0 0 0 0 1 52
                1 1 0 0 1 0 0 0 0 0 1 52
                0 1 0 0 1 0 0 0 0 0 1 58
                1 1 0 0 0 1 0 0 0 1 0 59
                1 1 0 0 0 1 0 0 0 0 1 50
                1 1 0 0 0 1 0 0 0 1 0 50
                1 1 0 0 0 1 0 0 0 0 1 19
                1 1 0 1 0 0 0 0 0 1 0 39
                1 1 0 0 1 0 0 0 1 0 0 68
                1 1 0 1 0 0 0 0 0 0 1 63
                1 1 0 0 1 0 0 0 0 0 1 39
                0 1 0 0 1 0 1 0 0 0 0 40
                1 1 0 0 1 0 0 0 0 1 0 68
                1 1 0 0 0 1 0 0 0 1 0 36
                1 1 0 0 1 0 0 0 0 1 0 56
                0 1 0 0 0 1 0 0 0 0 1 47
                1 1 0 0 0 1 0 0 0 0 1 40
                1 1 0 1 0 0 0 0 0 1 0 38
                1 1 0 0 1 0 0 0 0 0 1 35
                1 1 0 0 0 1 0 0 0 0 1 40
                1 1 0 0 0 1 0 0 0 0 1 76
                1 1 0 0 1 0 0 0 0 0 1 55
                1 1 0 0 1 0 0 0 0 0 1 39
                1 1 0 0 0 1 0 0 0 0 1 70
                1 1 0 1 0 0 0 0 0 1 0 22
                1 1 0 0 0 1 0 0 1 0 0 70
                1 1 0 0 1 0 0 1 0 0 0 55
                1 1 0 0 1 0 0 0 0 1 0 62
                0 1 0 1 0 0 1 0 0 0 0 60
                1 1 0 0 0 1 0 0 0 0 1 38
                1 1 0 1 0 0 0 0 0 0 1 50
                1 1 0 0 1 0 0 0 0 0 1 44
                0 1 0 1 0 0 0 1 0 0 0 61
                1 1 0 0 0 1 0 0 0 0 1 68
                1 1 0 0 0 1 1 0 0 0 0 40
                1 1 0 1 0 0 0 0 1 0 0 35
                1 1 0 0 1 0 0 0 1 0 0 66
                1 1 0 0 1 0 0 1 0 0 0 58
                1 1 0 0 0 1 0 0 0 0 1 52
                1 1 0 1 0 0 0 0 0 0 1 40
                1 1 0 0 1 0 0 1 0 0 0 69
                1 1 0 1 0 0 0 0 0 1 0 47
                1 1 0 0 1 0 0 0 0 1 0 40
                1 1 0 0 1 0 0 0 0 0 1 51
                1 1 0 1 0 0 0 0 0 1 0 42
                1 1 0 0 1 0 0 0 0 1 0 39
                1 1 0 1 0 0 0 0 0 1 0 72
                1 1 0 0 0 1 0 0 0 1 0 79
                1 1 0 0 1 0 0 0 0 0 1 48
                1 1 0 1 0 0 0 1 0 0 0 52
                1 1 0 0 1 0 0 0 0 1 0 55
                1 1 0 0 1 0 0 0 0 1 0 42
                1 1 0 0 0 1 0 0 0 0 1 54
                1 1 0 1 0 0 0 0 0 0 1 43
                1 1 0 0 1 0 0 0 0 1 0 56
                1 1 0 0 0 1 0 0 0 0 1 65
                1 1 0 0 0 1 0 0 0 1 0 35
                1 1 0 0 1 0 0 1 0 0 0 65
                1 1 0 0 1 0 0 0 0 0 1 72
                1 1 0 1 0 0 0 0 0 1 0 37
                1 1 0 0 0 1 0 0 0 0 1 70
                1 1 0 0 0 1 0 0 0 0 1 71
                1 1 0 0 1 0 0 0 0 0 1 38
                1 1 0 0 1 0 0 0 0 0 1 38
                1 1 0 0 1 0 0 0 0 1 0 42
                1 1 0 1 0 0 0 0 0 0 1 65
                0 1 0 0 1 0 1 0 0 0 0 48
                1 1 0 0 0 1 0 0 0 0 1 56
                0 1 0 0 0 1 0 0 1 0 0 35
                1 1 0 0 1 0 0 0 0 0 1 33
                1 1 0 0 1 0 0 1 0 0 0 46
                1 1 0 1 0 0 0 0 0 0 1 42
                1 1 0 0 1 0 0 0 0 0 1 36
                1 1 0 0 0 1 0 0 0 0 1 40
                1 1 0 0 1 0 0 0 0 1 0 74
                1 1 0 0 1 0 0 1 0 0 0 35
                1 1 0 0 1 0 0 0 0 1 0 56
                1 1 0 1 0 0 0 0 0 1 0 68
                1 1 0 1 0 0 0 0 0 1 0 58
                1 1 0 0 1 0 0 0 0 1 0 38
                1 1 0 0 0 1 0 0 0 1 0 50
                1 1 0 0 0 1 0 0 0 0 1 41
                0 0 1 0 1 0 0 0 1 0 0 28
                1 0 1 0 0 1 0 0 1 0 0 30
                0 0 1 0 1 0 0 1 0 0 0 44
                1 0 1 0 1 0 0 0 0 0 1 30
                0 0 1 0 0 1 0 0 1 0 0 80
                0 0 1 1 0 0 0 1 0 0 0 51
                0 0 1 0 1 0 0 1 0 0 0 44
                0 0 1 0 1 0 1 0 0 0 0 45
                0 0 1 1 0 0 0 1 0 0 0 60
                1 0 1 1 0 0 0 0 1 0 0 55
                0 0 1 1 0 0 0 0 0 1 0 35
                1 0 1 1 0 0 1 0 0 0 0 45
                0 0 1 1 0 0 1 0 0 0 0 45
                0 0 1 0 1 0 1 0 0 0 0 41
                1 0 1 1 0 0 0 0 0 0 1 62
                0 0 1 0 1 0 0 0 0 0 1 45
                1 0 1 0 0 1 0 0 1 0 0 65
                0 0 1 1 0 0 1 0 0 0 0 48
                1 0 1 0 0 1 0 0 0 1 0 36
                0 0 1 0 1 0 0 0 1 0 0 30
                1 0 1 0 1 0 0 0 0 0 1 62
                0 0 1 1 0 0 1 0 0 0 0 45
                1 0 1 0 1 0 0 0 1 0 0 40
                1 0 1 0 1 0 0 0 1 0 0 52
                1 0 1 0 0 1 1 0 0 0 0 48
                1 0 1 1 0 0 0 0 0 0 1 62
                0 0 1 0 1 0 0 0 1 0 0 62
                1 0 1 0 1 0 0 1 0 0 0 65
                0 0 1 1 0 0 0 0 1 0 0 34
                0 0 1 0 1 0 1 0 0 0 0 43
                0 0 1 1 0 0 0 1 0 0 0 65
                0 0 1 1 0 0 1 0 0 0 0 32
                1 0 1 1 0 0 1 0 0 0 0 55
                0 0 1 1 0 0 0 0 1 0 0 26
                1 0 1 1 0 0 1 0 0 0 0 75
                1 0 1 0 0 1 0 0 0 0 1 47
                0 0 1 0 0 1 1 0 0 0 0 52
                1 0 1 1 0 0 0 0 1 0 0 62
                0 0 1 1 0 0 1 0 0 0 0 48
                1 0 1 1 0 0 0 0 1 0 0 30
                1 0 1 1 0 0 1 0 0 0 0 35
                1 0 1 0 1 0 0 1 0 0 0 34
                0 0 1 0 1 0 0 1 0 0 0 60
                1 0 1 1 0 0 0 1 0 0 0 40
                1 0 1 0 1 0 0 1 0 0 0 55
                0 0 1 0 0 1 0 1 0 0 0 53
                1 0 1 1 0 0 0 0 1 0 0 60
                1 0 1 0 1 0 0 0 0 1 0 50
                0 0 1 1 0 0 0 1 0 0 0 27
                1 0 1 1 0 0 0 0 1 0 0 85
                0 0 1 0 1 0 1 0 0 0 0 28
                0 0 1 1 0 0 0 1 0 0 0 70
                1 0 1 0 0 1 0 0 0 1 0 49
                1 0 1 0 0 1 1 0 0 0 0 46
                0 0 1 0 1 0 0 0 0 1 0 73
                0 0 1 0 1 0 0 1 0 0 0 43
                0 0 1 0 1 0 0 0 1 0 0 47
                1 0 1 0 0 1 0 0 0 0 1 78
                1 0 1 0 1 0 0 0 1 0 0 58
                1 0 1 0 1 0 0 1 0 0 0 65
                1 0 1 0 1 0 0 1 0 0 0 80
                0 0 1 0 1 0 0 0 0 1 0 65
                0 0 1 0 1 0 0 0 1 0 0 50
                0 0 1 1 0 0 0 0 0 0 1 25
                1 0 1 1 0 0 1 0 0 0 0 34
                1 0 1 0 1 0 0 0 1 0 0 70
                0 0 1 0 0 1 0 0 1 0 0 45
                0 0 1 1 0 0 1 0 0 0 0 55
                1 0 1 1 0 0 1 0 0 0 0 76
                1 0 1 0 1 0 0 0 1 0 0 54
                0 0 1 1 0 0 0 1 0 0 0 66
                0 0 1 1 0 0 0 1 0 0 0 33
                0 0 1 1 0 0 0 1 0 0 0 53
                1 0 1 0 0 1 0 1 0 0 0 45
                0 0 1 1 0 0 0 1 0 0 0 39
                0 0 1 0 0 1 0 0 1 0 0 48
                0 0 1 0 1 0 0 1 0 0 0 50
                0 0 1 1 0 0 1 0 0 0 0 40
                1 0 1 1 0 0 0 0 1 0 0 47
                0 0 1 0 1 0 0 0 1 0 0 49
                0 0 1 0 1 0 1 0 0 0 0 52
                0 0 1 1 0 0 0 0 0 1 0 50
                1 0 1 0 0 1 0 0 0 1 0 55
                0 0 1 0 1 0 1 0 0 0 0 62
                1 0 1 0 1 0 0 0 0 1 0 75
                0 0 1 0 0 1 1 0 0 0 0 35
                0 0 1 1 0 0 0 1 0 0 0 45
                0 0 1 0 0 1 0 0 1 0 0 75
                0 0 1 0 1 0 1 0 0 0 0 32
                1 0 1 0 0 1 0 0 0 1 0 35
                1 0 1 1 0 0 1 0 0 0 0 75
                1 0 1 0 0 1 0 0 0 0 1 47
                1 0 1 0 0 1 0 0 0 1 0 45
                0 0 1 0 1 0 0 0 1 0 0 45
                1 0 1 0 1 0 0 0 0 1 0 33
                1 0 1 1 0 0 0 0 0 1 0 55
                1 0 1 0 0 1 0 0 0 0 1 02
                0 0 1 1 0 0 1 0 0 0 0 63
                0 0 1 0 1 0 1 0 0 0 0 32
                0 0 1 1 0 0 1 0 0 0 0 36
                0 0 1 0 0 1 0 0 0 0 1 50
                1 0 1 0 1 0 0 0 0 0 1 47
                0 0 1 0 1 0 0 0 0 1 0 50
                0 0 1 1 0 0 1 0 0 0 0 51
                0 0 1 1 0 0 0 0 1 0 0 32
                1 0 1 0 1 0 1 0 0 0 0 53
                0 0 1 1 0 0 0 1 0 0 0 45
                0 0 1 1 0 0 1 0 0 0 0 24
                1 0 1 0 1 0 0 0 0 0 1 42
                1 0 1 0 1 0 0 1 0 0 0 39
                0 0 1 1 0 0 1 0 0 0 0 50
                0 0 1 0 1 0 0 0 1 0 0 27
                1 0 1 1 0 0 0 0 0 1 0 62
                0 0 1 1 0 0 0 0 1 0 0 48
                1 0 1 0 1 0 1 0 0 0 0 48
                0 0 1 1 0 0 0 0 1 0 0 71
                1 0 1 1 0 0 1 0 0 0 0 80
                1 0 1 0 1 0 0 1 0 0 0 60
                0 0 1 0 1 0 0 0 0 1 0 64
                1 0 1 1 0 0 1 0 0 0 0 50
                0 0 1 0 0 1 0 0 0 1 0 79
                1 0 1 0 1 0 0 1 0 0 0 62
                0 0 1 0 1 0 1 0 0 0 0 58
                0 0 1 0 0 1 0 1 0 0 0 40
                1 0 1 0 0 1 0 0 1 0 0 70
                1 0 1 0 1 0 0 0 1 0 0 75
                0 0 1 0 1 0 1 0 0 0 0 55
                1 0 1 0 1 0 0 1 0 0 0 43
                0 0 1 1 0 0 1 0 0 0 0 58
                1 0 1 0 1 0 0 0 0 1 0 45
                1 0 1 0 0 1 0 1 0 0 0 55
                0 0 1 0 1 0 0 0 1 0 0 48
                1 0 1 0 1 0 0 0 0 1 0 46
                1 0 1 0 1 0 0 1 0 0 0 36
                0 0 1 0 1 0 0 1 0 0 0 66
                1 0 1 0 0 1 1 0 0 0 0 30
                1 0 1 0 1 0 0 0 1 0 0 40
                1 0 1 0 1 0 0 1 0 0 0 38
                0 0 1 0 1 0 0 1 0 0 0 60
                1 0 1 0 1 0 0 1 0 0 0 35
                0 0 1 1 0 0 0 0 0 1 0 64
                1 0 1 1 0 0 0 0 1 0 0 80
                1 0 1 1 0 0 0 1 0 0 0 48
                0 0 1 1 0 0 0 1 0 0 0 41
                0 0 1 0 0 1 0 1 0 0 0 45
                1 0 1 1 0 0 0 1 0 0 0 51
                0 0 1 0 1 0 0 0 0 1 0 72
                0 0 1 1 0 0 0 0 0 0 1 60
                0 0 1 1 0 0 0 0 0 1 0 55
                1 0 1 0 0 1 0 0 0 0 1 77
                1 0 1 0 0 1 0 1 0 0 0 45
                1 0 1 0 1 0 0 0 0 0 1 50
                1 0 1 0 1 0 0 0 0 0 1 51
                1 0 1 0 0 1 0 0 0 1 0 34
                0 0 1 0 1 0 1 0 0 0 0 37
                0 0 1 1 0 0 1 0 0 0 0 48
                0 0 1 1 0 0 0 0 1 0 0 25
                1 0 1 0 0 1 1 0 0 0 0 31
                1 0 1 1 0 0 0 0 1 0 0 50
                0 0 1 0 0 1 0 1 0 0 0 40
                0 0 1 0 1 0 0 1 0 0 0 53
                1 0 1 0 1 0 1 0 0 0 0 43
                0 0 1 1 0 0 0 1 0 0 0 29
                0 0 1 1 0 0 0 0 0 1 0 38
                1 0 1 0 1 0 1 0 0 0 0 46
                end
                label values cl_cookfuel yesno
                label def yesno 0 "no", modify
                label def yesno 1 "yes", modify
                label var cl_cookfuel "clean cooking fuel" 
                label var residence_d1 "residence==Urban" 
                label var residence_d2 "residence==Rural" 
                label var caste3_d1 "caste3==STSC" 
                label var caste3_d2 "caste3==OBC" 
                label var caste3_d3 "caste3==Others" 
                label var q_mpce_d1 "q_mpce==Poorest" 
                label var q_mpce_d2 "q_mpce==Poorer" 
                label var q_mpce_d3 "q_mpce==Middle" 
                label var q_mpce_d4 "q_mpce==Richer" 
                label var q_mpce_d5 "q_mpce==Richest"  
                label var age_hh "head age"
                I run the following syntax:
                Code:
                fairlie cl_cookfuel caste3_d2 caste3_d3 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5 age_hh, by(residence_d2) pool
                and

                Code:
                oaxaca cl_cookfuel caste3_d2 caste3_d3 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5 age_hh, by(residence_d2) pool logit
                The explained part differs substantially between the two models. The second method, oaxaca, gives detailed results and coefficients for both the explained and unexplained parts.

                Which method is theoretically appropriate, and how do I calculate the % contributions?

                Thank you - Mukesh


                Best regards,
                Mukesh

                (Stata 15.1 SE)



                • #9
                  Oaxaca-logit gives you log odds, so the effect sizes will be different.

                  The LPM version of Oaxaca-omega (i.e., the pooled model) gives essentially the same main results as Fairlie. The coefficients will differ, but one is a logit and the other is an LPM, so that's expected. That'd be my choice, given all the extra detail oaxaca provides.

                  If you use Fairlie-pooled, the results will not match what my code does. It could be fixed to do so.

                  My code works for me. What Stata version are you using?



                  • #10
                    Stata 15.1 SE

                    Just to clarify: I have added this to the signature in my profile. Is it visible at the end of my post(s)?
                    Best regards,
                    Mukesh

                    (Stata 15.1 SE)



                    • #11
                      The signature appears.

                      I think pulling an element out of a returned matrix with r(table)[x,x] directly may require a version later than 15, but you can get at it another way: it should work to do matrix R = r(table) and then subscript R. Not sure. When I set version 15 (in code, not an actual Stata 15 install), it still works, so it's hard to tell. A minimal sketch of that workaround follows.
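
                      A sketch of the named-matrix workaround, assuming an estimation command has just been run (the names R, b, z, and p here are hypothetical):

                      Code:
                      sysuse auto, clear
                      qui reg price mpg
                      * copy r(table) into a named matrix, then subscript the copy
                      matrix R = r(table)
                      local b = R[1,1]     // coefficient
                      local z = R[3,1]     // test statistic
                      local p = R[4,1]     // p-value
                      di "b = " `b' "  z = " `z' "  p = " `p'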



                      • #12
                        Originally posted by George Ford View Post
                        Oaxaca-logit gives you log odds, so the effect sizes will be different.

                        The LPM version of Oaxaca-omega (i.e., the pooled model) gives essentially the same main results as Fairlie. The coefficients will differ, but one is a logit and the other is an LPM, so that's expected. That'd be my choice, given all the extra detail oaxaca provides.

                        If you use Fairlie-pooled, the results will not match what my code does. It could be fixed to do so.

                        Respected George Ford, I tried to follow the approach you explained, but the results still differ. Could you please elaborate a little more and demonstrate the point I quoted in this post using the dataex data earlier in the thread? It would be really helpful.

                        I have access to Stata 18 SE.

                        Thank you
                        Best regards,
                        Mukesh

                        (Stata 15.1 SE)



                        • #13
                          Code:
                          fairlie cl_cookfuel caste3_d2 caste3_d3 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5 age_hh, by(residence_d2) pool  ro reps(1000)
                          oaxaca cl_cookfuel caste3_d2 caste3_d3 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5 age_hh, by(residence_d2) omega
                          oaxaca cl_cookfuel caste3_d2 caste3_d3 q_mpce_d2 q_mpce_d3 q_mpce_d4 q_mpce_d5 age_hh, by(residence_d2) omega logit
                          Compare the "explained" portion of the latter two to the Fairlie decomposition.

                          These are very different approaches, so they aren't going to match up perfectly.



                          • #14
                            Thank you so much, George Ford, for clarifying.

                            If I understand your post #9 (quoted here) correctly, the better choice is to go with oaxaca with the omega and logit options.

                            Originally posted by George Ford View Post
                            Oaxaca-logit gives you log odds, so the effect sizes will be different.

                            The LPM version of Oaxaca-omega (i.e., the pooled model) gives essentially the same main results as Fairlie. The coefficients will differ, but one is a logit and the other is an LPM, so that's expected. That'd be my choice, given all the extra detail oaxaca provides.
                            Another, hopefully final, small query: what does a large value of the constant in the unexplained part indicate when applying Oaxaca?

                            Thank you - Mukesh
                            Last edited by Mukesh Punia; 25 Feb 2025, 13:42.
                            Best regards,
                            Mukesh

                            (Stata 15.1 SE)



                            • #15
                              oaxaca is a fully interacted model; the constant in the unexplained part plays the role of the group dummy. Before Oaxaca/Blinder, people would measure discrimination based just on that dummy. Oaxaca/Blinder added that discrimination may also occur if, for instance, education is valued differently across groups; that component is also in the unexplained part. A small illustration follows.
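
                              A minimal sketch of the point, using the auto data purely for illustration:

                              Code:
                              sysuse auto, clear
                              * dummy-only approach: discrimination read off the group dummy
                              reg mpg c.weight i.foreign
                              * fully interacted model: the slope on weight may differ by group too
                              reg mpg i.foreign##c.weight
                              * oaxaca puts both the intercept shift and the slope differences
                              * into the unexplained part
                              oaxaca mpg weight, by(foreign)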

