Hello everyone, I am running an Oaxaca-Blinder decomposition with the following command:
oaxaca SALHOUR AGE HOMEWORK HUSUAL ISCO2 ISCED CHILD18 URBAN LNAIS [pweight = EXTRID], by(GENDER)
The output only shows the decomposition at the aggregate level, but I need the contribution of each covariate separately. How do I have to adapt the code?
-------------------------------------------------------------------------------
              |               Robust
      SALHOUR | Coefficient  std. err.      z    P>|z|    [95% conf. interval]
--------------+----------------------------------------------------------------
Differential  |
 Prediction_1 |   14.48734   .0583054   248.47   0.000     14.37306    14.60161
 Prediction_2 |   13.23098   .0500645   264.28   0.000     13.13285     13.3291
   Difference |   1.256357   .0768503    16.35   0.000     1.105733     1.40698
--------------+----------------------------------------------------------------
Decomposition |
   Endowments |  -.8192414   .0527442   -15.53   0.000    -.9226182   -.7158645
 Coefficients |   1.984337   .0775695    25.58   0.000     1.832304     2.13637
  Interaction |   .0912607   .0501451     1.82   0.069    -.0070218    .1895433
-------------------------------------------------------------------------------
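From the help file I suspect that the detail option of the user-written oaxaca package (Jann, from SSC) is what reports the per-covariate results, so I guess the adapted call would look something like the sketch below, but I am not sure whether this is the right way:

* sketch only, assuming the SSC oaxaca package; the detail option should list
* the endowments/coefficients/interaction components for each covariate
oaxaca SALHOUR AGE HOMEWORK HUSUAL ISCO2 ISCED CHILD18 URBAN LNAIS ///
    [pweight = EXTRID], by(GENDER) detail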
Thanks a lot!