Dear Statalist community,
I am using a linear regression model with multiple independent variables. To account for heteroscedasticity in the error term, I use the vce(robust) option after the regress command.
In the regression output, Stata does not provide the adjusted R-squared; it only reports R-squared.
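For illustration, here is a minimal example of what I mean, using Stata's built-in auto data (the variable choices are arbitrary):

    sysuse auto, clear
    * with vce(robust): the header shows R-squared and Root MSE, but no Adj R-squared
    regress price mpg weight, vce(robust)
    * without vce(robust): the header shows both R-squared and Adj R-squared
    regress price mpg weight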
1) Adjusted R-squared is not reported, presumably because it is no longer reliable. Why exactly? Is there a publication showing why it is no longer reliable?
2) R-squared, the F-test, and Root MSE are still reported. Are they still reliable measures of goodness of fit? Why?
3) Since adjusted R-squared is apparently not meaningful here, how can I assess the improvement in goodness of fit when adding independent variables to the model?
Thank you,
Thomas