  • How to save Point Estimates and Standard Errors in a .dta file from a Regression Loop?

    Statalisters,

    I am trying to run the following regression loop for different disaster types and then save the point estimates and SEs from each regression in separate .dta files for later use in an event-study plot. However, the code below produces output in which the SEs and CIs for the t = 0 term are dropped altogether. Can anyone suggest a fix? I have been struggling with this for quite some time and have tried many code alterations, but nothing works, and I cannot figure out where I am going wrong.

    [Attached screenshot Output.png: regression output showing the missing SEs and CIs at the t = 0 term]

    Code:
    * Define disaster variable names
    local vars "Major_Hurr Maj_Tropical_Storm Maj_Thunderstorm_Wind Maj_Tornado Maj_Flash_Flood Maj_Flood"

    foreach x of local vars {
        use "$path0/Sample2_Ready.dta", clear

        * Set reference period
        *replace `x'_F1_2 = 0

        * Run regression
        quietly reghdfe Index_Crime_p ///
            `x'_F9_10 `x'_F7_8 `x'_F5_6 `x'_F3_4 `x'_F1_2 ///
            `x' `x'_L1_2 `x'_L3_4 `x'_L5_6 `x'_L7_8 `x'_L9_10, ///
            absorb(i.fips i.year ///
                i.year#c.Share_Police_1980 ///
                i.year#c.Share_IndexCrime_1980 ///
                i.year#c.Share_Black_1980 ///
                i.year#c.Share_Male_1980 ///
                i.year#c.Share_YoungNoHS_1980 ///
                i.year#c.Share_Unemp_1980 ///
                i.year#c.Share_BPL_1980 ///
                i.year#c.Log_Pop_Density_1980 ///
                i.year#c.Log_CoastDist_1980 ///
                i.year#c.Log_AvgWages_1980) ///
            vce(cluster fips)

        * Capture coefficients
        foreach var in `x' `x'_L1_2 `x'_L3_4 `x'_L5_6 `x'_L7_8 `x'_L9_10 ///
            `x'_F1_2 `x'_F3_4 `x'_F5_6 `x'_F7_8 `x'_F9_10 {
            scalar `var' = _b[`var']
            scalar se_`var' = _se[`var']
            scalar low_`var' = _b[`var'] - invttail(e(df_r), 0.025) * _se[`var']
            scalar high_`var' = _b[`var'] + invttail(e(df_r), 0.025) * _se[`var']
        }

        * Create 11-row event-study data
        clear
        set obs 11
        gen t = _n - 6

        local tvals 0 1 2 3 4 5 -1 -2 -3 -4 -5
        local bvars `x' `x'_L1_2 `x'_L3_4 `x'_L5_6 `x'_L7_8 `x'_L9_10 ///
            `x'_F1_2 `x'_F3_4 `x'_F5_6 `x'_F7_8 `x'_F9_10

        gen b_`x' = .
        gen se_`x' = .
        gen low_`x' = .
        gen high_`x' = .

        forvalues k = 1/11 {
            local tval : word `k' of `tvals'
            local bvar : word `k' of `bvars'

            replace b_`x' = `bvar' if t == `tval'
            replace se_`x' = se_`bvar' if t == `tval'
            replace low_`x' = low_`bvar' if t == `tval'
            replace high_`x' = high_`bvar' if t == `tval'
        }

        * Rescale time
        replace t = t * 2

        order t b_* se_* low_* high_*
        sort t

        * Save output
        save "$path6/Disaster_`x'_FullSample.dta", replace
    }


  • #2
    This is a lot of code to look at. In addition, you have shown no example data, so it's not possible for anybody to try it out and see what's going on in detail when you run it. Now, if you just run -dataex-, I suspect you will not get an adequate example because there are 78 variables already mentioned in the regression command inside the loop and -dataex- output cannot deal with that many.
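
    Purely as an illustration of what a trimmed -dataex- call might look like once the example is pared down (the variable list and the count() value here are only placeholders taken from the names in your code):

    Code:
    * Illustrative only: list just the variables the simplified example needs
    * and cap the number of observations with -dataex-'s count() option.
    dataex fips year Index_Crime_p Major_Hurr Major_Hurr_F1_2 Major_Hurr_L1_2, count(50)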

    My suggestion is that you create a smaller version of this problem. Use just two types of disaster, and instead of variables running from L1-L10 and F1-F10, make it L1-L2 and F1-F2. Make t run from -2 to 2, and adjust the loops accordingly to cover 5 values instead of 11. See if you can get -dataex- to produce an example that way that reproduces the problem you are having. If that is still too big, leave out all the variables in the interaction terms of the -absorb- option and try again. Make sure that whatever example data you post does reproduce the problem you are having; that is, run it yourself first and verify that you get the same difficulty.
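
    To make that concrete, here is a rough sketch of what the downsized loop might look like; it keeps the paired _F1_2/_L1_2 naming from your code (so one lead and one lag term per disaster), strips the -absorb- option down to the two fixed effects, and is meant only as an outline of the structure, not as tested code:

    Code:
    * Rough sketch of the reduced test case: two disaster types, one lead and
    * one lag pair, fixed effects only. All names follow the original post.
    local vars "Major_Hurr Maj_Flood"

    foreach x of local vars {
        use "$path0/Sample2_Ready.dta", clear

        quietly reghdfe Index_Crime_p `x'_F1_2 `x' `x'_L1_2, ///
            absorb(i.fips i.year) vce(cluster fips)

        foreach var in `x' `x'_L1_2 `x'_F1_2 {
            scalar `var'    = _b[`var']
            scalar se_`var' = _se[`var']
        }

        * ...then rebuild the small event-study file exactly as in the original
        * code, with one row per coefficient instead of 11, and check that the
        * problem at t = 0 still appears before posting the -dataex- example.
    }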

    Also, please post the (simplified) code within code delimiters so that things line up nicely. You're going on your 120th post here, and I think by now you probably know about that. The more you help those who want to help you, the more they are able to help you.
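
    For reference, code delimiters here are just [CODE] and [/CODE] tags wrapped around the code when you compose the post, for example:

    Code:
    [CODE]
    local vars "Major_Hurr Maj_Flood"
    display "`vars'"
    [/CODE]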
