  • Generating an hour-of-the-day variable in Stata

    Dear Statalist,

    I have hourly panel data and want to generate an hour-of-the-day variable for further analysis. From the Stata help file and other sources, I learned that the hh() function does this.

    But I did not succeed.

    To be precise, to get the months of the year, I used the following code, and it went well.

    Code:
    gen month_num =month(mdy(month, day, year))
    In the same way, I tried the following to get the hours:

    Code:
    gen hour_num =hh(mdyhms(month, day, year, hour))
    In this case, however, the hour variable has no minutes and seconds; it is just hours 0 to 23.

    But Stata says invalid syntax.

    Could anyone help me out, please?

    Kindest regards,
    Fissha


  • #2
    Why isn't hour already what you need?

    Note that mdyhms() wants 6 arguments and won't tolerate incompleteness.

    No data example here, contrary to our request (FAQ Advice #12), but here is some technique.


    Code:
    * Example generated by -dataex-. To install: ssc install dataex
    clear
    input float(year month day hour)
    2021 2 1  9
    2021 2 1 10
    2021 2 1 11
    2021 2 1 12
    2021 2 1 13
    end
    
    gen double when = mdyhms(month, day, year, hour, 0, 0)
    
    format when %tc
    
    gen wanted = hh(when)
    
    . l
    
         +---------------------------------------------------------+
         | year   month   day   hour                 when   wanted |
         |---------------------------------------------------------|
      1. | 2021       2     1      9   01feb2021 09:00:00        9 |
      2. | 2021       2     1     10   01feb2021 10:00:00       10 |
      3. | 2021       2     1     11   01feb2021 11:00:00       11 |
      4. | 2021       2     1     12   01feb2021 12:00:00       12 |
      5. | 2021       2     1     13   01feb2021 13:00:00       13 |
         |---------------------------------------------------------|



    • #3
      You can extract the minutes and seconds in the same way as you are extracting the hour. The following two datetimes have equivalent times of the day.

      Code:
      di  %tCHH:MM:SS hms(hh(tc(29March2019 20:22:31)), mm(tc(29March2019 20:22:31)), ss(tc(29March2019 20:22:31)))
      di  %tCHH:MM:SS hms(hh(tc(2February2021 20:22:31)), mm(tc(2February2021 20:22:31)), ss(tc(29March2019 20:22:31)))
      Res.:

      Code:
      . di  %tCHH:MM:SS hms(hh(tc(29March2019 20:22:31)), mm(tc(29March2019 20:22:31)), ss(tc(29March2019
      >  20:22:31)))
      20:22:31
      
      . 
      . di  %tCHH:MM:SS hms(hh(tc(2February2021 20:22:31)), mm(tc(2February2021 20:22:31)), ss(tc(29March
      > 2019 20:22:31)))
      20:22:31
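
      In panel data, the same functions can also create new variables directly from a %tc variable such as when in #2 (a quick sketch; the new variable names are just illustrative):

      Code:
      * extract the hour, minute, and second of the day from a %tc datetime
      gen hour_of_day = hh(when)
      gen minute_of_hour = mm(when)
      gen second_of_minute = ss(when)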



      • #4
        Many thanks, Nick and Andrew. Now it works. @Nick, I am sorry for my failure to post sample data; I will do so in my future questions.



        • #5
          Good that you solved your problem, but it's hard to know what it was. On the evidence of #1, hour_num will just return whatever you fed in as the variable hour.
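
          For example, a quick check along these lines with the variables from #1 (a sketch, not run on your data; check is just an illustrative name) would confirm that:

          Code:
          * hh() of a datetime built from hour, with zero minutes and seconds, returns hour itself
          gen double check = mdyhms(month, day, year, hour, 0, 0)
          assert hh(check) == hour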



          • #6
            Thank you so much, Nick. Here is what I am trying to do. I want to estimate an hourly average treatment effect from a DID model and present it in a graph. For that, I first need to identify the number of hourly observations in each year, month, and day before and after the treatment. I went through the following steps.

            First, I ran the code below.

            Code:
            * build the datetime as a double to avoid precision loss (see #2)
            gen double Nr_hour = mdyhms(month, day, year, hour, 0, 0)
            format Nr_hour %tc
            * hour of the day, 0 to 23
            gen hhnum = hh(Nr_hour)
            sort POST_treat hhnum
            * one group per (before/after treatment, hour) combination
            egen allhour = group(POST_treat hhnum)
            * 48 indicator variables HOUR1-HOUR48
            tab allhour, gen(HOUR)
            After running this code, I obtain 48 dummy variables: 24 hour dummies before the treatment and 24 hour dummies after the treatment.
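
            The hourly observation counts before and after the treatment can also be checked directly (a sketch using the variables created above; n_hours is just an illustrative name):

            Code:
            * number of hourly observations before and after the treatment
            tab hhnum POST_treat
            * number of hourly observations within each year-month-day, before/after the treatment
            bysort year month day POST_treat: gen n_hours = _N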

            Then I proceeded with the following code to generate and plot the ATEs.

            Code:
            preserve
            * interaction dummies: treated units in post-treatment hours (HOUR25-HOUR48)
            forvalues i = 25(1)48 {
                gen TREAT_POST_`i' = el_TREAT*HOUR`i'
            }
            drop if wt_TREAT == 1
            egen new_hour = group(year month day hour)
            xtset newid new_hour
            xtreg daily_el TREAT_POST_* POST_treat i.hour daily_temp i.month i.dow, fe cl(newid)
            * save the estimates and plot the hourly ATEs with 95% CIs
            parmest, label list(parm label estimate min* max* p) saving(results, replace)
            use results, clear
            drop in 25/73
            encode parm, generate(parm2)
            drop parm
            rename parm2 parm
            drop label
            gen Hour = _n
            label variable estimate "ATE"
            label variable min95 "lower 95% CI"
            label variable max95 "upper 95% CI"
            twoway rarea min95 max95 Hour || scatter estimate Hour, connect(l)
            restore
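
            The final twoway call above might also be written with a zero reference line and axis titles (a sketch with the same variable names, not run):

            Code:
            * sketch: hourly ATE with its 95% CI band and a zero line for reference
            twoway (rarea min95 max95 Hour) (scatter estimate Hour, connect(l)), ///
                yline(0) xtitle("Hour of day") ytitle("ATE")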

            Here is some sample data.
            Code:
            * Example generated by -dataex-. To install: ssc install dataex
            clear
            input int year byte(month day hour) float(newid electricity POST_treat el_TREAT dow daily_temp)
            2016 11 21 15   4  .497 1 1 1    22.375
            2016  2  6 13   7  .146 0 0 6   22.5625
            2016  3  2 14   9  .233 1 0 3 22.945833
            2017  2 15 16  10  .218 1 1 3   22.4125
            2017  4 30  2  10  .055 1 1 0 22.479166
            2015  7 20  6  14   .06 0 0 1  26.46667
            2015  7  5 12  24  .067 0 1 0  24.05833
            2016  1 28 20  24  .182 0 1 4  22.06667
            2016  5 29 21  30  .087 1 1 0 23.916666
            2016 12 22 16  33  .182 1 1 4      22.2
            2015  4 28 21  35  .159 0 1 2 23.366667
            2015  3 13  6  40  .089 0 1 5  21.68333
            2017  1  2 21  41  .124 1 1 1 24.204166
            2017  1 13 16  47     0 1 1 5 23.366667
            2016  9  7 13  55  .085 1 1 3   17.4375
            2016  7 13 16  67  .114 1 0 3    23.975
            2016  2 10  3  68  .097 0 1 3   21.7625
            2015 12  9  8  74  .174 0 0 3  21.27917
            2017  5  7  2  80  .051 1 0 0    22.275
            2016  3 12 17  81  .054 1 1 6    22.125
            2016 10 26  7  82  .089 1 1 3 22.695833
            2016 11 22 12  94  .194 1 1 2 21.329166
            2016 12 25 23 102  .055 1 0 0  7.758333
            2015  4 23  6 120  .046 0 0 4  22.15417
            2015 12 26  6 124  .112 0 1 6    20.875
            2016 12 22  3 141   .39 1 0 4      22.2
            2016  1 20 14 148  .049 0 0 3    20.625
            2016  8 15  5 153  .046 1 0 1 21.170834
            2016 11 27 10 156  .564 1 1 0   20.1375
            2016  7 16  5 158  .092 1 0 6   25.0875
            2017  5  3  9 162  .052 1 0 3   23.3875
            2016 11  7  4 169  .088 1 1 1  22.03333
            2016  6  9 15 171  .072 1 0 4  23.30833
            2015  7 27  9 176  .085 0 0 1   24.0875
            2015 10  3  9 184   .07 0 0 6  21.78333
            2015  8  8 23 191  .366 0 0 6  24.40833
            2015  6 13 20 192  .072 0 0 6        23
            2015  8 13 12 193  .058 0 0 4     24.35
            2015 12  6  3 195  .066 0 0 0 21.666666
            2016 11 13 16 196  .353 1 1 0 22.416666
            2015  4 16 17 197 1.194 0 0 4 23.391666
            2015 11  2 14 202  .263 0 1 1    23.775
            2015  5 27  7 213  .318 0 0 3  23.34583
            2016  1  7  0 214  .183 0 0 4    22.225
            2016  9 21 16 214   .12 1 0 3      23.1
            2015  8 10  8 219     0 0 0 1 23.304167
            2016  9 17 17 221  .082 1 0 6 23.383333
            2015  4 20 16 231  .043 0 1 1  22.90417
            2016  9 11  3 235  .064 1 0 0    23.675
            2015  5 12  3 240  .068 0 1 2 23.858334
            2016  2  6 15 256  .069 0 1 6      22.3
            2015  8 15 16 265  .082 0 0 6   23.2625
            2015  6 30  3 274  .029 0 0 2    23.875
            2015  5 19 12 276  .045 0 0 2  22.30833
            2017  1  3  6 277   .06 1 0 2 21.483334
            2017  5  2  9 290  .137 1 0 2 22.641666
            2017  4 28  1 291  .084 1 0 5 23.479166
            2016 11 27 17 296  .184 1 0 0    20.775
            2016 10 27 23 301  .116 1 0 4  21.65833
            2016  6  2 14 302  .041 1 0 4  25.21667
            2016  6 27  6 309  .197 1 0 1  25.46667
            2016  4 19 20 315  .206 1 0 2   23.1875
            2016 10 24  6 315  .214 1 0 1  22.80833
            2016  1 24  5 316  .033 0 0 0  21.40833
            2016 11  7 17 323  .039 1 0 1  19.90833
            2016 10 14 17 326  1.18 1 0 5 22.916666
            2016  9  1  1 327  .085 1 0 4 22.395834
            2017  5  8 23 333  .103 1 0 1 23.645834
            2015  9 14  6 343  .042 0 0 1 23.304167
            2017  1  1  7 348  .043 1 0 0      20.9
            2015  9  5 14 352  .038 0 0 6 22.983334
            2017  1 13 14 354  .822 1 0 5  22.93333
            2016 10  6  1 370  .016 1 0 4     23.55
            2015 11  9 22 371     0 0 0 1  21.94167
            2015 12  7 19 374  .591 0 0 1 21.895834
            2016  6 30 12 393  .186 1 0 4  24.19167
            2015  8  4  4 406     0 0 0 2 23.866667
            2015 10 13  9 408  .037 0 0 2    22.875
            2015  3 16  0 410     0 0 0 1 20.195833
            2016  5 28 23 415  .436 1 0 6      21.8
            2015  6 25 17 426     0 0 0 4 23.741667
            2016 10 29  7 426  .095 1 0 6 21.479166
            2015  4 21  1 429     0 0 0 2 21.291666
            2016  8  1  2 433  .077 1 0 1 25.141666
            2017  2  5 16 440    .2 1 0 0 23.229166
            2015 12 27 12 445  .471 0 0 0   21.2125
            2016  3 28  6 454  .053 1 0 1 24.883333
            2017  2 26 12 462  .493 1 0 0  21.15833
            2017  5 10 18 471  .094 1 0 3 22.458334
            2017  5 19 15 472  .032 1 0 5  23.31667
            2015 11  3 22 484  .034 0 0 2      20.8
            2015  7 20  5 485     0 0 0 1       6.5
            2016  3 17  2 486  .049 1 0 4    22.175
            2017  2 15  7 492  .107 1 0 3 22.333334
            2017  5 14 22 492  .862 1 0 0   21.0375
            2016  2 19  5 494   .08 0 0 5    19.675
            2015  8 19 20 496     0 0 0 3  24.75417
            2016  1 25 23 501  .504 0 0 1 22.045834
            2016  2 24 18 507  .327 0 0 3  20.28333
            2015 12 24  4 521     0 0 0 4  20.49583
            end

            But I am still not sure whether this is the correct way to do it. Could you help me out? Many thanks!
