  • Time series regression with static control variables

    Hey guys!

    I have a problem running a simple regression with time series data, and I would be really grateful for some help. I have daily return data from 2001 until 2016, and I'm running my regression with 4 independent variables, for which I have data for the whole period. Up to this point there haven't been any problems.

    But in order to improve the accuracy of my regression, I would like to add some more control variables as independent variables. Unfortunately, these potential control variables are all static, so I have only one observation for each of them. If I assume they do not change over time (so I would have the same value for every day in the period) and run the regression, Stata tells me that they are "omitted because of collinearity", which of course makes sense. I've been searching a lot for a potential solution, but could not find anything satisfying, just a lot about this problem in connection with dummy variables, which is not appropriate in my case.

    Does anyone have an idea for solving this problem? Is my approach practicable at all with time series data?

    Thanks a lot in advance!

    Best wishes,

    Tobias



    Code:
    . regress v1 v2 v3 v4 v5 control1 control2
    note: control1 omitted because of collinearity
    note: control2 omitted because of collinearity
    
          Source |       SS       df       MS              Number of obs =    2830
    -------------+------------------------------           F(  4,  2825) =    0.46
           Model |  .000392727     4  .000098182           Prob > F      =  0.7685
        Residual |  .609087801  2825  .000215606           R-squared     =  0.0006
    -------------+------------------------------           Adj R-squared = -0.0008
           Total |  .609480528  2829   .00021544           Root MSE      =  .01468
    
    ------------------------------------------------------------------------------
              v1 |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
              v2 |  -.0001389   .0003089    -0.45   0.653    -.0007446    .0004669
              v3 |   .0003323   .0007064     0.47   0.638    -.0010529    .0017174
              v4 |   .0007024   .0006879     1.02   0.307    -.0006464    .0020512
              v5 |  -.0000796   .0003927    -0.20   0.839    -.0008497    .0006904
        control1 |          0  (omitted)
        control2 |          0  (omitted)
           _cons |  -.0045068   .0002765   -16.30   0.000    -.0050489   -.0039647
    ------------------------------------------------------------------------------
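
    The "omitted because of collinearity" notes are exactly what you should expect here: a control that takes the same value on every day is an exact multiple of the constant column that Stata includes for the intercept, so the design matrix is rank-deficient and the control's coefficient cannot be identified. A minimal sketch of the mechanism (in Python/NumPy, with made-up stand-in variables, not the poster's actual data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2830  # same number of daily observations as in the posted output

# Hypothetical stand-ins: four time-varying regressors and one "static"
# control that takes a single value for every observation in the sample.
X_varying = rng.normal(size=(n, 4))
control1 = np.full((n, 1), 3.7)   # constant over the whole period
const = np.ones((n, 1))           # the intercept column the regression adds

X = np.hstack([const, X_varying, control1])

# control1 is an exact multiple of the intercept column, so the design
# matrix has 6 columns but only rank 5: perfect collinearity, and OLS
# must drop one of the two collinear columns.
rank = np.linalg.matrix_rank(X)
print(X.shape[1], rank)  # 6 5
```

    The same logic holds no matter how many observations you collect on the same series: as long as the control never varies within the sample, it stays collinear with the intercept.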

  • #2
    If your question is about how to solve the multicollinearity problem, you could try to get more data; increasing the sample size might give you some variation in those variables that you could exploit.



    • #3
      Dear Tobias,
      How did you finally solve the collinearity problem in your regression model? I have the same problem as you.
      Best

