  • HELP my intercept becomes significant

    Hi all,

    I ran 12 regressions on the bid premium in an event study. For the first 7 regressions, as I add explanatory variables (after the control variables in the first regression), the constant (intercept) stays insignificant and accounts for less as variables are added. But from regression 7 onward, when I add variables that have a significant effect on the bid premium, the constant becomes significant as well.

    This feels very counterintuitive, since the intercept (or constant) is the mean of the response when all predictors are zero.

    Does anyone have a logical explanation for this? Or an article I could refer to? A snapshot of the results is in the attached file.

    Many thanks.

  • #2
    You may think about centering the variables; this way the intercept will reflect the expected response at the mean values of the predictors. That being said, the constant shouldn't be a matter of much concern, since oftentimes zero values for the predictors are an abstraction.
    Best regards,

    Marcos
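Marcos's centering suggestion can be sketched with a small numeric example (a minimal NumPy simulation with made-up data, not the poster's actual event-study variables): centering a predictor moves the intercept to the sample mean of the response while leaving the slope untouched.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(5.0, 1.0, n)                   # predictor whose values sit far from zero
y = 2.0 + 0.8 * x + rng.normal(0.0, 0.5, n)   # made-up data-generating process

# Raw predictor: the intercept is E[y | x = 0], an extrapolation outside the data
X_raw = np.column_stack([np.ones(n), x])
b_raw, *_ = np.linalg.lstsq(X_raw, y, rcond=None)

# Mean-centered predictor: the intercept is E[y] at the mean of x
X_cen = np.column_stack([np.ones(n), x - x.mean()])
b_cen, *_ = np.linalg.lstsq(X_cen, y, rcond=None)

print(b_raw[1], b_cen[1])   # slopes are identical
print(b_cen[0], y.mean())   # centered intercept equals the sample mean of y
```

In Stata the same effect can be had by regressing on demeaned variables; the slopes and their tests are unchanged, only the intercept's meaning shifts.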



    • #3
      Originally posted by Marcos Almeida View Post
      You may think about centering the variables; this way the intercept will reflect the expected response at the mean values of the predictors. That being said, the constant shouldn't be a matter of much concern, since oftentimes zero values for the predictors are an abstraction.
      Thanks. Any references that would support this last statement?



      • #4
        Originally posted by Joris Zee View Post

        Thanks. Any references that would support this last statement?
        I can't speak for Marcos. However, in regression, you are testing each beta against the null hypothesis that the beta equals zero. The intercept is just one more beta. Whatever test is being applied to the intercept is also a test against that null hypothesis. You can see this in your own set of results; generally, the higher the value of the intercept, the more likely it has stars attached.

        The intercept is also irrelevant in most contexts. As Marcos said, it just represents the expected value of the dependent variable when every predictor is set to zero. Knowing whether this expected value is zero is very unlikely to be meaningful.
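The point that the intercept is "just one more beta" can be illustrated with a hand-rolled OLS sketch on simulated data (variable names and coefficients are invented for the illustration): every coefficient, intercept included, gets the same t = beta / se test against zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.normal(3.0, 1.0, n)                    # hypothetical predictor
y = 1.5 + 0.5 * x + rng.normal(0.0, 1.0, n)    # hypothetical response

X = np.column_stack([np.ones(n), x])           # the intercept is just one more column of the design matrix
beta = np.linalg.solve(X.T @ X, X.T @ y)       # OLS coefficients
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])      # residual variance
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se                                  # identical formula for every coefficient
print(t)                                       # t[0] tests H0: intercept = 0, t[1] tests H0: slope = 0
```

This is exactly the test that puts stars on the constant in regression output: nothing special happens for the intercept, it is simply compared against zero like any other beta.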
        Be aware that it can be very hard to answer a question without sample data. You can use the dataex command to share it. Type help dataex at the command line.

        When presenting code or results, please use code delimiters to format them. Use the # button on the formatting toolbar, between the " (double quote) and <> buttons.



        • #5
          I believe Weiwen gave a fully clarifying reply.

          I just wish to underline that, in several fields - if not all of them - it shouldn't be a surprise to find an intercept different from zero.

          This is something found in any decent introduction to regression analysis, but let's use a toy example:

          Take a regression of weight (yvar) on height as the predictor. Surely we will get a positive coefficient: for each increase of, say, 1 metre, the weight rises by, well, you name it. Guess what happens when the height reaches an (impossible) zero value? The model will still predict a given weight (quite possibly with a significant p-value) for such bizarre creatures...
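The weight-versus-height toy example can be simulated (sample size and coefficients invented for illustration): the fitted intercept, i.e. the predicted weight at zero height, comes out far from zero and strongly "significant" even though it describes an impossible creature.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
height = rng.normal(1.75, 0.08, n)                          # heights in metres
weight = -100.0 + 100.0 * height + rng.normal(0.0, 8.0, n)  # weights in kg; invented coefficients

# OLS of weight on height with an intercept column
X = np.column_stack([np.ones(n), height])
beta = np.linalg.solve(X.T @ X, X.T @ weight)
resid = weight - X @ beta
sigma2 = resid @ resid / (n - 2)
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
print(beta[0], beta[0] / se[0])   # intercept ("weight at zero height"): large, significant, meaningless
```

The stars on the intercept are real statistically but say nothing of scientific interest, which is the sense in which zero-valued predictors are an abstraction.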
          Best regards,

          Marcos
