  • How to interpret Prob>chi2 n.s.

    Dear all,

    I have checked the existing post on this issue and have still been unable to resolve my queries. Please could you help? I would really appreciate it.

    I have a query about interpreting logistic regression when the prob>chi2 is >.05.

    1. If in the simplest logistic model, logit y x, the prob>chi2 is >.05, i.e. non-significant:
    (a) does this mean that I cannot use this model?
    (b) could I make a judgement that x does NOT predict y? Or do I not have enough information to say that (absence of evidence vs evidence of absence)?

    2. If in a larger model, logit y x b, the prob>chi2 BECOMES >.05 when I include the extra covariate 'b', but in the basic model it is <.05:
    (a) does this mean that I should remove 'b' and stick with the basic model, where the prob>chi2 remains significant?
    (b) again, could I make a judgement that including 'b' means that the larger model does not account for changes in y? Or do I not have enough information to say that?

    Many thanks!

    Rebekah

  • #2
    The prob > chi2 statistic for the overall model is a test of the joint null hypothesis that all of the regression coefficients (other than the constant term) are zero. That is not the same as testing each coefficient separately. Unless your research question is specifically about the joint null hypothesis of the coefficients of x and b both being zero, the overall chi square for the model should be ignored. It's an answer to a question that is not being asked.
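
    In Stata terms, that joint null is what -test- examines after estimation. A minimal sketch, with hypothetical variables y, x, and b (an illustration, not code from this thread):
    Code:
    * the overall model chi2 answers this joint question, not the separate
    * questions answered by each coefficient's individual z-test
    logit y x b
    test x b    // Wald test of H0: _b[x] = 0 and _b[b] = 0 jointly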

    In your original model with only x as a predictor, the p-value for the overall model (prob > chi2) will be approximately equal to the p-value shown for the coefficient of x. (In theory they should be identical, but sometimes there are some rounding issues.) But all that tells you is that your data are not able to give you a sufficiently precise estimate of the effect of x on y to determine whether it is positive or negative. That might be a sample size issue, or it might be due to noise in the data. Or it might be that the effect of x is just small; too small at least to be quantified with this kind of study. If you had a formal power analysis you might be able to tease out some of what's going on here, perhaps eliminating inadequate sample size as an explanation. But without that, you just can't say anything sharper than "the data could not identify a positive or negative effect."
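
    To see that near-equivalence directly, a minimal sketch with hypothetical y and x, assuming the model is fit with default (non-robust) standard errors so that e(chi2) holds the LR statistic:
    Code:
    * with a single predictor, the overall model test and the test of _b[x]
    * address the same hypothesis (the LR and Wald versions differ slightly)
    logit y x
    display "overall model p-value: " chi2tail(e(df_m), e(chi2))
    display "coefficient p-value:   " 2*normal(-abs(_b[x]/_se[x]))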

    You cannot conclude that x has no effect on y. That is a very common mistake, but a mistake it is.

    Among the possible sources of noise in the analysis is uncontrolled variation due to other factors. Including those other factors (like variable b) in the model may serve to reduce that extraneous variation enough that the model can now give you a more precise estimate (at least precise enough to determine the direction of the effect) than you got with x alone.
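
    A quick way to see whether adding b buys precision, again with hypothetical variables:
    Code:
    * compare the precision of x's estimate without and with the covariate b
    logit y x
    display "std. error of _b[x], x alone: " _se[x]
    logit y x b
    display "std. error of _b[x], with b:  " _se[x]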

    There is no universal agreement on how to decide which variables to include in a model. In my view, such decisions should be made before the data analysis, based on theory and perhaps on previous studies. I would never endorse removing a variable because it is "not significant," nor including one because it is significant. In exploratory studies aimed at developing a model for later testing, I might look to simplify a model by removing variables that don't make any real practical contribution to the predicted values of the outcome variable (because when you multiply the coefficient by the range of that variable you only get tiny adjustments to the predicted value), but, even in this case, I would ignore the p-value. But others do things differently.
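
    For the practical-contribution check just described, a minimal sketch (hypothetical variables; what counts as a "tiny" adjustment is a judgment call, not a rule):
    Code:
    * size up b's practical contribution: its coefficient times its observed range
    logit y x b
    summarize b if e(sample)
    display "log-odds swing over the range of b: " _b[b]*(r(max) - r(min))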



    • #3
      Thank you, Clyde. That was VERY helpful.



      • #4
        Quoting #2:
        "The prob > chi2 statistic for the overall model is a test of the joint null hypothesis that all of the regression coefficients (other than the constant term) are zero. That is not the same as testing each coefficient separately. Unless your research question is specifically about the joint null hypothesis of the coefficients of x and b both being zero, the overall chi square for the model should be ignored. It's an answer to a question that is not being asked."
        I have another issue related to that: my model shows
        Wald chi2(3) = .
        and
        Prob > chi2 = .
        However, my results look good compared to other studies.

        How should I deal with something like that?

        I ran a logit regression on panel data with vce(cluster syear).
        (Edit to the logit regression: I used the i. prefix for every binary independent variable and the c. prefix for every metric or ordinal independent variable, for easier interpretation.)


        Code:
        logit partyx i.EGP_0 i.EGP_I i.EGP_II i.EGP_IVabc i.EGP_VIIab i.EGP_IIIa i.EGP_IIIb i.EGP_VI i.poor c.inc5 i.prekaer i.unempl c.subjIn c.migra c.demo c.auth c.crime c.polst c.civi c.clima c.age c.educ i.sex i.region c.syear, vce(cluster syear) or

        note: 1.EGP_VI omitted because of collinearity

        Iteration 0: log pseudolikelihood = -2738.4324
        Iteration 1: log pseudolikelihood = -2610.5889
        Iteration 2: log pseudolikelihood = -1866.7332
        Iteration 3: log pseudolikelihood = -1812.7968
        Iteration 4: log pseudolikelihood = -1812.0355
        Iteration 5: log pseudolikelihood = -1812.0348
        Iteration 6: log pseudolikelihood = -1812.0348

        Logistic regression                  Number of obs =  16,969
                                             Wald chi2(3)  =       .
                                             Prob > chi2   =       .
        Log pseudolikelihood = -1812.0348    Pseudo R2     =  0.3383

        (Std. Err. adjusted for 4 clusters in syear)

        Code:
        * Example generated by -dataex-. To install: ssc install dataex
        clear
        input float(partyx EGP_0 EGP_I EGP_II EGP_IVabc EGP_VIIab EGP_IIIa EGP_IIIb EGP_VI poor inc5 prekaer unempl subjIn migra demo auth crime polst civi clima age educ sex region)
        . 0 0 0 0 0 0 1 0 0 2 0 0  6 2  7 3 3 5 2 2 63 2 1 0
        0 0 0 0 0 0 0 1 0 0 2 0 0  6 3  7 3 3 5 2 2 64 2 1 0
        0 0 0 0 0 0 0 1 0 0 2 0 0  5 2  6 4 2 5 3 3 65 2 1 0
        0 0 0 0 0 0 0 1 0 0 2 0 0  6 2  6 4 2 5 3 3 66 2 1 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  9 1  8 2 1 4 3 2 68 4 0 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  9 2  8 2 2 4 3 2 69 4 0 0
        0 0 0 1 0 0 0 0 0 0 5 0 0 10 3  8 1 2 4 4 2 70 4 0 0
        0 0 0 1 0 0 0 0 0 0 5 0 0 10 2  8 1 2 4 4 2 71 4 0 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  9 1  7 1 1 4 4 2 68 4 1 0
        0 0 0 1 0 0 0 0 0 0 4 0 0  6 1  7 1 1 4 4 2 69 4 1 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  8 2  8 1 1 4 3 2 70 4 1 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  7 2  8 1 2 4 3 2 71 4 1 0
        0 0 0 0 0 0 0 0 1 0 3 0 0  4 3  5 2 3 5 4 2 59 2 0 0
        . 0 0 0 0 0 0 0 1 0 2 0 0  4 3  5 2 2 5 4 2 60 2 0 0
        . 0 0 0 0 0 0 0 1 1 1 0 0  3 3  5 3 3 5 4 1 61 2 0 0
        0 0 0 0 0 0 0 0 1 0 3 0 0  2 3  5 3 3 5 4 2 62 2 0 0
        . 0 0 0 0 0 1 0 0 0 3 1 0  6 3  3 3 2 6 4 3 58 2 1 0
        . 0 0 0 0 0 1 0 0 0 2 1 0  5 3  3 3 3 6 4 3 59 2 1 0
        . 0 0 0 0 0 1 0 0 1 1 1 0  5 3  4 1 3 6 4 3 60 2 1 0
        . 0 0 0 0 0 1 0 0 0 3 1 0  5 3  4 1 3 6 4 3 61 2 1 0
        . 0 0 0 0 0 0 1 0 0 4 1 0  4 2  4 4 3 5 2 3 32 . 1 1
        . 0 0 0 0 0 0 1 0 0 5 1 0  4 3  4 4 3 5 2 3 33 . 1 1
        . 0 0 0 0 0 0 1 0 0 5 1 0  6 2  8 4 2 5 2 2 34 . 1 1
        . 0 0 1 0 0 0 0 0 0 3 1 0  8 1  8 4 3 5 2 3 35 . 1 1
        0 0 0 1 0 0 0 0 0 0 5 0 0  7 1  3 1 2 4 4 1 45 3 0 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  5 1  3 1 2 4 4 1 46 3 0 0
        0 0 0 1 0 0 0 0 0 0 5 0 0  6 1  6 2 2 4 3 1 47 3 0 0
        0 0 1 0 0 0 0 0 0 0 5 0 0  6 1  6 2 2 4 3 1 48 3 0 0
        0 0 1 0 0 0 0 0 0 0 5 0 0  8 3  7 4 2 6 1 2 84 4 0 0
        0 0 1 0 0 0 0 0 0 0 5 0 0  9 2  7 4 2 6 1 2 85 4 0 0
        0 1 0 0 0 0 0 0 0 0 5 0 0  3 2  8 3 2 8 4 2 76 3 1 0
        0 1 0 0 0 0 0 0 0 0 5 0 0  7 2  8 3 2 8 4 2 77 3 1 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  8 3  5 1 3 7 3 2 76 2 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  7 3  5 1 3 7 3 3 77 2 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  6 3  6 2 3 7 3 2 78 2 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  8 3  6 2 3 7 3 3 79 2 0 0
        0 0 0 0 0 0 0 0 1 0 3 0 0  8 2  5 2 3 6 4 2 72 4 0 0
        0 0 0 0 0 0 0 0 1 0 3 0 0  7 2  5 2 3 6 4 2 73 4 0 0
        0 0 0 0 0 0 0 0 1 0 4 0 0  7 3  3 3 3 6 4 2 74 4 0 0
        0 0 0 0 0 0 0 0 1 0 3 0 0  7 3  3 3 3 6 4 2 75 4 0 0
        . 0 0 0 0 0 0 1 0 0 3 0 0  6 3  7 4 3 9 4 1 64 3 1 0
        . 0 0 0 0 0 0 1 0 0 3 0 0  6 3  7 4 3 9 4 1 65 3 1 0
        . 0 0 0 0 0 0 1 0 0 4 0 0  6 3  4 4 3 9 2 2 66 3 1 0
        . 0 0 0 0 0 0 1 0 0 3 0 0  7 3  4 4 3 9 2 2 67 3 1 0
        . 0 0 0 0 0 1 0 0 1 1 0 1  0 3  0 4 3 8 3 2 42 3 1 0
        . 0 0 0 0 0 1 0 0 0 2 0 0  0 3  0 4 3 8 3 2 43 3 1 0
        0 0 0 0 0 0 1 0 0 1 1 0 0  0 3  0 4 3 8 3 3 44 3 1 0
        0 0 0 0 0 1 0 0 0 0 5 0 0  5 1 10 2 1 7 4 2 77 2 0 0
        . 0 0 0 0 1 0 0 0 0 3 0 0 10 1 10 2 1 7 4 1 78 2 0 0
        0 0 0 0 0 0 0 0 1 0 3 0 0  7 3  5 2 2 5 4 2 50 2 0 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  8 3  5 2 3 5 4 2 51 2 0 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  6 3  8 4 3 5 1 3 52 2 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  8 1  5 3 1 3 3 2 54 3 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  7 1  5 3 1 3 3 2 55 3 0 0
        . 0 0 0 0 0 1 0 0 0 4 0 0  8 2  8 3 1 3 3 3 56 3 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  7 1  8 3 1 3 3 2 57 3 0 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  . 2  5 4 3 5 2 3 54 3 1 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  . 2  5 4 3 5 2 3 55 3 1 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  5 2  4 4 3 5 2 3 56 3 1 0
        0 0 0 0 0 0 1 0 0 0 4 0 0  3 3  4 4 3 5 2 3 57 3 1 0
        . . . . . . . . . . . . 1  . 3  . . . . . . 20 . . 0
        . . . . . . . . . . . . 1  . 3  . . . . . . 21 . . 0
        . . . . . . . . . . . . 1  . 3  . . . . . . 22 . . 0
        . . . . . . . . . . . . 1  . 3  . . . . . . 23 . . 0
        0 0 0 0 0 0 0 1 0 0 4 1 0  1 3  3 3 1 6 4 3 20 4 1 0
        . 0 0 0 0 0 0 1 0 0 4 0 0  . 3  3 3 1 6 4 3 21 4 1 0
        . 0 0 0 1 0 0 0 0 0 4 1 0  2 3  3 3 2 6 4 3 22 4 1 0
        . 0 0 0 1 0 0 0 0 0 4 1 0  1 3  3 3 1 6 4 1 23 4 1 0
        . 0 0 0 0 0 0 1 0 0 3 1 0  7 3  4 1 3 5 4 2 44 3 1 0
        . 0 0 1 0 0 0 0 0 0 3 1 0  8 3  4 1 3 5 4 2 45 3 1 0
        . 0 0 1 0 0 0 0 0 0 3 1 0  8 3  5 4 3 5 3 2 46 3 1 0
        . 0 0 1 0 0 0 0 0 0 2 1 0  7 3  5 4 3 5 3 2 47 3 1 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  7 3  4 2 3 4 4 2 65 3 0 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  6 3  4 2 3 4 4 3 66 3 0 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  6 3  7 4 3 4 3 3 67 3 0 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  7 3  7 4 3 4 3 3 68 3 0 0
        0 0 0 0 0 0 0 0 1 0 4 0 0  4 3  4 4 2 5 4 2 64 2 1 0
        0 0 0 0 0 0 0 0 1 0 4 0 0  5 3  4 4 2 5 4 1 65 2 1 0
        0 0 0 0 0 0 0 0 1 0 4 0 0  6 3  7 4 3 5 3 2 66 2 1 0
        . 0 0 0 0 0 0 0 1 0 4 0 0  5 3  7 4 . 5 3 . 67 2 1 0
        . 0 0 0 0 0 0 1 0 0 2 1 0  8 3  5 4 3 2 3 3 50 3 1 0
        . 0 0 0 0 0 0 1 0 0 2 1 0  7 2  5 4 3 2 3 2 51 3 1 0
        . 0 0 0 0 0 0 1 0 0 2 0 0  7 2  8 4 2 2 3 2 52 3 1 0
        . 0 0 0 0 0 0 1 0 0 2 1 0  8 2  8 4 3 2 3 3 53 3 1 0
        . 0 0 1 0 0 0 0 0 0 5 0 0  1 1  6 3 2 6 4 2 79 3 0 0
        . 0 0 1 0 0 0 0 0 0 5 0 0 10 1  6 3 2 6 4 2 80 3 0 0
        . 0 0 1 0 0 0 0 0 0 4 0 0  9 1  7 3 2 6 3 2 81 3 0 0
        . 0 0 1 0 0 0 0 0 0 2 0 0  7 2  7 3 3 6 3 2 82 3 0 0
        . 1 0 0 0 0 0 0 0 0 2 0 0  9 2  . . 3 . . 2 67 3 1 0
        0 0 0 0 0 1 0 0 0 0 2 0 0  7 3  6 4 3 5 3 3 66 2 0 0
        0 0 0 0 0 1 0 0 0 0 2 0 0  6 2  6 4 3 5 4 3 67 2 0 0
        . 0 0 0 0 1 0 0 0 0 2 0 0  7 3  6 4 3 5 4 3 68 2 0 0
        0 0 0 0 0 1 0 0 0 0 2 0 0  4 3  6 4 3 5 4 3 69 2 0 0
        0 0 0 0 0 0 0 1 0 0 2 0 0  7 3  6 3 3 5 4 3 61 2 1 0
        0 0 0 0 0 0 0 1 0 0 2 0 0  5 3  6 4 2 5 4 3 62 2 1 0
        . 0 0 0 0 0 0 1 0 0 2 0 0  5 3  5 4 3 5 4 3 63 2 1 0
        0 0 0 0 0 0 0 1 0 0 2 0 0  0 3  5 4 2 5 4 2 64 2 1 0
        . 0 0 1 0 0 0 0 0 0 3 0 0  8 2  5 2 3 5 3 2 68 2 0 0
        0 0 0 1 0 0 0 0 0 0 3 0 0  8 2  5 2 3 5 3 2 69 2 0 0
        . 0 0 1 0 0 0 0 0 0 3 0 0  8 3  6 4 2 5 2 2 70 2 0 0
        end
        label values subjIn plh0176
        label def plh0176 0 "[0] 0 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 1 "[1] 1 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 2 "[2] 2 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 3 "[3] 3 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 4 "[4] 4 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 5 "[5] 5 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 6 "[6] 6 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 7 "[7] 7 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 8 "[8] 8 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 9 "[9] 9 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label def plh0176 10 "[10] 10 Zufrieden: Skala 0-Niedrig bis 10-Hoch", modify
        label values migra zuwand
        label def zuwand 1 "[1] Keine Sorgen", modify
        label def zuwand 2 "[2] Einige Sorgen", modify
        label def zuwand 3 "[3] Grosse Sorgen", modify
        label values auth ruhe
        label def ruhe 1 "[1] Wichtigkeit: an 4. Stelle", modify
        label def ruhe 2 "[2] Wichtigkeit: an 3. Stelle", modify
        label def ruhe 3 "[3] Wichtigkeit: an 2. Stelle", modify
        label def ruhe 4 "[4] Wichtigkeit: an 1. Stelle", modify
        label values crime krimi
        label def krimi 1 "[1] Keine Sorgen", modify
        label def krimi 2 "[2] Einige Sorgen", modify
        label def krimi 3 "[3] Grosse Sorgen", modify
        label values polst plh0004
        label def plh0004 2 "[2] 2", modify
        label def plh0004 3 "[3] 3", modify
        label def plh0004 4 "[4] 4", modify
        label def plh0004 5 "[5] 5", modify
        label def plh0004 6 "[6] 6", modify
        label def plh0004 7 "[7] 7", modify
        label def plh0004 8 "[8] 8", modify
        label def plh0004 9 "[9] 9", modify
        label values civi beinf
        label def beinf 1 "[1] Wichtigkeit: an 4. Stelle", modify
        label def beinf 2 "[2] Wichtigkeit: an 3. Stelle", modify
        label def beinf 3 "[3] Wichtigkeit: an 2. Stelle", modify
        label def beinf 4 "[4] Wichtigkeit: an 1. Stelle", modify
        label values clima klima
        label def klima 1 "[1] Keine Sorgen", modify
        label def klima 2 "[2] Einige Sorgen", modify
        label def klima 3 "[3] Grosse Sorgen", modify
        label values educ bild
        label def bild 2 "[2] Haupt/Volksschule", modify
        label def bild 3 "[3] Realschulabschluss", modify
        label def bild 4 "[4] (Fach-)Abitur", modify
        label values sex sex
        label def sex 0 "Männer", modify
        label def sex 1 "Frauen", modify
        label values region region
        label def region 0 "Deutschland | Westen", modify
        label def region 1 "Deutschland | Osten", modify


        • #5
          Christoph:
          you may want to take a look at:
          -help j_robustsingular-
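          As that entry explains, a cluster-robust VCE has rank at most equal to the number of clusters, so with only 4 clusters the joint Wald test for a model with dozens of parameters cannot be computed and Stata shows ".". A minimal sketch of how to check this, using a deliberately reduced model rather than the full specification above:
          Code:
          * after a clustered logit, compare the rank of the VCE with the
          * number of clusters; the overall Wald test needs rank >= e(df_m)
          logit partyx c.age i.sex i.region, vce(cluster syear)
          display e(N_clust)   // number of clusters (4 here)
          display e(rank)      // rank of e(V), at most the number of clusters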
          Kind regards,
          Carlo
          (Stata 19.0)



          • #6
            Thanks a lot, Carlo! That already helped.

            Just one additional question: is there any guideline for choosing the cluster variable? I chose it more or less intuitively, which is clearly not an appropriate way. (My panel data has 4 time periods, so I expected 4 clusters...) It would be really nice if you or someone else could recommend some reading.

            Best, Chris



            • #7
              Christoph:
              usually, standard errors are clustered on -panelid- (or, sometimes, on other predictors, if feasible) rather than on the time variable.
              The time variable can usually be included on the right-hand side of your regression equation as a categorical predictor:
              Code:
              i.year
              That said, and underlining that I would not cluster on the time variable, 4 clusters are absolutely not enough for clustered standard errors to perform better than their default counterparts.
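              Following that advice, a minimal sketch, assuming the data contain a panel identifier (called pid here purely for illustration; the actual variable name will differ):
              Code:
              * cluster on the panel unit rather than the 4-valued time
              * variable, and enter the survey year as a categorical predictor
              logit partyx c.age i.sex i.region i.syear, vce(cluster pid) or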
              One of the paramount sources on this topic is: http://cameron.econ.ucdavis.edu/rese...5_February.pdf.
              Kind regards,
              Carlo
              (Stata 19.0)
