  • #16
    Also, here is an example using clogit. Dummy variable coding and effect coding produce identical fits. Again, my guess is that Cordula is making a mistake somewhere (easy enough to do if you have 40 interactions). If you really, really want effect coding, then you may want to download the xi3 command.

    Code:
    . webuse lowbirth2, clear
    (Applied Logistic Regression, Hosmer & Lemeshow)
    
    . clogit low lwt  ptd ht ui i.smoke##i.race, group(pairid) nolog
    
    Conditional (fixed-effects) logistic regression
    
                                                    Number of obs     =        112
                                                    LR chi2(9)        =      26.34
                                                    Prob > chi2       =     0.0018
    Log likelihood = -25.643997                     Pseudo R2         =     0.3393
    
    ------------------------------------------------------------------------------
             low |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
             lwt |   -.017926   .0101855    -1.76   0.078    -.0378892    .0020371
             ptd |   1.797776    .815141     2.21   0.027     .2001289    3.395423
              ht |   2.228638   1.149753     1.94   0.053    -.0248361    4.482111
              ui |   1.370714   .6920241     1.98   0.048     .0143716    2.727056
         1.smoke |   1.917216   1.168335     1.64   0.101    -.3726783     4.20711
                 |
            race |
          black  |   1.144002   1.263177     0.91   0.365    -1.331779    3.619783
          other  |   .2858877   .9932426     0.29   0.773    -1.660832    2.232607
                 |
      smoke#race |
        1#black  |  -.9362929   1.745752    -0.54   0.592    -4.357905    2.485319
        1#other  |   -.760635   2.037016    -0.37   0.709    -4.753113    3.231843
    ------------------------------------------------------------------------------
    
    . xi3: clogit low lwt  ptd ht ui e.smoke*e.race, group(pairid) nolog
    e.smoke           _Ismoke_0-1         (naturally coded; _Ismoke_0 omitted)
    e.race            _Irace_1-3          (naturally coded; _Irace_1 omitted)
    
    Conditional (fixed-effects) logistic regression
    
                                                    Number of obs     =        112
                                                    LR chi2(9)        =      26.34
                                                    Prob > chi2       =     0.0018
    Log likelihood = -25.643997                     Pseudo R2         =     0.3393
    
    ------------------------------------------------------------------------------
             low |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
    -------------+----------------------------------------------------------------
             lwt |   -.017926   .0101855    -1.76   0.078    -.0378892    .0020371
             ptd |   1.797776    .815141     2.21   0.027     .2001289    3.395423
              ht |   2.228638   1.149753     1.94   0.053    -.0248361    4.482111
              ui |   1.370714   .6920241     1.98   0.048     .0143716    2.727056
       _Ismoke_1 |   .6757866   .3292875     2.05   0.040      .030395    1.321178
        _Irace_2 |   .4820471   .5072945     0.95   0.342    -.5122319    1.476326
        _Irace_3 |  -.2882385   .5479771    -0.53   0.599    -1.362254    .7857769
       _Ism1Xra2 |  -.1853251   .4913708    -0.38   0.706    -1.148394     .777744
       _Ism1Xra3 |  -.0974962   .5772872    -0.17   0.866    -1.228958    1.033966
    ------------------------------------------------------------------------------
    
    .
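    Not Stata, but the underlying algebra is easy to check in a few lines of Python (a minimal sketch with invented data, using plain least squares rather than clogit; the column-space argument is the same): the dummy-coded and effect-coded design matrices span the same space, so the fitted values are identical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
group = rng.integers(0, 3, n)          # a 3-level factor, levels 0/1/2
y = 1.0 + 0.5 * (group == 1) - 0.7 * (group == 2) + rng.normal(0, 0.3, n)

# Dummy coding: level 0 is the reference category
Xd = np.column_stack([np.ones(n), group == 1, group == 2]).astype(float)

# Effect coding: reference-level rows get -1 in both columns
e1 = np.where(group == 0, -1.0, (group == 1).astype(float))
e2 = np.where(group == 0, -1.0, (group == 2).astype(float))
Xe = np.column_stack([np.ones(n), e1, e2])

fit = lambda X: X @ np.linalg.lstsq(X, y, rcond=None)[0]
print(np.allclose(fit(Xd), fit(Xe)))   # True: identical fitted values
```

    The individual coefficients differ between the two fits (they answer different contrast questions), but the model itself is the same.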
    -------------------------------------------
    Richard Williams, Notre Dame Dept of Sociology
    StataNow Version: 19.5 MP (2 processor)

    EMAIL: [email protected]
    WWW: https://www3.nd.edu/~rwilliam



    • #17
      Dear Richard and William,

      thanks a lot for your helpful input!

      I will try this out now and will give feedback on my results.
      I just installed xi3 and will work with that.
      I also tried -fvvarlist- as in your last example, but it does not seem to work with effect coding. I get the following error message: "factor variables may not contain negative values".
      xi3 does not seem to have problems in this regard.



      • #18
        Just to supplement here: the paper you cite, Kugler et al. (2012), analyzes effect-coded data from studies that followed a fractional factorial design. The paper assumes that when a study adopts a fractional factorial design, effect coding lets you estimate the main fixed effects (equivalent to ANOVA fixed effects), given that they are 'aliased' with second-order effects, and that any two-way or three-way interactions are aliased with higher-order effects. The estimable effects are determined at the study design phase. If your study is not a fractional factorial design, this may not be a suitable paper to follow; as William pointed out above, multiplying two effect codes can be flawed if you are not careful about what you are doing. For example, observations coded -1 and -1 will get a +1 when multiplied, which is not true in your dataset.
        Roman
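        Roman's -1 × -1 pitfall is easy to see with a toy illustration (Python, invented codes): the product of two effect-coded columns assigns the same interaction code to "both at the low level" as to "both at the high level".

```python
# Two effect-coded (-1/+1) factors over the four possible combinations
a = [-1, -1, 1, 1]
b = [-1, 1, -1, 1]

# Naive multiplication: (-1) * (-1) = (+1) * (+1) = +1
product = [x * y for x, y in zip(a, b)]
print(product)  # [1, -1, -1, 1]: the first and last rows get the same code
```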



        • #19
          Dear Roman,
          thanks a lot for your comment!
          It is indeed a fractional factorial design, but I did not design it, so I am not sure whether this was considered in the design phase.



          • #20
            So here are my results from using xi3 with dummy coding, which match the results from my previous approach of multiplying the variables with each other.
            The second results block shows the results for effect coding, which do not match the results from the multiplication. Moreover, some interaction variables are omitted.
            Code:
            Conditional (fixed-effects) logistic regression   Number of obs   =      15600
                                                              Wald chi2(38)   =     713.66
                                                              Prob > chi2     =     0.0000
            Log pseudolikelihood = -4627.5266                 Pseudo R2       =     0.1900
            
                                                 (Std. Err. adjusted for 325 clusters in ID)
            --------------------------------------------------------------------------------
                           |               Robust
              Choice_Dummy |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
            ---------------+----------------------------------------------------------------
                 optoutvar |   .9169907   .1529449     6.00   0.000     .6172242    1.216757
                 Pangasius |   .1841965   .0660292     2.79   0.005     .0547815    .3136114
                Bangladesh |  -.1587668   .0730642    -2.17   0.030      -.30197   -.0155636
                   Germany |    .747786   .0853922     8.76   0.000     .5804204    .9151516
                       ASC |   1.004653   .4099213     2.45   0.014     .2012218    1.808084
                 Naturland |   .6169105   .4266317     1.45   0.148    -.2192724    1.453093
                     Brand |   .0261319   .0428466     0.61   0.542    -.0578459    .1101097
                 FairLabel |   .3201833   .3811105     0.84   0.401    -.4267796    1.067146
                     Price |  -.1464912   .0131636   -11.13   0.000    -.1722913    -.120691
                  CETSCALE |          0  (omitted)
               _IGermany_1 |          0  (omitted)
                  _ICEXGe1 |  -.0197657   .0902066    -0.22   0.827    -.1965673    .1570359
             _IBanglades_1 |          0  (omitted)
                  _ICEXBa1 |  -.0777312   .0860788    -0.90   0.367    -.2464424    .0909801
                     CIBV1 |          0  (omitted)
                  _ICIXBa1 |   .0455831   .0788603     0.58   0.563    -.1089801    .2001464
                     CIBV2 |          0  (omitted)
                 _I1CIXBa1 |   .5792647   .0783303     7.40   0.000     .4257402    .7327893
                CI_Germany |          0  (omitted)
                  _ICIXGe1 |   .5842659   .0932755     6.26   0.000     .4014493    .7670826
              sust_concern |          0  (omitted)
                   _IASC_1 |          0  (omitted)
                  _IsuXAS1 |  -.1610985   .0759576    -2.12   0.034    -.3099727   -.0122243
             _INaturland_1 |          0  (omitted)
                  _IsuXNa1 |    -.17425   .0667061    -2.61   0.009    -.3049917   -.0435084
             _IFairLabel_1 |          0  (omitted)
                  _IsuXFa1 |  -.0381624   .0558967    -0.68   0.495     -.147718    .0713932
            Naturland_know |          0  (omitted)
                  _INaXNa1 |   .5511232    .167491     3.29   0.001      .222847    .8793995
             ASC_knowledge |          0  (omitted)
                  _IASXAS1 |   .0594404   .1492938     0.40   0.691      -.23317    .3520508
                 Fair_know |          0  (omitted)
                  _IFaXFa1 |   .2782418   .2925279     0.95   0.342    -.2951024    .8515859
                       age |          0  (omitted)
                  _IagXAS1 |  -.0134218   .0064565    -2.08   0.038    -.0260763   -.0007674
                  _IagXNa1 |   -.011124   .0063063    -1.76   0.078    -.0234842    .0012361
                  _IagXFa1 |  -.0019994   .0045573    -0.44   0.661    -.0109316    .0069328
                    female |          0  (omitted)
                  _IfeXAS1 |   .1067799   .1415325     0.75   0.451    -.1706187    .3841785
                  _IfeXNa1 |  -.0493308   .1458635    -0.34   0.735     -.335218    .2365565
                  _IfeXFa1 |    .000975   .1162149     0.01   0.993    -.2268021    .2287521
                    degree |          0  (omitted)
                  _IdeXAS1 |  -.2736318   .1462077    -1.87   0.061    -.5601936    .0129299
                  _IdeXNa1 |  -.1422414   .1402038    -1.01   0.310    -.4170358    .1325529
                  _IdeXFa1 |  -.1746861   .1205484    -1.45   0.147    -.4109567    .0615844
               high_income |          0  (omitted)
                  _IhiXAS1 |   .6154653   .2308042     2.67   0.008     .1630974    1.067833
                  _IhiXNa1 |   .4486477   .2539027     1.77   0.077    -.0489924    .9462878
                  _IhiXFa1 |   .1074731   .1618008     0.66   0.507    -.2096506    .4245967
             medium_income |          0  (omitted)
                  _ImeXAS1 |   .3184165   .2249913     1.42   0.157    -.1225584    .7593915
                  _ImeXNa1 |   .4487657   .2549927     1.76   0.078    -.0510108    .9485423
                  _ImeXFa1 |  -.1290489    .159808    -0.81   0.419    -.4422668    .1841691
                low_income |          0  (omitted)
                  _IloXAS1 |   .3187577   .2574398     1.24   0.216     -.185815    .8233305
                  _IloXNa1 |   .2676172   .2824475     0.95   0.343    -.2859698    .8212042
                  _IloXFa1 |   .0226336   .1916792     0.12   0.906    -.3530507    .3983178
            --------------------------------------------------------------------------------
            Code:
            Conditional (fixed-effects) logistic regression   Number of obs   =      15600
                                                              Wald chi2(52)   =     794.43
                                                              Prob > chi2     =     0.0000
            Log pseudolikelihood = -4535.4306                 Pseudo R2       =     0.2061
            
                                                 (Std. Err. adjusted for 325 clusters in ID)
            --------------------------------------------------------------------------------
                           |               Robust
              Choice_Dummy |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
            ---------------+----------------------------------------------------------------
                 optoutvar |  -.6943901   .3757697    -1.85   0.065    -1.430885     .042105
                 Pangasius |   .0912928   .0333422     2.74   0.006     .0259432    .1566424
                Bangladesh |  -.3400772    .044947    -7.57   0.000    -.4281716   -.2519828
                   Germany |   .5762999   .0520594    11.07   0.000     .4742653    .6783345
                       ASC |     .41867   .1681387     2.49   0.013     .0891241    .7482158
                 Naturland |    .250001   .1457855     1.71   0.086    -.0357335    .5357354
                     Brand |   .0132831   .0216861     0.61   0.540    -.0292209    .0557871
                 FairLabel |   .0648911   .1067234     0.61   0.543    -.1442828    .2740651
                     Price |  -.1470447   .0132501   -11.10   0.000    -.1730143    -.121075
                  CETSCALE |          0  (omitted)
               _IGermany_2 |          0  (omitted)
               _IGermany_3 |          0  (omitted)
                  _ICEXGe2 |  -.0528318   .0461784    -1.14   0.253    -.1433398    .0376763
                  _ICEXGe3 |   -.066348   .0930468    -0.71   0.476    -.2487163    .1160203
             _IBanglades_2 |          0  (omitted)
             _IBanglades_3 |          0  (omitted)
                  _ICEXBa2 |   .0730134   .1004248     0.73   0.467    -.1238157    .2698424
                  _ICEXBa3 |          0  (omitted)
                     CIBV1 |          0  (omitted)
                  _ICIXBa2 |  -.0400594   .0532513    -0.75   0.452      -.14443    .0643112
                  _ICIXBa3 |   .0059347   .0429349     0.14   0.890    -.0782161    .0900856
                     CIBV2 |          0  (omitted)
                 _I1CIXBa2 |  -.3585839   .0564153    -6.36   0.000    -.4691558   -.2480119
                 _I1CIXBa3 |   .2798877   .0440544     6.35   0.000     .1935427    .3662327
                CI_Germany |          0  (omitted)
                  _ICIXGe2 |  -.3628431   .0612579    -5.92   0.000    -.4829062   -.2427799
                  _ICIXGe3 |   .2691362   .0559045     4.81   0.000     .1595654    .3787071
              sust_concern |          0  (omitted)
                   _IASC_2 |          0  (omitted)
                   _IASC_3 |          0  (omitted)
                  _IsuXAS2 |  -.0208589   .0374398    -0.56   0.577    -.0942395    .0525217
                  _IsuXAS3 |  -.1459618   .1298444    -1.12   0.261    -.4004521    .1085286
             _INaturland_2 |          0  (omitted)
             _INaturland_3 |          0  (omitted)
                  _IsuXNa2 |   .1268654   .1229295     1.03   0.302     -.114072    .3678029
                  _IsuXNa3 |          0  (omitted)
             _IFairLabel_2 |          0  (omitted)
             _IFairLabel_3 |          0  (omitted)
                  _IsuXFa2 |   .0403018   .0495544     0.81   0.416     -.056823    .1374267
                  _IsuXFa3 |          0  (omitted)
            Naturland_know |          0  (omitted)
                  _INaXNa2 |  -.0540129   .0535044    -1.01   0.313    -.1588796    .0508538
                  _INaXNa3 |   .1848831   .0487736     3.79   0.000     .0892886    .2804776
             ASC_knowledge |          0  (omitted)
                  _IASXAS2 |   .0608079    .048589     1.25   0.211    -.0344247    .1560405
                  _IASXAS3 |   .0602842   .0405792     1.49   0.137    -.0192495    .1398179
                 Fair_know |          0  (omitted)
                  _IFaXFa2 |  -.0225302   .1124191    -0.20   0.841    -.2428676    .1978073
                  _IFaXFa3 |    .083848   .0634799     1.32   0.187    -.0405703    .2082663
                       age |          0  (omitted)
                  _IagXAS2 |   .0002514   .0033672     0.07   0.940    -.0063483     .006851
                  _IagXAS3 |  -.0076856   .0103092    -0.75   0.456    -.0278912    .0125201
                  _IagXNa2 |   .0064819   .0097489     0.66   0.506    -.0126257    .0255895
                  _IagXNa3 |          0  (omitted)
                  _IagXFa2 |   .0037477   .0044094     0.85   0.395    -.0048945      .01239
                  _IagXFa3 |          0  (omitted)
                    female |          0  (omitted)
                  _IfeXAS2 |  -.0232481   .0421483    -0.55   0.581    -.1058573    .0593611
                  _IfeXAS3 |   .0583612   .1287663     0.45   0.650     -.194016    .3107385
                  _IfeXNa2 |  -.0005147   .1258322    -0.00   0.997    -.2471413    .2461118
                  _IfeXNa3 |          0  (omitted)
                  _IfeXFa2 |   .0352629   .0541949     0.65   0.515    -.0709571    .1414829
                  _IfeXFa3 |          0  (omitted)
                    degree |          0  (omitted)
                  _IdeXAS2 |   .0554175   .0411081     1.35   0.178     -.025153    .1359879
                  _IdeXAS3 |  -.1262618   .1300025    -0.97   0.331     -.381062    .1285385
                  _IdeXNa2 |   .1208431   .1253005     0.96   0.335    -.1247413    .3664275
                  _IdeXNa3 |          0  (omitted)
                  _IdeXFa2 |   .0197423   .0583005     0.34   0.735    -.0945245    .1340091
                  _IdeXFa3 |          0  (omitted)
               high_income |          0  (omitted)
                  _IhiXAS2 |   .0286063   .0649587     0.44   0.660    -.0987105     .155923
                  _IhiXAS3 |   .3606554   .1963841     1.84   0.066    -.0242504    .7455611
                  _IhiXNa2 |  -.2428902   .1965705    -1.24   0.217    -.6281612    .1423808
                  _IhiXNa3 |          0  (omitted)
                  _IhiXFa2 |    .020506   .0718198     0.29   0.775    -.1202584    .1612703
                  _IhiXFa3 |          0  (omitted)
             medium_income |          0  (omitted)
                  _ImeXAS2 |   .0983636    .071469     1.38   0.169    -.0417132    .2384403
                  _ImeXAS3 |   .0859899   .1826656     0.47   0.638    -.2720281    .4440078
                  _ImeXNa2 |  -.0425245   .1862755    -0.23   0.819    -.4076177    .3225687
                  _ImeXNa3 |          0  (omitted)
                  _ImeXFa2 |  -.0709475   .0703346    -1.01   0.313    -.2088008    .0669058
                  _ImeXFa3 |          0  (omitted)
                low_income |          0  (omitted)
                  _IloXAS2 |   .0676129   .0751598     0.90   0.368    -.0796976    .2149234
                  _IloXAS3 |   .2470005    .215953     1.14   0.253    -.1762596    .6702606
                  _IloXNa2 |  -.1414138   .2159974    -0.65   0.513     -.564761    .2819333
                  _IloXNa3 |          0  (omitted)
                  _IloXFa2 |   .0399832   .0832019     0.48   0.631    -.1230894    .2030559
                  _IloXFa3 |          0  (omitted)
            --------------------------------------------------------------------------------



            • #21
              Cordula, you should show the commands as well. The 2nd model has 14 more degrees of freedom but without seeing the commands it is hard to know why.


              • #22
                Also, looking at your code, it appears you tried to enter some variables twice, e.g. you have Germany and _IGermany_1. You don't show your code, so I don't know exactly what you did, but I suspect you have coding problems.


                • #23
                  Yes, I did not want those variables to be omitted (which is what happens if I do not enter them separately in addition to the interactions).
                  I input the following:

                  Code:
                  xi3: clogit Choice_Dummy optoutvar Pangasius Bangladesh Germany ASC Naturland Brand FairLabel Price $hypvarsxi3, group(Case) vce(cluster ID)
                  with
                  Code:
                  ***xi3 interaction variables (hypotheses only)
                  #delimit ;
                  global hypvarsxi3
                  CETSCALE*e.Germany
                  CETSCALE*e.Bangladesh
                  CIBV1*e.Bangladesh
                  CIBV2*e.Bangladesh
                  CI_Germany*e.Germany
                  sust_concern*e.ASC
                  sust_concern*e.Naturland
                  sust_concern*e.FairLabel
                  e.Naturland_know*e.Naturland
                  e.ASC_knowledge*e.ASC
                  e.Fair_know*e.FairLabel
                  age*e.ASC
                  age*e.Naturland
                  age*e.FairLabel
                  e.female*e.ASC
                  e.female*e.Naturland
                  e.female*e.FairLabel
                  e.degree*e.ASC
                  e.degree*e.Naturland
                  e.degree*e.FairLabel
                  e.high_income*e.ASC
                  e.high_income*e.Naturland
                  e.high_income*e.FairLabel
                  e.medium_income*e.ASC
                  e.medium_income*e.Naturland
                  e.medium_income*e.FairLabel
                  e.low_income*e.ASC
                  e.low_income*e.Naturland
                  e.low_income*e.FairLabel
                  ;
                  This is for the second example.
                  (For the first output, I did the same but replaced the -e- prefix with -i- and used the dummy-coded versions of the variables.)



                  • #24
                    I suspect you are getting a mix of dummy coded variables with effect coded variables, e.g. Germany is coded 0/1, but then e.Germany gets coded -1/1. I'm not sure what the impact of that is, but I imagine you should be consistent. I would make the first part of the command use e.Germany, e.Bangladesh, etc.

                    I am also guessing that a lot of these are mutually exclusive, e.g. if you are a 1 on Germany you must be a 0 on Bangladesh; if you are a 1 on high_income you must be a 0 on medium and low income. If so, I would create and use variables like Country, where 1 = Germany, 2 = Bangladesh, etc., and Income, where 1 = low_income, 2 = medium_income, etc.

                    Anyway, try the first suggestion first, i.e. put e. in front of every categorical variable. That may take care of the inconsistencies you are seeing, or at least cut down on all those 0 (omitted) messages.
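                    The "collapse mutually exclusive dummies into one categorical" step can also be sketched outside Stata; here is a hypothetical Python illustration (the variable names mirror the thread, the data are invented):

```python
import numpy as np

germany    = np.array([1, 0, 0, 1, 0])
bangladesh = np.array([0, 1, 0, 0, 0])
# rows that are neither belong to the remaining base country (here: Vietnam)

# Country: 1 = Germany, 2 = Bangladesh, 3 = Vietnam
country = np.where(germany == 1, 1, np.where(bangladesh == 1, 2, 3))
print(country.tolist())  # [1, 2, 3, 1, 3]
```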
                    Last edited by Richard Williams; 26 Mar 2016, 10:15.


                    • #25
                      Also put i. in front of the categorical variables in the dummy variable coding. It could be that is the one that is wrong.


                      • #26
                        Thanks for your suggestions!
                        I think I did the -i- and -e- thing correctly (as I wrote at the bottom of my post above).

                        For the other point: I did not know that it is possible to use, for example, a country variable that includes a different level for each country.
                        I thought I had to use dummy variables for each level of a categorical variable.
                        Does this apply only to models where I look only at the main effects of variables?



                        • #27
                          You said your code was
                          Code:
                          xi3: clogit Choice_Dummy optoutvar Pangasius Bangladesh Germany ASC Naturland Brand FairLabel Price $hypvarsxi3, group(Case) vce(cluster ID)
                          You ought to be using things like i.Germany or e.Germany. The way you are doing it creates the possibility that dummy variable coding is used in some parts while effect coding is used in other parts.

                          How are your variables coded now? If you are already using effect coding (e.g. some vars are coded -1, 0, 1) then things may get screwed up as you try to effect code a variable that is already effect coded.

                          Yes, i. or e. create the dummies for you. So, i.Country would create, say, 4 dummies if Country had 5 possible values. Or else e.Country would create 4 effect coded variables. This should also generate the correct number of interactions. If you created a bunch of effect coded variables, I would try to go back to the earlier variables and use them. Stata is less likely to make mistakes than you are.

                          I'll also go back to the advice from way back when -- I am not sure this effect coding is worth it. I bet you could get what you want from using factor variable coding and margins. But, if you are going to use effect coding, you have to make sure you are doing it right.
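                          The expansion Richard describes can be sketched in Python (an assumed, simplified model of what the i. and e. prefixes do, not Stata's actual implementation): a k-level categorical becomes k-1 dummy columns, or k-1 effect-coded columns in which the base level is coded -1 throughout.

```python
import numpy as np

def expand(cat, base, effect=False):
    """k-1 indicator columns for a categorical variable; effect=True recodes
    the base-level rows from all-zeros to all -1 (effect coding)."""
    levels = [l for l in sorted(set(cat)) if l != base]
    cols = np.array([[1.0 if c == l else 0.0 for l in levels] for c in cat])
    if effect:
        cols[np.array(cat) == base] = -1.0
    return cols

country = [1, 2, 3, 4, 5, 1]                     # 5 levels
print(expand(country, base=1).shape[1])          # 4 dummy columns
print(expand(country, base=1, effect=True)[0])   # base-level row: all -1
```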


                          • #28
                            Yes, the variables I assigned an -e- were already effect coded (and the variables I assigned an -i- were dummy coded; I worked with two datasets accordingly). I thought this had to be the case.
                            That was a lack of understanding on my side: I thought the declaration had to fit the coding; I did not see that the prefix actually does the coding for me.

                            I will give it another try now as you suggested, taking the original variables and then running xi3 again.
                            I would also prefer dummy coding.

                            There are two reasons why I thought effect coding might be better for my case:
                            First, I would like to be able to interpret the alternative-specific constant of "opting out". (Background information: I did a discrete choice experiment in which participants could either select one of two product alternatives or decide to choose nothing at all. As a lot of people chose this "opt-out option", I would like to evaluate it as a separate attribute, and dummy coding might be misleading in these situations, as described by Bech & Gyrd-Hansen (2005), "Effects coding in discrete choice experiments".)
                            The other reason is that I would like to be able to evaluate the coefficients (and later the willingness-to-pay) not relative to the base alternative but relative to a zero alternative.
                            But I am not sure whether my understanding is correct here. For example, if my base alternative for the country is Vietnam and I use dummy coding, then my coefficients for Bangladesh and Germany would be evaluated against Vietnam, and not more generally.
                            So I thought that if I take dummy coding and estimate the WTP for the different product attributes, then the WTP for Bangladesh would mean the WTP for switching from Vietnam to Bangladesh.
                            And I thought that if I take effect coding, the WTP for Bangladesh would be a more absolute WTP, compared to a zero mean (receiving nothing).
                            But I don't know if I got that wrong. Do you have an idea on that?
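                            For what it's worth, the coefficient side of this question can be sketched numerically (Python, with invented coefficients; this is only the reparameterization algebra, not a claim about which WTP interpretation is right): for one categorical attribute, each effect-coded coefficient equals the dummy coefficient minus the mean over all levels (with the base level's dummy coefficient taken as 0), so effect-coded contrasts are against the "average country" rather than against Vietnam.

```python
# Invented dummy coefficients for Country, base level = Vietnam (coef 0)
b_vietnam, b_bangladesh, b_germany = 0.0, -0.34, 0.58
mean_b = (b_vietnam + b_bangladesh + b_germany) / 3

e_vietnam = b_vietnam - mean_b        # recovered for the omitted level
e_bangladesh = b_bangladesh - mean_b
e_germany = b_germany - mean_b

# Effect-coded coefficients sum to zero across the levels
print(round(e_vietnam, 2), round(e_bangladesh, 2), round(e_germany, 2))
```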



                            • #29
                              Ok, that probably explains a lot, including why the effect coding was estimating so many more parameters. Effect coding a variable that is already coded -1, 0, 1 will create two variables from the original one.

                              Use the data set where vars were dummy coded. If you use the effect coding commands, I suspect you'll find that effect coding and dummy variable coding produce identical results.

                              Better yet, if it exists, go back to the data set that had Country and Income as categorical vars and use them.

                              I don't use effect coding enough to have an opinion on its merits. I do know that commands like margins, contrast, and pwcompare let me do a lot of things and I doubt that effect coding could tell me much more. But get the effect coding right and then you can experiment from there.


                              • #30
                                Now that is unfortunately not working out as planned, as I have some interactions that don't fit together (e.g. a variable I only want to interact with Bangladesh, but not with all the countries, because that doesn't make sense).
                                I think I will keep the dummy solution I produced before, which led to the same results as the "multiplication" solution. Hopefully this is not completely wrong. I am a bit confused about all that and hope I will manage to interpret the results.

