
  • Margins in Latent Class Analysis (LCA) of a Discrete Choice Experiment (DCE) using GLLAMM

    Hi, I have a DCE dataset in which respondents chose eight times between two options. Each option had six attributes, each with three levels. Five of the six attributes are effects-coded and the last one is dummy-coded. I ran lclogit with four classes, which gives the following output:

    Code:
    lclogit choice ch1_eff1 ch1_eff2 ch2_eff1 ch2_eff2 ch3_eff1 ch3_eff2 ch4b_eff1 ch4b_eff2 ch5c_eff1 ch5c_eff2 ch6_priv_dummy ch6_pub_dummy, id(id) group(cc2) nclasses(4)
    Code:
    -------------------------------------------------
        Variable |  Class1   Class2   Class3   Class4
    -------------+-----------------------------------
        ch1_eff1 |  -1.219   -0.828   -0.536   -0.004
        ch1_eff2 |   0.183    1.028    0.065   -0.092
        ch2_eff1 |  -2.243   -0.673   -2.235    0.072
        ch2_eff2 |   2.131    0.310    1.953   -0.204
        ch3_eff1 |  -2.576   -1.006   -0.959    0.285
        ch3_eff2 |   2.840    0.635    0.471    0.330
       ch4b_eff1 |  -0.229   -0.466   -0.653   -0.016
       ch4b_eff2 |   1.043    0.237    0.500   -0.154
       ch5c_eff1 |  -1.946    0.236   -0.548   -0.918
       ch5c_eff2 |   1.419   -0.135    0.534   -0.163
    ch6_priv_d~y |  -1.562   -0.044   -0.087   -0.410
    ch6_pub_du~y |   0.507   -1.350   -0.251    0.813
    -------------+-----------------------------------
     Class Share |   0.325    0.380    0.179    0.115
    -------------------------------------------------
    I then ran lclogitml, which uses these estimates as starting values for gllamm and gives the following results for class 1:

    Code:
    --------------------------------------------------------------------------------
            choice | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
    ---------------+----------------------------------------------------------------
    choice1        |
          ch1_eff1 |   -1.21861   .2341479    -5.20   0.000    -1.677531   -.7596882
          ch1_eff2 |   .1832757   .0839117     2.18   0.029     .0188117    .3477396
          ch2_eff1 |  -2.242841   .3223866    -6.96   0.000    -2.874707   -1.610975
          ch2_eff2 |    2.13119   .3397402     6.27   0.000     1.465311    2.797068
          ch3_eff1 |  -2.575844   .2938692    -8.77   0.000    -3.151817   -1.999871
          ch3_eff2 |   2.839723   .3385522     8.39   0.000     2.176173    3.503273
         ch4b_eff1 |  -.2288113   .1535203    -1.49   0.136    -.5297055     .072083
         ch4b_eff2 |   1.043092   .2627628     3.97   0.000     .5280866    1.558098
         ch5c_eff1 |  -1.945933   .3752163    -5.19   0.000    -2.681344   -1.210523
         ch5c_eff2 |   1.418596   .3618886     3.92   0.000     .7093076    2.127885
    ch6_priv_dummy |  -1.562217    .212733    -7.34   0.000    -1.979166   -1.145268
     ch6_pub_dummy |   .5069582   .1806134     2.81   0.005     .1529624     .860954
    ---------------+----------------------------------------------------------------

    Now what I want is simply to obtain marginal effects / predicted probabilities for these coefficients, the kind of thing I would get in simpler models such as logit via "margins, dydx(*)".
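    For concreteness, this is the kind of call I have in mind; a plain logit on the same variables, ignoring the choice-set structure, purely to illustrate the margins syntax:

    Code:
    * illustrative only: ordinary logit ignoring the panel/choice-set structure
    logit choice ch1_eff1 ch1_eff2 ch2_eff1 ch2_eff2 ch3_eff1 ch3_eff2 ///
        ch4b_eff1 ch4b_eff2 ch5c_eff1 ch5c_eff2 ch6_priv_dummy ch6_pub_dummy
    margins, dydx(*)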

    I passed the results to gllamm via "lclogitml, switch" and tried various things with "gllapred" and "gllasim", but I cannot figure out how to do this seemingly simple thing.
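    In case it clarifies what I am after, here is a hand-rolled sketch of class-1 choice probabilities computed directly from the conditional-logit formula. It assumes the class-1 coefficients are stored in e(b) under the equation name choice1 after lclogitml (as the output above suggests), which I have not verified, so please treat it as illustrative only:

    Code:
    * sketch: class-1 linear index built from the stored coefficients
    * (equation name choice1 is an assumption taken from the output above)
    gen double xb1 =                                                        ///
          _b[choice1:ch1_eff1]*ch1_eff1   + _b[choice1:ch1_eff2]*ch1_eff2   ///
        + _b[choice1:ch2_eff1]*ch2_eff1   + _b[choice1:ch2_eff2]*ch2_eff2   ///
        + _b[choice1:ch3_eff1]*ch3_eff1   + _b[choice1:ch3_eff2]*ch3_eff2   ///
        + _b[choice1:ch4b_eff1]*ch4b_eff1 + _b[choice1:ch4b_eff2]*ch4b_eff2 ///
        + _b[choice1:ch5c_eff1]*ch5c_eff1 + _b[choice1:ch5c_eff2]*ch5c_eff2 ///
        + _b[choice1:ch6_priv_dummy]*ch6_priv_dummy                         ///
        + _b[choice1:ch6_pub_dummy]*ch6_pub_dummy

    * conditional-logit probability of each alternative within its choice set (cc2)
    gen double expxb1 = exp(xb1)
    bysort cc2: egen double denom1 = total(expxb1)
    gen double p1 = expxb1/denom1    // predicted choice probability, conditional on class 1

    From here I could average p1 or take finite differences over attribute levels by hand, but I would much rather learn the proper margins- or gllapred-based way of doing this.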