
  • Multilevel modeling not converging and stuck on iteration loop

    Hey everyone,

    I am currently writing my master's thesis, and after new input from my supervisor I decided I had to switch my model to multilevel modeling. After trying it out and experimenting with various options over the past few days, I still have not managed to get it running. I have tried fewer predictors, different slopes, and many different option specifications, but I do not understand why it will not converge, hence my reaching out here.

    My research hypothesis:
    H1: Countries with higher country-level Charismatic/Value-Based Leadership scores demonstrate a less severe decrease in hotel industry revenue during the crisis than during the pre-crisis period.
    H2: Countries with higher country-level Charismatic/Value-Based Leadership scores demonstrate a less severe decrease in the current ratio of the hotel industry during the crisis than during the pre-crisis period.

    Variables:
    IVs: Charismatic/Value-Based Leadership, Team-Oriented Leadership, Participative Leadership
    DVs: ln_Current, ln_Revenue
    Controls: i.crisis (crisis vs. no crisis), HHI (competition), Political_Stability, GDPG, Inflation, ln_GDP, ln_Assets
    Interaction effect: c.CharismaticValuebased#i.crisis (this is my main interest given my hypotheses; similar for the other IVs)

    Relevant information:
    The variables ln_Current, ln_Revenue, and ln_Assets are at the individual hotel level; the other variables are at the country level.


    Current code example:
    Code:
    mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets || Country: , mle cov(ind) vce(cluster Country)
    output:
    Code:
    . mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets || Country:
    >  , mle cov(ind) vce(cluster Country)
    note: single-variable random-effects specification in Country equation; covariance structure set to identity.
    
    Performing EM optimization ...
    
    Performing gradient-based optimization:
    Iteration 0:  Log pseudolikelihood = -687343.55  
    Iteration 1:  Log pseudolikelihood = -687343.55  (backed up)
    Iteration 2:  Log pseudolikelihood = -687343.55  (backed up)
    Iteration 3:  Log pseudolikelihood = -687343.55  (backed up)
    Iteration 4:  Log pseudolikelihood = -687343.55  (backed up)
    Iteration 5:  Log pseudolikelihood = -687343.55  (backed up)
    (iterations 6 through 66 omitted: each shows Log pseudolikelihood = -687343.55  (backed up))
    Iteration 67: Log pseudolikelihood = -687343.55  (backed up)
    --Break--
    r(1);
    I stopped it since it had been running for quite a while and, once again, did not converge. I can get the following to run, but that prevents me from differentiating between countries, which is suboptimal.

    Code:
    mixed ln_Revenue c.CharismaticValuebased i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets
    Code:
    mixed ln_Revenue c.CharismaticValuebased i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets
    
    Mixed-effects ML regression                          Number of obs =   354,283
                                                         Wald chi2(8)  = 308020.07
    Log likelihood = -693782.15                          Prob > chi2   =    0.0000
    
    ---------------------------------------------------------------------------------------
               ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
    ----------------------+----------------------------------------------------------------
    CharismaticValuebased |  -.0793954   .0094251    -8.42   0.000    -.0978683   -.0609225
                 1.crisis |  -.5725233   .0066887   -85.60   0.000    -.5856328   -.5594137
                      HHI |   .0006529   .0000228    28.59   0.000     .0006081    .0006977
                Political |   .4241518   .0053632    79.09   0.000     .4136402    .4346635
                     GDPG |   .0134702   .0007447    18.09   0.000     .0120105    .0149299
                Inflation |   .0736891   .0010745    68.58   0.000     .0715831    .0757951
                   ln_GDP |   .3808357   .0035189   108.22   0.000     .3739387    .3877327
                ln_Assets |   .6435774   .0013627   472.28   0.000     .6409066    .6462483
                    _cons |  -6.138186   .1259668   -48.73   0.000    -6.385076   -5.891295
    ---------------------------------------------------------------------------------------
    
    ------------------------------------------------------------------------------
      Random-effects parameters  |   Estimate   Std. err.     [95% conf. interval]
    -----------------------------+------------------------------------------------
                   var(Residual) |   2.940751   .0069871      2.927089    2.954478
    ------------------------------------------------------------------------------


    Any advice on how I can keep the country-level random effects and get the model to converge would be highly appreciated. Thanks in advance for taking the time!

    Last edited by Jesse Nooijen; 20 Dec 2023, 12:38.

  • #2
    There are a few reasons your HLM might not converge. You could be using more degrees of freedom than your data can support (cov(ind) could be the culprit there), or your model could be misspecified in some other way. I see you have a relatively large N, but how many countries do you have, and how many observations do you have within each country? If either number is too small, but especially if you have very few observations within countries, that could be your issue.
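
    A quick way to check both counts (purely a sketch; I am assuming your grouping variable is called Country):

    Code:
    tab Country                                    // countries in the data and observations per country
    bysort Country: gen long n_in_country = _N     // group size attached to each observation
    summarize n_in_country                         // smallest and largest group sizes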

    Can you also estimate this model:

    Code:
    mixed ln_Revenue || Country:
    and post the results? If the variance of the constant is much lower than the variance of the residual, it might be that country-level context doesn't explain all that much variation in your outcome.
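
    If that null model runs, the intraclass correlation is a compact summary of the same comparison; as a sketch:

    Code:
    * after the null -mixed- model converges, report the share of variance at the country level
    estat icc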



    • #3
      Originally posted by Daniel Schaefer:
      There are a few reasons your HLM might not converge. You could be using more degrees of freedom than your data can support (cov(ind) could be the culprit there), or your model could be misspecified in some other way. I see you have a relatively large N, but how many countries do you have, and how many observations do you have within each country? If either number is too small, but especially if you have very few observations within countries, that could be your issue.
      I have 25 countries included in my model and a total of 366,663 observations. See the tabulated list below:
      Code:
                              Country |      Freq.     Percent        Cum.
      ----------------------------------------+-----------------------------------
                                    Australia |        901        0.25        0.25
                                      Austria |        959        0.26        0.51
                                        China |      1,341        0.37        0.87
                                     Colombia |     11,441        3.12        3.99
                               Czech Republic |      1,932        0.53        4.52
                                      Finland |      4,320        1.18        5.70
                                       France |     38,948       10.62       16.32
                                      Germany |      3,692        1.01       17.33
                                       Greece |     18,058        4.92       22.25
                                      Hungary |     10,498        2.86       25.12
                                        Italy |     68,471       18.67       43.79
                                        Japan |      3,520        0.96       44.75
                                     Malaysia |      6,030        1.64       46.39
                                      Morocco |      5,556        1.52       47.91
                                  Philippines |        875        0.24       48.15
                                       Poland |     10,125        2.76       50.91
                                     Portugal |     18,311        4.99       55.90
                                       Russia |     36,740       10.02       65.92
                                    Singapore |      2,002        0.55       66.47
                                     Slovenia |      4,674        1.27       67.74
                                  South Korea |      4,324        1.18       68.92
                                        Spain |     50,402       13.75       82.67
                                       Sweden |     15,221        4.15       86.82
                                     Thailand |     38,921       10.61       97.44
                               United Kingdom |      9,401        2.56      100.00
      ----------------------------------------+-----------------------------------
                                        Total |    366,663      100.00


      Originally posted by Daniel Schaefer:
      Can you also estimate this model:

      Code:
      mixed ln_Revenue || Country:
      and post the results? If the variance of the constant is much lower than the variance of the residual, it might be that country-level context doesn't explain all that much variation in your outcome.
      I tried running it and this is the result:

      Code:
      . mixed ln_Revenue || Country:
      
      Performing EM optimization ...
      
      Performing gradient-based optimization:
      Iteration 0:  Log likelihood = -771948.38  
      Iteration 1:  Log likelihood = -771948.38  (backed up)
      Iteration 2:  Log likelihood = -771948.38  (backed up)
      Iteration 3:  Log likelihood = -771948.38  (backed up)
      Iteration 4:  Log likelihood = -771948.38  (backed up)
      Iteration 5:  Log likelihood = -771948.38  (backed up)
      (iterations 6 through 23 omitted: each shows Log likelihood = -771948.38  (backed up))
      Iteration 24: Log likelihood = -771948.38  (backed up)
      --Break--
      r(1);
      Sadly, the model did not converge either. Multilevel modeling is rather new to me; I am trying to find more relevant information I can use, but I feel I do not yet grasp the basics well enough to really pursue those other options. In your opinion, would simply running the MLM without any options specified be worth trying?

      Sorry it took me a while to reply; I had to leave home to prepare for Christmas with the family. I wish you a merry Christmas as well. Thanks so much for your reply and help!
      Last edited by Jesse Nooijen; 25 Dec 2023, 08:20.



      • #4
        Thank you, and merry Christmas. The simple model really should have converged.

        Code:
        mixed ln_Revenue || Country:
        If I estimate a similar model in Stata 18 where y and c are unrelated, the model will still converge.

        Code:
        clear
        set obs 1000
        gen y = runiform()
        egen c = seq(), f(1) t(10)
        mixed y || c:
        Since the model didn't converge, the problem probably has something to do with the relationship between revenue and country. Are there many missing values on revenue, particularly within a single country? Did you have many revenue observations equal to zero before you took the natural log, and if so how did you handle those cases? It might be useful to see how many observations are in each country when ln_Revenue is not missing.

        Code:
        fre Country if !missing(ln_Revenue)



        • #5
          To diagnose a convergence problem in a stripped-down model like this one, with just an outcome, no fixed-effect regressor, and a single random intercept level, you can fit the same model, using a different estimator, with -xtreg, re-.
          Code:
          xtset Country
          xtreg ln_Revenue, re
          This generalized least squares estimator frequently converges where -mixed-'s maximum likelihood does not. In my experience, what you frequently find is that the estimated variance component at the Country level is zero. The maximum likelihood estimator in -mixed- is incapable of estimating such a model because in -mixed-, you do not estimate the variance component itself, but rather its logarithm. If the variance component is zero, the logarithm is negative infinity--hence the divergence of the estimation process.

          Now even -xtreg, re- cannot properly estimate a model in which the intraclass correlation is actually negative. Such situations are not common in the real world, but they do sometimes occur. When confronted with such data, -xtreg, re- returns a zero estimate.

          The other thing you can do is re-estimate the -mixed- model adding the -iterate()- option. Your model already gets backed up at iteration 1, so if we try setting a limit of, say, 10 iterations, we should get a clear picture of what is going on with the estimates without wasting too much time.
          Code:
          mixed ln_Revenue || Country:, iterate(10)
          -mixed- will terminate at the 10th iteration and show you its interim results. Bear in mind that it still has not converged and these results are not valid estimates of the model parameters. But they show you where the iterations are heading. What I suspect you will see is that the estimate for the variance component is some number extremely close to zero, like maybe 1e-12. This is another way of confirming that the problem is absence of variation at the Country level here.

          Anyway, if you end up confirming the absence of variation at the Country level in these ways, the solution is to simply go to a one-level model, using -regress- instead of -mixed-.

          Another thing to check is this: if you only have one observation per Country (after eliminating observations that have missing values on model variables), then you do not have two levels in the model, and -mixed- will fail to converge in this situation as well because there is no way to distinguish the Country level from the residual level variation. -mixed- has no check for this built in and it simply iterates endlessly and fails to converge when that happens.
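
          One way to check the estimation-sample group sizes (only a sketch, using the variable names from the command in #1):

          Code:
          * observations per country with complete data on the model variables
          tab Country if !missing(ln_Revenue, CharismaticValuebased, crisis, HHI, Political, GDPG, Inflation, ln_GDP, ln_Assets)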

          All of that said, it is hard for me to believe that a variable that someone would choose to name ln_Revenue, regardless of whose revenue we are talking about, would be lacking in variance at the level of a variable someone would choose to call Country. So if these diagnostics confirm that the variance component at the Country level appears to be zero, I would be strongly suspicious that there is something seriously wrong with your data. As you have not so far shown any example data, there isn't anything more to say about this at present.



          • #6
            the estimated variance component at the Country level is zero
            Thanks, Clyde, for the excellent advice in #5. My intuition is that a variance component exactly equal to zero (or so close to zero that the difference is outside the precision of a floating point number) should be very unlikely. This might happen if the variance of my outcome were already small (depending on the units of my outcome), but if I saw something like this I would certainly be very suspicious that there was a problem with my data.



            • #7
              Thanks so much for your reply, Daniel & Clyde; I ran the tests you recommended, and I'll post the results below.

              Originally posted by Daniel Schaefer:

              Code:
              clear
              set obs 1000
              gen y = runiform()
              egen c = seq(), f(1) t(10)
              mixed y || c:

              Since the model didn't converge, the problem probably has something to do with the relationship between revenue and country. Are there many missing values on revenue, particularly within a single country? Did you have many revenue observations equal to zero before you took the natural log, and if so how did you handle those cases? It might be useful to see how many observations are in each country when ln_Revenue is not missing.
              I ran the command for the simple model that converged for you and saw that it indeed converged. For my data, I have very few observations where ln_Revenue is missing. See the results from the provided code:
              Code:
              fre Country if !missing(ln_Revenue)
              
              
              
              Country -- Country
              --------------------------------------------------------------------
                                     |      Freq.    Percent      Valid       Cum.
              -----------------------+--------------------------------------------
              Valid   Australia      |        900       0.25       0.25       0.25
                      Austria        |        959       0.27       0.27       0.52
                      China          |       1341       0.38       0.38       0.90
                      Colombia       |      11441       3.23       3.23       4.13
                      Czech Republic |       1932       0.55       0.55       4.68
                      Finland        |       3868       1.09       1.09       5.77
                      France         |      37692      10.64      10.64      16.41
                      Germany        |       3692       1.04       1.04      17.45
                      Greece         |      16759       4.73       4.73      22.18
                      Hungary        |      10498       2.96       2.96      25.14
                      Italy          |      65676      18.54      18.54      43.68
                      Japan          |       3502       0.99       0.99      44.67
                      Malaysia       |       6030       1.70       1.70      46.37
                      Morocco        |       4274       1.21       1.21      47.58
                      Philippines    |        852       0.24       0.24      47.82
                      Poland         |      10125       2.86       2.86      50.68
                      Portugal       |      18311       5.17       5.17      55.85
                      Russia         |      32803       9.26       9.26      65.10
                      Singapore      |       1749       0.49       0.49      65.60
                      Slovenia       |       4556       1.29       1.29      66.88
                      South Korea    |       4324       1.22       1.22      68.10
                      Spain          |      50402      14.23      14.23      82.33
                      Sweden         |      14278       4.03       4.03      86.36
                      Thailand       |      38919      10.99      10.99      97.35
                      United Kingdom |       9400       2.65       2.65     100.00
                      Total          |     354283     100.00     100.00           
              --------------------------------------------------------------------
              I reduced my data set using:

              Code:
              drop if Revenue < 0
              egen zero_revenue_count = total(Revenue == 0), by(Company Year)
              egen all_years_zero = total(zero_revenue_count == _N), by(Company)
              count if all_years_zero == 1
              drop if all_years_zero == 1
              drop all_years_zero
              
              summarize Revenue, detail
              local Mean = r(mean)
              local StdDev = r(sd)
              drop if abs(Revenue - `Mean') > 3 * `StdDev'
              drop zero_revenue_count
              summarize Revenue, detail
              Which resulted in the following output:

              Code:
              . do "C:\Users\jesse\AppData\Local\Temp\STD6cc_000000.tmp"
              
              . drop if Revenue < 0
              (305 observations deleted)
              
              . egen zero_revenue_count = total(Revenue == 0), by(Company Year)
              
              . egen all_years_zero = total(zero_revenue_count == _N), by(Company)
              
              . count if all_years_zero == 1
                579
              
              . drop if all_years_zero == 1
              (579 observations deleted)
              
              . drop all_years_zero
              
              . 
              . summarize Revenue, detail
              
                                         Revenue
              -------------------------------------------------------------
                    Percentiles      Smallest
               1%            0              0
               5%     52.83822              0
              10%         4000              0       Obs             401,530
              25%        58298              0       Sum of wgt.     401,530
              
              50%     324371.7                      Mean            3285900
                                      Largest       Std. dev.      5.47e+07
              75%      1196543       5.73e+09
              90%      3611069       5.79e+09       Variance       3.00e+15
              95%      7372565       5.88e+09       Skewness       61.88277
              99%     3.25e+07       6.76e+09       Kurtosis        4847.49
              
              . local Mean = r(mean)
              
              . local StdDev = r(sd)
              
              . drop if abs(Revenue - `Mean') > 3 * `StdDev'
              (32,245 observations deleted)
              
              . drop zero_revenue_count
              
              . summarize Revenue, detail
              
                                         Revenue
              -------------------------------------------------------------
                    Percentiles      Smallest
               1%            0              0
               5%           50              0
              10%     3995.668              0       Obs             400,802
              25%     58043.68              0       Sum of wgt.     400,802
              
              50%     322703.7                      Mean            1856454
                                      Largest       Std. dev.       6921384
              75%      1187672       1.66e+08
              90%      3549979       1.66e+08       Variance       4.79e+13
              95%      7118836       1.66e+08       Skewness       11.11828
              99%     2.80e+07       1.66e+08       Kurtosis       172.3916
              
              . 
              end of do-file


              I initially ran my models with a fixed-effects -xtregar- model to deal with autocorrelation, but I got feedback that I should use MLM given my country-level leadership variable and hotel-level financial performance variable. As extra information, I have a panel data setup covering 2015-2022, with 2015-2019 as the pre-crisis base years and 2020-2022 as the crisis period. My original models looked like this:

              Code:
              xtregar ln_Revenue CharismaticValuebased c.CharismaticValuebased#i.crisis i.crisis HHI GDPG Inflation ln_Assets ln_GDP Political, fe
              est store regar1a
              xtregar ln_Current CharismaticValuebased c.CharismaticValuebased#i.crisis i.crisis HHI GDPG Inflation ln_Assets ln_GDP Political, fe
              est store regar1b
              xtregar ln_Revenue TeamOriented c.TeamOriented#i.crisis i.crisis HHI GDPG Inflation ln_Assets ln_GDP Political, fe
              est store regar2a
              xtregar ln_Current TeamOriented c.TeamOriented#i.crisis i.crisis HHI GDPG Inflation ln_Assets ln_GDP Political, fe
              est store regar2b
              xtregar ln_Revenue Participative c.Participative#i.crisis i.crisis HHI GDPG Inflation ln_Assets ln_GDP Political, fe
              est store regar3a
              xtregar ln_Current Participative c.Participative#i.crisis i.crisis HHI GDPG Inflation ln_Assets ln_GDP Political, fe
              est store regar3b


              Originally posted by Clyde Schechter:
              To diagnose a convergence problem in a stripped-down model like this one, with just an outcome, no fixed-effect regressor, and a single random intercept level, you can fit the same model, using a different estimator, with -xtreg, re-.

              Code:
              xtset Country
              xtreg ln_Revenue, re
              This generalized least squares estimator frequently converges where -mixed-'s maximum likelihood does not. In my experience, what you frequently find is that the estimated variance component at the Country level is zero. The maximum likelihood estimator in -mixed- is incapable of estimating such a model because in -mixed-, you do not estimate the variance component itself, but rather its logarithm. If the variance component is zero, the logarithm is negative infinity--hence the divergence of the estimation process.
              I ran the specified model and this is the result:
              Code:
              . xtset Country
              string variables not allowed in varlist;
              Country is a string variable
              r(109);
              
              . 
              . xtreg ln_Revenue, re
              
              Random-effects GLS regression                   Number of obs     =    354,283
              Group variable: panel_id                        Number of groups  =     70,774
              
              R-squared:                                      Obs per group:
                   Within  = 0.0000                                         min =          1
                   Between = 0.0000                                         avg =        5.0
                   Overall = 0.0000                                         max =          8
              
                                                              Wald chi2(0)      =          .
              corr(u_i, X) = 0 (assumed)                      Prob > chi2       =          .
              
              ------------------------------------------------------------------------------
                ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
              -------------+----------------------------------------------------------------
                     _cons |   12.21423   .0094574  1291.50   0.000     12.19569    12.23276
              -------------+----------------------------------------------------------------
                   sigma_u |  2.4533427
                   sigma_e |  1.0377802
                       rho |  .84822348   (fraction of variance due to u_i)
              ------------------------------------------------------------------------------
              In my do-file I have it set up like this:
              Code:
              egen panel_id = group(Company Country)
              xtset panel_id year

              Originally posted by Clyde Schechter:
              The other thing you can do is re-estimate the -mixed- model adding the -iterate()- option. Your model already gets backed up at iteration 1, so if we try setting a limit of, say, 10 iterations, we should get a clear picture of what is going on with the estimates without wasting too much time.
              Code:
              mixed ln_Revenue || Country:, iterate(10)
              -mixed- will terminate at the 10th iteration and show you its interim results. Bear in mind that it still has not converged and these results are not valid estimates of the model parameters. But they show you where the iterations are heading. What I suspect you will see is that the estimate for the variance component is some number extremely close to zero, like maybe 1e-12. This is another way of confirming that the problem is absence of variation at the Country level here.

              Anyway, if you end up confirming the absence of variation at the Country level in these ways, the solution is to simply go to a one-level model, using -regress- instead of -mixed-.

              Another thing to check is this: if you only have one observation per Country (after eliminating observations that have missing values on model variables), then you do not have two levels in the model, and -mixed- will fail to converge in this situation as well because there is no way to distinguish the Country level from the residual level variation. -mixed- has no check for this built in and it simply iterates endlessly and fails to converge when that happens.
              I ran the command with the following result:
              Code:
              . mixed ln_Revenue || Country:, iterate(10)
              
              Performing EM optimization ...
              
              Performing gradient-based optimization: 
              Iteration 0:  Log likelihood = -771948.38  
              Iteration 1:  Log likelihood = -771948.38  (backed up)
              Iteration 2:  Log likelihood = -771948.38  (backed up)
              Iteration 3:  Log likelihood = -771948.38  (backed up)
              Iteration 4:  Log likelihood = -771948.38  (backed up)
              Iteration 5:  Log likelihood = -771948.38  (backed up)
              Iteration 6:  Log likelihood = -771948.38  (backed up)
              Iteration 7:  Log likelihood = -771948.38  (backed up)
              Iteration 8:  Log likelihood = -771948.38  (backed up)
              Iteration 9:  Log likelihood = -771948.38  (backed up)
              Iteration 10: Log likelihood = -771948.38  (backed up)
              convergence not achieved
              
              Computing standard errors ...
              
              Mixed-effects ML regression                        Number of obs    =  354,283
              Group variable: Country                            Number of groups =       25
                                                                 Obs per group:
                                                                              min =      852
                                                                              avg = 14,171.3
                                                                              max =   65,676
                                                                 Wald chi2(0)     =        .
              Log likelihood = -771948.38                        Prob > chi2      =        .
              
              ------------------------------------------------------------------------------
                ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
              -------------+----------------------------------------------------------------
                     _cons |   13.07754    .262785    49.77   0.000     12.56249    13.59259
              ------------------------------------------------------------------------------
              
              ------------------------------------------------------------------------------
                Random-effects parameters  |   Estimate   Std. err.     [95% conf. interval]
              -----------------------------+------------------------------------------------
              Country: Identity            |
                                var(_cons) |   1.725026   .4882902      .9904954     3.00427
              -----------------------------+------------------------------------------------
                             var(Residual) |    4.56937    .010857       4.54814    4.590699
              ------------------------------------------------------------------------------
              LR test vs. linear model: chibar2(01) = 65316.63      Prob >= chibar2 = 0.0000
              
              Warning: Convergence not achieved.


              Originally posted by Clyde Schechter:
              All of that said, it is hard for me to believe that a variable that someone would choose to name ln_Revenue, regardless of whose revenue we are talking about, would be lacking in variance at the level of a variable someone would choose to call Country. So if these diagnostics confirm that the variance component at the Country level appears to be zero, I would be strongly suspicious that there is something seriously wrong with your data. As you have not so far shown any example data, there isn't anything more to say about this at present.
              I am not sure of the best way to provide you with a snippet of my data, so I have provided a small part below for now. Please let me know your preferences and I will do my best to adhere to them. I am trying to understand everything that has been said in this conversation, but some of it is above my knowledge right now, so I will be reading up in the meantime.

              Data example:

              Code:
              Company    Country    year    GDPG    Inflation    GDP    Political    CharismaticValuebased    TeamOriented    SelfProtective    Participative    HumaneOriented    Autonomous    Origin    Firm_Age    crisis    ln_Assets    ln_GDP    panel_id    Comp1    Comp2    CharismaticControls    Comp3    Comp4    TeamControls    Comp5    Comp6    ParticipativeControls    HHI    ln_Revenue    ln_Current
               ALEMANNENHOF IMMOBILIEN UG (HAFTUNGSBESCHRAENKT) & CO. KG    Germany    2017    2.6802311    1.5094949    3.273e+12    .57438022    5.8552381    5.4995629    3.142521    5.7903018    4.5223285    4.3226094    2010    13    0    14.41644    28.81669    1    -.8002355    -.4903742    -.6453049    -.2817511    .3020582    .0101536    -.3272525    .726811    .1997792    73.54954    15.22405    .0168571
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2015    3.707316    -.06164468    1.129e+11    .73779249    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    14.34684    25.4495    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    451.1029    13.24492    -.0397809
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2016    2.2010019    .39476931    1.163e+11    .64338094    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    14.37207    25.47932    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    415.1374    13.23084    .1722712
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2017    4.271976    2.3482428    1.269e+11    .79619926    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    14.40654    25.56671    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    267.1993    13.2762    .3729419
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2018    5.3623484    2.8502479    1.361e+11    .7374177    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    14.50515    25.63655    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    263.1313    13.25035    .3825376
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2019    4.8642257    3.3385864    1.465e+11    .76182157    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    14.52408    25.70996    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    313.1815    13.47785    .6549259
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2020    -4.5355508    3.3267439    1.379e+11    .83697712    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    1    14.77254    25.6495    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    119.7928    12.82421    .9738047
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2021    7.2000003    5.1109653    1.541e+11    .79846376    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    1    14.80457    25.76115    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    124.7636    12.9403    1.088562
              " MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT.    Hungary    2022    4.5779058    14.608144    1.654e+11    .63595128    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    1    14.77145    25.83179    2    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    160.0994    13.16006    1.329195
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2015    3.707316    -.06164468    1.129e+11    .73779249    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    11.13446    25.4495    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    451.1029    11.30707    .3527673
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2016    2.2010019    .39476931    1.163e+11    .64338094    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    11.09795    25.47932    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    415.1374    10.23463    .8069222
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2017    4.271976    2.3482428    1.269e+11    .79619926    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    11.06693    25.56671    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    267.1993    10.40667    1.817265
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2018    5.3623484    2.8502479    1.361e+11    .7374177    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    10.93964    25.63655    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    263.1313    8.547912    .2715527
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2019    4.8642257    3.3385864    1.465e+11    .76182157    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    0    10.92995    25.70996    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    313.1815    8.856904    .1814879
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2020    -4.5355508    3.3267439    1.379e+11    .83697712    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    1    10.93485    25.6495    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    119.7928    8.170839    .1739533
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2021    7.2000003    5.1109653    1.541e+11    .79846376    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    1    10.96431    25.76115    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    124.7636    9.705115    1.138794
              " SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG    Hungary    2022    4.5779058    14.608144    1.654e+11    .63595128    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1993    30    1    10.93583    25.83179    3    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    160.0994    9.468645    .1079571
              "AGNES PANZIO" IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG    Hungary    2021    7.2000003    5.1109653    1.541e+11    .79846376    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1990    33    1    14.51004    25.76115    4    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    124.7636    9.438313    .9566646
              "AGROHELPS" MEZOGAZDASAGI SZOLGALTATO ES KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG    Hungary    2015    3.707316    -.06164468    1.129e+11    .73779249    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1991    32    0    11.79927    25.4495    5    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    451.1029    8.601623    2.498974
              "AGROHELPS" MEZOGAZDASAGI SZOLGALTATO ES KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG    Hungary    2016    2.2010019    .39476931    1.163e+11    .64338094    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1991    32    0    11.78531    25.47932    5    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    415.1374    5.391879    2.127517
              "AGROHELPS" MEZOGAZDASAGI SZOLGALTATO ES KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG    Hungary    2022    4.5779058    14.608144    1.654e+11    .63595128    5.908886    5.9123523    3.2383026    5.223756    4.7324426    3.2349727    1991    32    1    11.74535    25.83179    5    -.1202972    .9725738    .4261383    -.1927895    .8260766    .3166435    .328516    -1.460935    -.5662097    160.0994    9.051645    -1.931022

              There are some variables in there that I disregarded in my FE model because they could not remain, but if possible I would include them in the MLM. Once again, thanks to both of you for taking the time to help me; I really appreciate it.



              • #8
                Well, I am surprised to see the non-convergence of -mixed- as the data looks satisfactory by visual inspection. (The layout you posted is not readily importable to Stata, so I didn't do anything more with it. In the future, please use the -dataex- command to show example data.) I am also surprised that with -iterate(10)- you got what look like perfectly reasonable results. And, similarly, -xtreg, re- gave you satisfactory looking results. The good news is that you can use the -xtreg, re- results instead of -mixed-: as noted earlier, it's the same model estimated with a different estimator.

                It is possible that -mixed- would converge if you specified one of the non-default methods (-bhhh-, -dfp-, or -bfgs-) in the -technique()- option. I don't know if it's worth the trouble of trying one of these just to see. If it works, the results will almost certainly be a very close match to what -xtreg, re- gives you. If it doesn't work, nothing will have been accomplished.
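
                If you do want to try, a minimal sketch (the choice of -bfgs- here is arbitrary):

                Code:
                * illustrative only: re-run the null model with a non-default maximization technique
                mixed ln_Revenue || Country:, technique(bfgs)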

                It's unsatisfying not to know what's causing this difficulty for -mixed-. Unlike single-level regression, the likelihood function for -mixed- is not guaranteed to be mathematically well behaved and this kind of thing can happen.

                I do have one recommendation for when you put the rest of your variables back in the model. You have one regressor there whose values are of the order of magnitude 10^11. This is another potential source of trouble. I recommend that you rescale that variable down so that the values are more like the same order of magnitude as the rest of the variables. This is nothing other than a change of units and has no substantive consequences--just remember that the coefficient for that variable will be scaled up by the same factor as a result, and interpret it accordingly. I recommend this because data with variables of widely different orders of magnitude can also cause convergence problems.
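
                As a sketch of that rescaling (I am assuming the regressor in question is GDP, whose values run around 1e11 to 1e12 in the data example in #7):

                Code:
                * illustrative only: express GDP in billions so it is on a scale closer to the other regressors
                generate double GDP_bn = GDP / 1e9
                * the coefficient on GDP_bn will be 1e9 times the coefficient GDP would have had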



                • #9
                  There is a lot of good advice from Clyde and Daniel. I think there is something wrong with your Country variable. I count 25 countries in the table you showed above. Yet, in the xtreg output you get an error about the Country variable being string. In addition, it reports that you have 70,774 groups. I would suggest that you create a numeric variable for country, as such:
                  Code:
                  egen countryid = group(country)
                  Then use countryid in your xtreg call (xtreg ln_Revenue, i(countryid) re). That will get the GLS estimates on the same plane as the ML estimates in mixed. Please post those results.



                  • #10
                    Thanks, Jesse, for the detailed response. I don't see an obvious smoking gun here. Notice as well that the constant term (12.21423 in the xt model, 13.07754 in the mixed model) appears to be within the domain of the ln_Revenue variable - another possible explanation eliminated.

                    I do notice that each time you try to estimate the mixed model, the log likelihood does not change and you get the "backed up" message. This suggests that the likelihood function is relatively flat in the region near the starting point, and the optimizer cannot find a step that leads to a better log likelihood. It looks like -mixed- will not let you specify your own starting values, but with the tolerance() option you can adjust how small the change in the coefficient vector must be before the model is declared converged. You might try tolerances ranging from tolerance(1e-5) to tolerance(1e-8) to see if you can break out of that region of the likelihood function. See the "Remarks and examples" section of the -maximize- entry in the PDF documentation for more advice.
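
                    For example (the specific tolerance here is only illustrative):

                    Code:
                    mixed ln_Revenue || Country:, tolerance(1e-6)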

                    Another option is to try the rmle estimator by specifying the rmle option.

                    Code:
                    mixed ln_Revenue || Country:, rmle
                    You would typically use rmle in cases where you have a small sample size, but it is possible it will converge here.

                    The -dataex- command is the best way to provide a data example. The example you have in the last code block would require a fair amount of editing before Clyde or I could load it ourselves in Stata. The -dataex- command addresses this and other related issues.

                    Edit: crossed with #8 and #9



                    • #11
                      That was a good catch by Erik in #9. Since xtset Country produced an error, but the OP was still able to estimate an xt model directly after that line, it seems the xt settings in effect for the model in #7 were these:

                      Code:
                      egen panel_id = group(Company Country)
                      xtset panel_id year



                      • #12
                        Good points, Daniel. It isn't clear to me which version of Country is being used in the xtreg and mixed calls. Also, I think you meant to suggest the use of the option reml in the mixed call.
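
                        That is, presumably something like:

                        Code:
                        mixed ln_Revenue || Country:, reml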



                        • #13
                          If #11 is correct, as seems likely because the number of distinct countries must be less than about 200, then the results shown in #7 for -xtreg, re- are for the wrong model and cannot be used. It should be
                          Code:
                          egen country = group(Country)
                          xtset country
                          xtreg ln_Revenue, re
                          If this estimation runs successfully and shows a sigma_u that is very close to zero, then we know that zero or negative intra-class correlation is the source of the problem. But seeing the results of -mixed- in #7, that is unlikely to happen. If the estimation runs successfully and shows a sigma_u that is well away from zero, then its results are usable instead of -mixed-. And the next step would be to add in the regressor variables, sticking with -xtreg, re-.
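
                          As a sketch of that next step (the regressors are taken from the -mixed- command in #1, and the clustered VCE mirrors that original call):

                          Code:
                          * sketch only: the fixed portion of the model from #1, estimated with -xtreg, re-
                          xtreg ln_Revenue c.CharismaticValuebased##i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets, re vce(cluster country)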



                          • #14
                            Thanks to all of you for your quick and very insightful replies. I will go through all the advice you have given me and see what I can make work now.

                            Originally posted by Clyde Schechter:
                            Well, I am surprised to see the non-convergence of -mixed- as the data looks satisfactory by visual inspection. (The layout you posted is not readily importable to Stata, so I didn't do anything more with it. In the future, please use the -dataex- command to show example data.) I am also surprised that with -iterate(10)- you got what look like perfectly reasonable results. And, similarly, -xtreg, re- gave you satisfactory looking results. The good news is that you can use the -xtreg, re- results instead of -mixed-: as noted earlier, it's the same model estimated with a different estimator.
                            Given other research I looked into, I assumed I could use an xtreg/xtregar model for my objectives, but the feedback I received was quite adamant that I was using the wrong model and insisted on an MLM approach given the hierarchical nature of my data. I see your point, though, so if I cannot get it to run I will proceed with the RE model and justify my choices there. With regard to the -dataex- command, thanks for the clarification; I was not aware of its existence. Just to rule out any potential issue I did not identify myself, I'll post the command results below:

                            Code:
                            * Example generated by -dataex-. For more info, type help dataex
                            clear
                            input str199 Company str52 Country int year float ln_Revenue double CharismaticValuebased float(crisis HHI) double(Political GDPG Inflation) float(ln_GDP ln_Assets)
                            " ALEMANNENHOF IMMOBILIEN UG (HAFTUNGSBESCHRAENKT) & CO. KG"                                        "Germany" 2017  15.22405 5.8552381123975215 0  73.54954 .574380218982697 2.6802311140589126 1.509494851 28.816694 14.416435
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2015  13.24492  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495 14.346838
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2016 13.230836  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932  14.37207
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2017 13.276198  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 14.406542
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2018 13.250348  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 14.505152
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2019 13.477852  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 14.524078
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2020 12.824208  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 14.772535
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2021 12.940297  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115  14.80457
                            `"" MORAHALMI ROZSMALOM " MEZOGAZDASAGI TERMELO ES KERESKEDELMI KFT."'                              "Hungary" 2022  13.16006  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 14.771445
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2015  11.30707  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495  11.13446
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2016 10.234633  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 11.097946
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2017 10.406672  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 11.066935
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2018  8.547912  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 10.939645
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2019  8.856904  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 10.929947
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2020  8.170839  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 10.934853
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2021  9.705115  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115  10.96431
                            `"" SZEMES-TOURS " KERESKEDELMI ES SZOLGALTATO BETETI TARSASAG"'                                    "Hungary" 2022  9.468645  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785  10.93583
                            `""AGNES PANZIO" IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                                    "Hungary" 2021 9.4383135  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115  14.51004
                            `""AGROHELPS" MEZOGAZDASAGI SZOLGALTATO ES KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG"'            "Hungary" 2015  8.601623  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495 11.799273
                            `""AGROHELPS" MEZOGAZDASAGI SZOLGALTATO ES KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG"'            "Hungary" 2016  5.391879  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 11.785306
                            `""AGROHELPS" MEZOGAZDASAGI SZOLGALTATO ES KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG"'            "Hungary" 2022  9.051645  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 11.745347
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2015 13.936698  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495    14.159
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2016 14.048585  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 14.221438
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2017 14.026814  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 14.267086
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2018  14.13766  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655  14.28456
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2019  14.18461  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 14.317113
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2020 13.650452  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 14.203485
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2021  14.02651  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115   14.3904
                            `""ALFOLD 92" GYOGYSZALLODA ES IDEGENFORGALMI KORLATOLT FELELOSSEGU TARSASAG"'                      "Hungary" 2022  14.50388  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 14.617276
                            `""AM ERZENGEL" HOTEL- RESTAURANT GMBH & CO. KG"'                                                   "Germany" 2018 13.997832 5.8552381123975215 0 68.876976 .577707588672638  .9812326060047383 1.732168798  28.84549 14.388364
                            `""AM ERZENGEL" HOTEL- RESTAURANT GMBH & CO. KG"'                                                   "Germany" 2019  14.01436 5.8552381123975215 0  63.60983 .548454642295837 1.0566038982828871 1.445659769 28.875673  14.29847
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2015  10.10865  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495  12.20429
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2016  11.28211  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932  12.32091
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2017 11.863593  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 12.316428
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2018 13.569273  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655  13.35193
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2019  11.72802  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 13.375835
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2020 11.644264  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 13.381702
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2021 12.494555  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 13.419947
                            `""ANDRASSY & TARSAI" IPARI-, KERESKEDELMI- ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'         "Hungary" 2022 12.699424  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785  13.41597
                            `""ANIMO FOGADO" KERESKEDELMI, VENDEGLATO ES SZALLASHELYERTEKESITO KORLATOLT FELELOSSEGU TARSASAG"' "Hungary" 2015 10.547794  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495  9.987274
                            `""ANIMO FOGADO" KERESKEDELMI, VENDEGLATO ES SZALLASHELYERTEKESITO KORLATOLT FELELOSSEGU TARSASAG"' "Hungary" 2016 10.579391  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932  9.989796
                            `""ANIMO FOGADO" KERESKEDELMI, VENDEGLATO ES SZALLASHELYERTEKESITO KORLATOLT FELELOSSEGU TARSASAG"' "Hungary" 2017 10.731054  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 10.391806
                            `""ANIMO FOGADO" KERESKEDELMI, VENDEGLATO ES SZALLASHELYERTEKESITO KORLATOLT FELELOSSEGU TARSASAG"' "Hungary" 2018  10.97058  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 10.493157
                            `""ANIMO FOGADO" KERESKEDELMI, VENDEGLATO ES SZALLASHELYERTEKESITO KORLATOLT FELELOSSEGU TARSASAG"' "Hungary" 2019 10.895823  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 10.362118
                            `""ANIMO FOGADO" KERESKEDELMI, VENDEGLATO ES SZALLASHELYERTEKESITO KORLATOLT FELELOSSEGU TARSASAG"' "Hungary" 2020 11.057007  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 10.792476
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2015 14.657694  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495 16.721018
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2016 14.700603  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 16.097305
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2017 14.810548  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 16.087805
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2018 14.838492  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 16.075306
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2019 14.933668  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957  16.09865
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2020 14.529696  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 16.353466
                            `""ANNA GRAND INVEST" KORLATOLT FELELOSSEGU TARSASAG"'                                              "Hungary" 2021 14.709572  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 16.468513
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2015  11.55615  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495 11.734488
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2016 11.615582  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 11.795084
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2017 11.680666  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 11.866075
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2018 11.828084  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655  12.00275
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2019   12.0477  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957  12.36648
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2020 12.136326  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 12.549459
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2021 12.374446  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115  12.80396
                            `""BAKONYVARI ES BAKONYVARI" KERESKEDELMI ES SZOLGALTATO ES IDEGENFORGALMI BETETI TARSASAG"'        "Hungary" 2022 12.580807  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 12.887493
                            `""BAKOS ''97" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                         "Hungary" 2019 10.724598  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 12.164067
                            `""BAKOS ''97" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                         "Hungary" 2020  10.77486  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 12.366808
                            `""BAKOS ''97" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                         "Hungary" 2021 10.317814  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 12.365256
                            `""BAKOS ''97" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                         "Hungary" 2022  11.53333  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 12.467048
                            `""BALATON-KERAMIA" KERESKEDELMI BETETI TARSASAG"'                                                  "Hungary" 2019  3.503197  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957  10.63659
                            `""BALATON-KERAMIA" KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG"'                                   "Hungary" 2020   .914542  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 10.421083
                            `""BALATON-KERAMIA" KERESKEDELMI KORLATOLT FELELOSSEGU TARSASAG"'                                   "Hungary" 2022   .914542  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785  9.760233
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2015 10.586098  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495  12.29983
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2016 10.599003  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 12.283937
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2017 10.110987  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 12.263906
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2018  5.989716  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 12.225858
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2019  6.090692  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 12.211043
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2020   .914542  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 12.212855
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2021  14.23871  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 14.168292
                            `""BALATON-MOBIL" KERESKEDELMI, SZOLGALTATO ES EPITO KORLATOLT FELELOSSEGU TARSASAG"'               "Hungary" 2022 10.486674  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785  9.383469
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2015 12.496583  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495 13.548524
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2016 12.523968  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932  13.54113
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2017  12.71381  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671  13.57681
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2018 12.785366  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655  13.54394
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2019  12.88581  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 13.577804
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2020 12.673898  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 13.627605
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2021 12.735033  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 13.727483
                            `""BALOGH CSEMEGE" KERESKEDELMI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                     "Hungary" 2022 13.190304  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 13.847804
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2015   9.20207  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495  9.313942
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2016  9.392994  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932  9.733379
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2017 10.689343  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 10.413544
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2018 10.944178  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 10.737437
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2019 10.717602  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 10.965648
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2020  9.609044  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495 10.885754
                            `""BAROKK ANTIK" KERESKEDELMI BETETI TARSASAG"'                                                     "Hungary" 2021 10.062262  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 10.952578
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2015 11.077312  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495 13.151524
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2016 11.199953  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 13.182643
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2017    11.163  5.908885973022453 0  267.1993 .796199262142181  4.271976016206949 2.348242812  25.56671 13.209945
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2018 11.124963  5.908885973022453 0 263.13126 .737417697906494   5.36234836176223 2.850247926  25.63655 13.153912
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2019   11.5128  5.908885973022453 0  313.1815 .761821568012238  4.864225735118907 3.338586354 25.709957 13.251202
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2020 11.462564  5.908885973022453 1 119.79285 .836977124214172 -4.535550832815673 3.326743858   25.6495  13.30979
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2021  11.57977  5.908885973022453 1 124.76358 .798463761806488   7.20000034603936 5.110965344  25.76115 13.349658
                            `""BELI" KERESKEDELMI ES VENDEGLATO KORLATOLT FELELOSSEGU TARSASAG"'                                "Hungary" 2022 11.770845  5.908885973022453 1  160.0994 .635951280593872  4.577905769260298 14.60814395 25.831785 13.436004
                            `""BETEKINTS" VENDEGLATOIPARI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                       "Hungary" 2015 13.622828  5.908885973022453 0  451.1029 .737792491912842 3.7073159793626047  -.06164468   25.4495  14.15845
                            `""BETEKINTS" VENDEGLATOIPARI ES SZOLGALTATO KORLATOLT FELELOSSEGU TARSASAG"'                       "Hungary" 2016  13.67364  5.908885973022453 0  415.1374  .64338093996048 2.2010018929385637  .394769307  25.47932 14.162148
                            end

                            End of part 1 of my reply due to character limit



                            • #15

                              Start of part 2 of my reply due to character limit.


                              Originally posted by Clyde Schechter View Post
                              It is possible that -mixed- would converge if you specified one of the non-default methods (-bhhh-, -dfp-, or -bfgs-) in the -technique()- option. I don't know if it's worth the trouble of trying one of these just to see. If it works, the results will almost certainly be a very close match to what -xtreg, re- gives you. If it doesn't work, nothing will have been accomplished.

                              It's unsatisfying not to know what's causing this difficulty for -mixed-. Unlike single-level regression, the likelihood function for -mixed- is not guaranteed to be mathematically well behaved and this kind of thing can happen.
                              I decided to try all three just to check; here are my results:

                              Code:
                              mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets || Country: , technique(bfgs) iterate(10
                              > )
                              
                              Performing EM optimization ...
                              
                              Performing gradient-based optimization: 
                              Iteration 0:  Log likelihood = -687343.55  
                              Iteration 1:  Log likelihood = -687343.55  (backed up)
                              Iteration 2:  Log likelihood = -687343.55  (backed up)
                              BFGS stepping has contracted, resetting BFGS Hessian
                              Iteration 3:  Log likelihood = -687343.55  (backed up)
                              Iteration 4:  Log likelihood = -687343.55  (backed up)
                              Iteration 5:  Log likelihood = -687343.55  (backed up)
                              Iteration 6:  Log likelihood = -687343.55  (backed up)
                              Iteration 7:  Log likelihood = -687343.55  (backed up)
                              BFGS stepping has contracted, resetting BFGS Hessian
                              Iteration 8:  Log likelihood = -687343.55  (backed up)
                              Iteration 9:  Log likelihood = -687343.55  (backed up)
                              Iteration 10: Log likelihood = -687343.55  (backed up)
                              convergence not achieved
                              
                              Computing standard errors ...
                              
                              Mixed-effects ML regression                       Number of obs    =   354,283
                              Group variable: Country                           Number of groups =        25
                                                                                Obs per group:
                                                                                             min =       852
                                                                                             avg =  14,171.3
                                                                                             max =    65,676
                                                                                Wald chi2(9)     = 216935.24
                              Log likelihood = -687343.55                       Prob > chi2      =    0.0000
                              
                              ------------------------------------------------------------------------------------------------
                                                  ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
                              -------------------------------+----------------------------------------------------------------
                                       CharismaticValuebased |   .6425498   .3643428     1.76   0.078     -.071549    1.356649
                                                    1.crisis |  -.2174034   .1057229    -2.06   0.040    -.4246164   -.0101904
                                                             |
                              crisis#c.CharismaticValuebased |
                                                          1  |    -.03602   .0183068    -1.97   0.049    -.0719006   -.0001394
                                                             |
                                                         HHI |   .0001683   .0000567     2.97   0.003      .000057    .0002795
                                                   Political |  -.1756486   .0253796    -6.92   0.000    -.2253918   -.1259054
                                                        GDPG |   .0232012   .0007719    30.06   0.000     .0216882    .0247142
                                                   Inflation |   .0279139   .0014549    19.19   0.000     .0250623    .0307654
                                                      ln_GDP |   .4946039    .040405    12.24   0.000     .4154115    .5737963
                                                   ln_Assets |   .6557599   .0014367   456.43   0.000      .652944    .6585758
                                                       _cons |  -13.07198   2.437139    -5.36   0.000    -17.84869   -8.295278
                              ------------------------------------------------------------------------------------------------
                              
                              ------------------------------------------------------------------------------
                                Random-effects parameters  |   Estimate   Std. err.     [95% conf. interval]
                              -----------------------------+------------------------------------------------
                              Country: Identity            |
                                                var(_cons) |    .355695   .1054035      .1989942    .6357922
                              -----------------------------+------------------------------------------------
                                             var(Residual) |   2.834431   .0067348      2.821262    2.847662
                              ------------------------------------------------------------------------------
                              LR test vs. linear model: chibar2(01) = 12861.66      Prob >= chibar2 = 0.0000
                              
                              Warning: Convergence not achieved.
                              
                              . mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets || Country: , technique(dfp) iterate(10)
                              
                              Performing EM optimization ...
                              
                              Performing gradient-based optimization: 
                              Iteration 0:  Log likelihood = -687343.55  
                              Iteration 1:  Log likelihood = -687343.55  (backed up)
                              Iteration 2:  Log likelihood = -687343.55  (backed up)
                              DFP stepping has contracted, resetting DFP Hessian
                              Iteration 3:  Log likelihood = -687343.55  (backed up)
                              Iteration 4:  Log likelihood = -687343.55  (backed up)
                              Iteration 5:  Log likelihood = -687343.55  (backed up)
                              Iteration 6:  Log likelihood = -687343.55  (backed up)
                              Iteration 7:  Log likelihood = -687343.55  (backed up)
                              DFP stepping has contracted, resetting DFP Hessian
                              Iteration 8:  Log likelihood = -687343.55  (backed up)
                              Iteration 9:  Log likelihood = -687343.55  (backed up)
                              Iteration 10: Log likelihood = -687343.55  (backed up)
                              convergence not achieved
                              
                              Computing standard errors ...
                              
                              Mixed-effects ML regression                       Number of obs    =   354,283
                              Group variable: Country                           Number of groups =        25
                                                                                Obs per group:
                                                                                             min =       852
                                                                                             avg =  14,171.3
                                                                                             max =    65,676
                                                                                Wald chi2(9)     = 216935.24
                              Log likelihood = -687343.55                       Prob > chi2      =    0.0000
                              
                              ------------------------------------------------------------------------------------------------
                                                  ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
                              -------------------------------+----------------------------------------------------------------
                                       CharismaticValuebased |   .6425498   .3643428     1.76   0.078     -.071549    1.356649
                                                    1.crisis |  -.2174034   .1057229    -2.06   0.040    -.4246164   -.0101904
                                                             |
                              crisis#c.CharismaticValuebased |
                                                          1  |    -.03602   .0183068    -1.97   0.049    -.0719006   -.0001394
                                                             |
                                                         HHI |   .0001683   .0000567     2.97   0.003      .000057    .0002795
                                                   Political |  -.1756486   .0253796    -6.92   0.000    -.2253918   -.1259054
                                                        GDPG |   .0232012   .0007719    30.06   0.000     .0216882    .0247142
                                                   Inflation |   .0279139   .0014549    19.19   0.000     .0250623    .0307654
                                                      ln_GDP |   .4946039    .040405    12.24   0.000     .4154115    .5737963
                                                   ln_Assets |   .6557599   .0014367   456.43   0.000      .652944    .6585758
                                                       _cons |  -13.07198   2.437139    -5.36   0.000    -17.84869   -8.295278
                              ------------------------------------------------------------------------------------------------
                              
                              ------------------------------------------------------------------------------
                                Random-effects parameters  |   Estimate   Std. err.     [95% conf. interval]
                              -----------------------------+------------------------------------------------
                              Country: Identity            |
                                                var(_cons) |    .355695   .1054035      .1989942    .6357922
                              -----------------------------+------------------------------------------------
                                             var(Residual) |   2.834431   .0067348      2.821262    2.847662
                              ------------------------------------------------------------------------------
                              LR test vs. linear model: chibar2(01) = 12861.66      Prob >= chibar2 = 0.0000
                              
                              Warning: Convergence not achieved.
                              
                              . mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets || Country: , technique(bhhh) iterate(10
                              > )
                              option technique(bhhh) not allowed
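
                              Since -mixed- rejects technique(bhhh), the remaining optimizer-side options I can think of are alternating the allowed algorithms (for example BFGS and the default Newton-Raphson) every few iterations, and letting the EM phase run longer before the gradient-based step. A sketch of what I could still try (not verified to converge; the iteration counts are arbitrary):

                              Code:
                              * sketch only: alternate BFGS and Newton-Raphson every 10 iterations and
                              * allow up to 300 EM iterations before gradient-based optimization begins
                              mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets ///
                                  || Country: , technique(bfgs 10 nr 10) emiterate(300) iterate(50)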



                              Originally posted by Clyde Schechter View Post
                              I do have one recommendation for when you put the rest of your variables back in the model. You have one regressor there whose values are of the order of magnitude 10^11. This is another potential source of trouble. I recommend that you rescale that variable down so that the values are more like the same order of magnitude as the rest of the variables. This is nothing other than a change of units and has no substantive consequences--just remember that the coefficient for that variable will be scaled up by the same factor as a result, and interpret it accordingly. I recommend this because data with variables of widely different orders of magnitude can also cause convergence problems.
                              I see I made an error in the example data I tried to send earlier: I no longer use the raw GDP value, as I switched to the ln_GDP variable for the regressions, so I hope that addresses this issue sufficiently.
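
                              For reference, the rescaling Clyde describes is just a change of units; a sketch with the raw GDP variable (which is no longer in my model, so the variable name here is only illustrative):

                              Code:
                              * sketch only: express the raw GDP variable (order of magnitude 1e11) in billions
                              generate GDP_billions = GDP / 1e9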

                              Originally posted by Erik Ruzek View Post
                              There is a lot of good advice from Clyde and Daniel. I think there is something wrong with your Country variable. I count 25 countries in the table you showed above. Yet, in the xtreg output you get an error about the Country variable being string. In addition, it reports that you have 70,774 groups. I would suggest that you create a numeric variable for country, as such:
                              Code:
                              egen countryid = group(country)
                              Then use countryid in your xtreg call (xtreg ln_Revenue, i(countryid) re). That will get the GLS estimates on the same plane as the ML estimates in mixed. Please post those results.
                              I am sorry about the confusion; I have been running so many different models and options these past weeks that I sometimes lose track of my current approach. I re-ran my do-file to clear things up and then checked my Country variable. In my own model I specified it like this:
                              Code:
                              . //setting panel data
                              . egen panel_id = group(Company Country)
                              
                              . xtset panel_id year
                              
                              Panel variable: panel_id (unbalanced)
                               Time variable: year, 2015 to 2022, but with gaps
                                       Delta: 1 unit
                              Country is indeed a string variable according to the variable overview. Running it as specified by Erik in #9 gives:

                              Code:
                               
                              egen countryid = group(country)
                              xtreg ln_Revenue, i(countryid) re
                              which gives the following output:
                              Code:
                               xtreg ln_Revenue, i(countryid) re
                              warning: existing panel variable is not countryid
                              
                              Random-effects GLS regression                   Number of obs     =    354,283
                              Group variable: countryid                       Number of groups  =         25
                              
                              R-squared:                                      Obs per group:
                                   Within  = 0.0000                                         min =        852
                                   Between = 0.0000                                         avg =   14,171.3
                                   Overall = 0.0000                                         max =     65,676
                              
                                                                              Wald chi2(0)      =          .
                              corr(u_i, X) = 0 (assumed)                      Prob > chi2       =          .
                              
                              ------------------------------------------------------------------------------
                                ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
                              -------------+----------------------------------------------------------------
                                     _cons |   13.07756    .268242    48.75   0.000     12.55182    13.60331
                              -------------+----------------------------------------------------------------
                                   sigma_u |  1.3406981
                                   sigma_e |  2.1376084
                                       rho |  .28231761   (fraction of variance due to u_i)
                              ------------------------------------------------------------------------------
                              I also tried running the code specified by Clyde in #13:

                              Code:
                               
                              egen country = group(Country)
                              xtset country
                              xtreg ln_Revenue, re
                              which resulted in the following output:
                              Code:
                              Random-effects GLS regression                   Number of obs     =    354,283
                              Group variable: country                         Number of groups  =         25
                              
                              R-squared:                                      Obs per group:
                                   Within  = 0.0000                                         min =        852
                                   Between = 0.0000                                         avg =   14,171.3
                                   Overall = 0.0000                                         max =     65,676
                              
                                                                              Wald chi2(0)      =          .
                              corr(u_i, X) = 0 (assumed)                      Prob > chi2       =          .
                              
                              ------------------------------------------------------------------------------
                                ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
                              -------------+----------------------------------------------------------------
                                     _cons |   13.07756    .268242    48.75   0.000     12.55182    13.60331
                              -------------+----------------------------------------------------------------
                                   sigma_u |  1.3406981
                                   sigma_e |  2.1376084
                                       rho |  .28231761   (fraction of variance due to u_i)
                              ------------------------------------------------------------------------------

                              Originally posted by Daniel Schaefer View Post
                              I do notice that each time you try to estimate the mixed model, the log likelihood does not change and you get the "backed up" message. This suggests that the likelihood function is relatively flat in the region near the starting point, and it can't find a "step" that leads to a better log likelihood. It looks like -mixed- will not let you specify your own starting condition, but you can modify the required change in the coefficient before the model converges with the tolerance() option. You might try tolerances ranging from tolerance(1e-5) to tolerance(1e-8) to see if you can break out of that region of the likelihood function. See the -maximize- pdf entry under "remarks and examples" for more advice from the documentation.
                              Thanks so much for providing a source Daniel, I will go through it first thing in the morning and try the options you specified.
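
                              For reference, this is the kind of call I plan to try (a sketch only; the tolerance value is one from the range Daniel suggested, and I have not run it yet):

                              Code:
                              * sketch only: tighten the convergence tolerance on the coefficient vector
                              mixed ln_Revenue CharismaticValuebased i.crisis c.CharismaticValuebased#i.crisis HHI Political GDPG Inflation ln_GDP ln_Assets ///
                                  || Country: , tolerance(1e-6) iterate(50)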


                              Originally posted by Daniel Schaefer View Post

                              Another option is to try the REML estimator by specifying the reml option.

                              Code:
                              mixed ln_Revenue || Country:, reml iterate(10)
                              You would typically use REML in cases where you have a small sample size, but it is possible it will converge here.
                              With regard to this option, I adjusted the code based on some snippets from earlier in this thread and got the following result:

                              Code:
                               mixed ln_Revenue || Country:, reml iterate(10)
                              
                              Performing EM optimization ...
                              
                              Performing gradient-based optimization: 
                              Iteration 0:  Log restricted-likelihood = -771948.79  
                              Iteration 1:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 2:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 3:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 4:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 5:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 6:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 7:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 8:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 9:  Log restricted-likelihood = -771948.79  (backed up)
                              Iteration 10: Log restricted-likelihood = -771948.79  (backed up)
                              convergence not achieved
                              
                              Computing standard errors ...
                              
                              Mixed-effects REML regression                      Number of obs    =  354,283
                              Group variable: Country                            Number of groups =       25
                                                                                 Obs per group:
                                                                                              min =      852
                                                                                              avg = 14,171.3
                                                                                              max =   65,676
                                                                                 Wald chi2(0)     =        .
                              Log restricted-likelihood = -771948.79             Prob > chi2      =        .
                              
                              ------------------------------------------------------------------------------
                                ln_Revenue | Coefficient  Std. err.      z    P>|z|     [95% conf. interval]
                              -------------+----------------------------------------------------------------
                                     _cons |   13.07756   .2682057    48.76   0.000     12.55189    13.60324
                              ------------------------------------------------------------------------------
                              
                              ------------------------------------------------------------------------------
                                Random-effects parameters  |   Estimate   Std. err.     [95% conf. interval]
                              -----------------------------+------------------------------------------------
                              Country: Identity            |
                                                var(_cons) |   1.796985   .5191118      1.020117    3.165476
                              -----------------------------+------------------------------------------------
                                             var(Residual) |    4.56937    .010857       4.54814    4.590699
                              ------------------------------------------------------------------------------
                              LR test vs. linear model: chibar2(01) = 65325.05      Prob >= chibar2 = 0.0000
                              
                              Warning: Convergence not achieved.

                              I am sorry for the lengthy reply; some of the information I posted might be redundant for you. I am trying to the best of my ability to process all the insightful information you have provided, but I can definitely see that my knowledge in this area could use some further refinement.

                              Thanks everyone again for taking the time on this Christmas day to help me out, I hope you all had a wonderful Christmas.

