
  • Dynamic Panel Model for Large T and Small N.

    Hello Everyone,

    I am working on a problem in which I am assessing the risk of credit default in relation to different policies across the sectors of a nation. I gathered and constructed monthly data from 2000 to 2020, so T=240 in my case, and there are only eleven sectors, so N=11. All my independent variables are exogenous, and I have found autocorrelation in my dependent variable, specifically AR(1). With such dynamic panel data, I was tempted to employ a dynamic approach such as the GMM estimator (xtabond2). However, on further investigation, I found that this estimator is not designed for such a large T and is more often used in settings where N>T. I would like some advice on how to approach such situations. I have already found some good suggestions from this helpful community, such as

    xtregar and xtscc with a fixed-effects model.

    My question is: can I add a lag of my dependent variable in these two approaches? If yes, is that valid? (I sketch what I have in mind at the end of this post.)

    My last question: instead of monthly, can I use dynamic techniques such as the GMM estimator (xtabond2) if I want to work on an annual basis, where T=20 and N=11? Since T would be much smaller there, I still have my doubts, as it does not meet the N>T criterion. Please point me in the right direction.
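
    For reference, here is a minimal sketch of what I have in mind for the monthly panel. The variable names (sector, mdate, default_rate, x1, x2) are placeholders for my actual data:

        * declare the panel structure (placeholder variable names)
        xtset sector mdate

        * fixed effects with a lagged dependent variable and Driscoll-Kraay standard errors (xtscc)
        xtscc default_rate L.default_rate x1 x2, fe lag(4)

        * fixed effects with an AR(1) error term (xtregar); whether adding the lagged
        * dependent variable here is valid is exactly what I am asking above
        xtregar default_rate x1 x2, fe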

  • #2
    The Nickell (1981) bias will be reduced as T/N tends to infinity, which is probably not your case.

    You may want to ask Sebastian Kripfganz, who wrote a command called xtdpdgmm.



    • #3
      Dear Maxence,

      Thank you very much for your reply! Please correct me if I'm wrong here: after going over the paper Biases in Dynamic Models with Fixed Effects by Stephen Nickell, I inferred that if I include a lag of my dependent variable in the monthly case, where T is large, the Nickell bias should shrink substantially. But of course, that won't hold for the yearly study I want to set up. Am I right here?

      Of course, any advice from Prof. Sebastian Kripfganz on this will help me deepen my understanding of the issue.
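
      As a rough check of my reading, the commonly cited large-N approximation of the Nickell bias for the AR(1) coefficient rho in a within (fixed-effects) regression is bias ≈ -(1 + rho)/(T - 1). Purely as an illustration, with rho = 0.8 that gives about -1.8/239 ≈ -0.008 for T=240 but -1.8/19 ≈ -0.095 for T=20, which is why I worry about the annual setup.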



      • #4
        Well, the bigger T/N is, the better, but it really has to tend to infinity for the Nickell bias to disappear, and 240/11 might not quite cut it...

        While we wait for Prof. Kripfganz's response, you could download his programme from SSC and check the help file.
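
        For instance, something like:

            ssc install xtdpdgmm
            help xtdpdgmm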



        • #5
          The dynamic panel data GMM estimators are typically designed for a large-N/small-T situation. In your case, N=11 is extremely small, irrespective of T. These estimators might therefore not be very reliable. In any case, you should not use the two-step version of these estimators because estimating the optimal weighting matrix for the second step is pointless with such small N.

          With T=240, the Nickell bias is of no concern at all. You can just put a lagged dependent variable into the conventional fixed-effects models and use the traditional estimators.

          With T=20, things become borderline. With highly persistent macro data, I would expect the Nickell bias to still matter. If all variables besides the lagged dependent variable are strictly exogenous, a better alternative to a GMM estimator might be an estimator which directly corrects for the bias (see my xtdpdbc command) or a QML estimator (see my xtdpdqml command). However, their performance under such small N might still be problematic.
          https://www.kripfganz.de/stata/



          • #6
            Dear Prof. Kripfganz,

            Thank you very much for your recommendations and for clearing up my doubts. I'll definitely explore the two options you've mentioned; a quick sketch of what I plan to try is below.
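
            To make sure I follow, this is roughly what I plan to try (placeholder variable names; I will check the help files for the exact syntax of xtdpdbc and xtdpdqml, including whether the lag of the dependent variable is added automatically or must be specified):

                * T = 240 (monthly): conventional fixed effects with a lagged dependent variable
                xtreg default_rate L.default_rate x1 x2, fe

                * T = 20 (annual): bias-corrected and QML dynamic panel estimators
                xtdpdbc default_rate x1 x2, fe
                xtdpdqml default_rate x1 x2, fe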



            • #7
              What you have is multiple time series, and in this case you need to worry about the stationarity of your data. Assuming that your data are stationary, appropriate estimators here are the user-written -xtscc- and the native -xtgls- and -sureg-.
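
              For example, something along these lines (placeholder variable names; the Im-Pesaran-Shin test is only one of the panel unit-root tests available through -xtunitroot-):

                  * panel unit-root test on the dependent variable
                  xtunitroot ips default_rate, lags(1)

                  * FGLS with panel-specific AR(1) errors and correlation across panels
                  xtgls default_rate x1 x2, panels(correlated) corr(psar1)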



              • #8
                Joro Kolev

                Dear Prof. Kolev,

                Thank you very much for your direction. Yes, my data are stationary, and I have already implemented -xtscc- and -xtgls- and obtained intriguing results for the T=240, N=11 long-panel case.

                Will these estimators work for the other scenario I mentioned, where N=11 and T=20? Given what Prof. Kripfganz said in his previous post, I am uncertain whether these commands would help in my case. Prof. Kripfganz suggested the commands -xtdpdbc- and -xtdpdqml-, which I am now investigating. Do you advise anything for this scenario (N=11, T=20)?

                I would also like to take this opportunity to ask whether it is feasible to estimate a panel model when N is really small (for example, N=3 and T=20). I was intending to use a Bayesian hierarchical model, say the bayes prefix with xtreg. My thinking was that this would allow for the inclusion of prior knowledge and might be more resilient to such small samples. Would this be a good approach? (A sketch of what I have in mind is at the end of this post.)

                Thank you for your direction!
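
                For the small-N idea, this is roughly what I have in mind (placeholder variable names; my understanding is that the bayes prefix fits the random-effects version of xtreg, and the default priors would need careful thought with so few panels):

                    * Bayesian random-effects panel regression via the bayes prefix
                    xtset sector year
                    bayes, rseed(123): xtreg default_rate L.default_rate x1 x2

                Any thoughts on whether informative priors can realistically compensate for N=3 would be very welcome.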



                • #9
                  Sebastian Kripfganz Hello Professor: I have a panel data sample with T=37 (1980-2017) and N=71 countries. I am using first differences of both the dependent and independent variables and have run a fixed-effects regression so far. I wanted to know whether using GMM would be a better approach in my case to avoid any biases, whether what I have done is already reasonable, or if there is anything else I could do! (A sketch of the two specifications I am comparing is below.)

                  Thank you for your time!
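
                  For clarity, these are the two specifications I am comparing (placeholder variable names; the xtabond2 instrument choices are only an illustration and would need to be justified):

                      * what I have done so far: fixed effects on first-differenced variables
                      xtreg D.y D.x1 D.x2, fe vce(cluster country)

                      * the alternative I am considering: difference GMM via xtabond2
                      xtabond2 y L.y x1 x2, gmm(y, lag(2 .)) iv(x1 x2) noleveleq robust small

                  I would add the collapse suboption to gmm() if instrument proliferation becomes an issue with T=37.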

