
  • Robbert Henk
    replied
    Hi Sebastian,

    thank you for your answer. Could you perhaps also explain why it is more efficient?

    Thanks for all your help!



  • Sebastian Kripfganz
    replied
    If there is no cointegration / no long-run relationship, you can still use the error correction model for the short-run dynamics, but it would be more efficient to just estimate the model directly in first differences.
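    For concreteness, a minimal sketch of the two options (y and x are placeholder variable names and the lag orders are purely illustrative):

    Code:
    * error correction form, using only the short-run dynamics
    ardl y x, ec
    * more efficient under no cointegration: estimate directly in first differences
    regress D.y L.D.y D.x L.D.x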



  • Robbert Henk
    replied
    Hello,

    I have (again) another question about the ARDL model. If there is no cointegration, can you still use the error correction model (the short-run model)? More generally, in the case of no cointegration, can you use all the functionalities of the model?



  • Sebastian Kripfganz
    replied
    It could just be an omitted-variable bias in the smaller model that lets your coefficients become larger and thus statistically significant. Alternatively, in the larger model the estimates are less precise because more coefficients have to be estimated, which increases the standard errors and thus the chance of obtaining insignificant results.
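    The omitted-variable mechanism can be illustrated with a small hypothetical simulation (all variable names and parameter values here are made up for illustration):

    Code:
    * x2 is correlated with x1 and affects y; omitting it biases the x1 coefficient
    clear
    set obs 200
    set seed 12345
    generate x2 = rnormal()
    generate x1 = 0.8*x2 + rnormal()
    generate y  = x1 + x2 + rnormal()
    regress y x1        // smaller model: x1 coefficient biased away from its true value of 1
    regress y x1 x2     // larger model: less bias, but larger standard errors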



  • Adam Ng
    replied
    Sebastian Kripfganz, apologies, it should be the bounds test. Also, when I ran the ARDL model with two variables, y (dependent) and x1, there was a significant LR relationship, but when I ran it with three variables, y, x1, and x2, the LR relationship was insignificant for both x1 and x2. What would be the intuition behind this?
    Thanks!



  • Sebastian Kripfganz
    replied
    Assuming that you are referring to the bounds test for the existence of a level relationship, you do not have to take first differences of the variables in the first place. Otherwise, I do not know what you mean by "ARDL test".

    The optimal lag selection performed by the ardl command would become much harder when time-series operators were allowed. For this technical reason, we do not allow time-series operators. If needed, you could generate differenced variables in a first step and then use these new variables in the specification of the ardl command.
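    As a sketch of the suggested two-step workaround (dx and the lag orders are hypothetical):

    Code:
    * step 1: generate the differenced variable explicitly
    generate dx = D.x
    * step 2: use the new variable in the ardl specification
    ardl y dx, lags(2 1)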

    Please ensure that you have the latest version of our ardl command. For details, please see the following new Statalist topic and please ask any further questions about this new command version in this new topic (to avoid any confusion with some outdated comments above): ARDL: updated Stata command for the estimation of autoregressive distributed lag and error correction models



  • Adam Ng
    replied
    Dear Sebastian Kripfganz ,
    I read somewhere that while performing the ARDL test, regardless of whether the variables are I(0) or I(1), we have to make them stationary (hence, if a variable is I(1), we would take its first difference for the test). However, when I tried running ardl (depvar) d.(IndepVar) (IndepVar), it says "factor-variable and time-series operators not allowed". What does this mean?
    Thank you!
    Last edited by Adam Ng; 21 Apr 2018, 07:27.



  • Oumayma Bahammou
    replied
    It works now, thank you for your valuable help.



  • Sebastian Kripfganz
    replied
    Please see my earlier comment #293 above. I recommend that you rescale your gdpdh variable (divide it by 1,000 or one million).

    Please ensure that you have the latest version of our ardl command. For details, please see the following new Statalist topic and please ask any further questions about this new command version in this new topic (to avoid any confusion with some outdated comments above): ARDL: updated Stata command for the estimation of autoregressive distributed lag and error correction models



  • Oumayma Bahammou
    replied
    Hi everyone,
    I am running an ARDL model with only 2 variables and 35 observations, and I get the error "maximum iterations are exceeded"!
    [Screenshot attachment: error1.png]



  • Sebastian Kripfganz
    replied
    Originally posted by Franz Langmann View Post
    Code:
    Model has k=11 weakly exogenous variables, but
    Pesaran/Shin/Smith (2001) critical values are only tabulated up to k=10.
    Do you know how one can still obtain the F-statistics? I guess a reasonable approach would be to compare the F-statistic for k=11 with the last tabulated bound (k=10) in PS (2001)?
    Please update to the latest version of our ardl command and use the new postestimation command estat ectest. It not only computes the F- and t-statistics for situations with more than 10 regressors but also reports appropriate critical values and approximate p-values.
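    A minimal usage sketch, assuming a model like the one quoted above (variable names are placeholders):

    Code:
    * re-estimate the error correction model, then run the updated bounds test
    ardl Y X1-X11, ec
    estat ectest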

    For details, please see the new Statalist topic and please ask any further questions about this new command version in this new topic (to avoid any confusion with some outdated comments above):
    ARDL: updated Stata command for the estimation of autoregressive distributed lag and error correction models



  • Franz Langmann
    replied
    Hi Sebastian,

    Amazing! Thanks for the excellent and fast help.

    A further issue that occurs in my model is that the number of regressors exceeds the values tabulated in PS (2001). Therefore, the bounds test gives me the following error:

    Code:
    Model has k=11 weakly exogenous variables, but
    Pesaran/Shin/Smith (2001) critical values are only tabulated up to k=10.
    Do you know how one can still obtain the F-statistics? I guess a reasonable approach would be to compare the F-statistic for k=11 with the last tabulated bound (k=10) in PS (2001)?

    Many thanks!



  • Sebastian Kripfganz
    replied
    It is not primarily a problem of the X14 dummy. When you inspect the ardl level output (without option ec), you will notice that both the coefficient and the standard error of X3 are very small. These tiny estimates create numerical problems in the computation of the delta-method standard errors when using the ec option. As you have figured out already, this problem is linked to Stata's nlcom command; see for example the following old discussion: https://www.stata.com/statalist/arch.../msg01244.html

    When you summarize the data, you will notice that the values of X3 are much larger than those of all other variables (by a factor of about 1,000). I thus recommend simply rescaling this regressor as follows:
    Code:
    gen X3s = X3 / 1000
    ardl Y X1-X2 X3s X4-X11, trend(ORDER) exog(X12-X14) ec lags(2 3 3 0 0 0 0 0 0 0 0 0)
    This should work.



  • Franz Langmann
    replied
    Originally posted by Sebastian Kripfganz View Post
    I could not replicate the problem with an artificial data set of the same dimensions. There seems to be an idiosyncratic problem with your data set. Would it be possible to share your data set with us (or any subset of the data set that can be used to replicate the problem)?
    Thanks for the quick response! I added the dataset to my previous post. I think it stems from some issue with including the trend variable and X14, a time-break dummy, as an exogenous variable.



  • Sebastian Kripfganz
    replied
    I could not replicate the problem with an artificial data set of the same dimensions. There seems to be an idiosyncratic problem with your data set. Would it be possible to share your data set with us (or any subset of the data set that can be used to replicate the problem)?

