Hi Statalist,
After spending all afternoon figuring out how to install packages (scikit-learn) in Python, create a virtual environment, and get Python to talk to Stata via the new ddml package, I am sad to have received the following long-winded error message:
Code:
. ddml crossfit

Cross-fitting E[y|X,D] equation: nllonely
Resample 1...
Cross-fitting fold 1 2 3 4 5 ...completed cross-fitting
Resample 2...
Cross-fitting fold 1 2 3 4 5 ...completed cross-fitting
Resample 3...
Cross-fitting fold 1 2 3 4 5 ...completed cross-fitting
Resample 4...
Cross-fitting fold 1 2 3 4 5 ...completed cross-fitting
Resample 5...
Cross-fitting fold 1 2 3 4 5 ...completed cross-fitting

Cross-fitting E[D|X] equation: source
Resample 1...
Cross-fitting fold 1
joblib.externals.loky.process_executor._RemoteTraceback:
"""
Traceback (most recent call last):
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\joblib\_utils.py", line 72, in __call__
    return self.func(**kwargs)
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\joblib\parallel.py", line 598, in __call__
    return [func(*args, **kwargs)
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\utils\parallel.py", line 136, in __call__
    return self.function(*args, **kwargs)
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\ensemble\_base.py", line 40, in _fit_single_estimator
    estimator.fit(X, y, **fit_params)
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\base.py", line 1473, in wrapper
    return fit_method(estimator, *args, **kwargs)
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\pipeline.py", line 473, in fit
    self._final_estimator.fit(Xt, y, **last_step_params["fit"])
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\base.py", line 1466, in wrapper
    estimator._validate_params()
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\base.py", line 666, in _validate_params
    validate_parameter_constraints(
  File "C:\Users\uqhbeilb\AppData\Local\Programs\Python\Python312\Lib\site-packages\sklearn\utils\_param_validation.py", line 95, in validate_parameter_constraints
    raise InvalidParameterError(
sklearn.utils._param_validation.InvalidParameterError: The 'loss' parameter of GradientBoostingClassifier must be a str among {'exponential', 'log_loss'}. Got 'deviance' instead.
"""

The above exception was the direct cause of the following exception:
# Repeat of above from "Traceback ... Got 'deviance' instead"
r(7102);
After loading and cleaning my data, the rest of the relevant code is:
Code:
* DDML - interactive model
global Y nllonely
global D source
global X male dagecat2 dagecat3 dagecat4 dagecat5 dagecat6 fulltime parttime unemployed uni postgrad kids couplenodeps hshareother mid advant nbh_id_7pt pnq3

set seed 123

* estimate the model 5 times using randomly chosen folds
ddml init interactive, kfolds(5) reps(5)

* consider two supervised learners: linear regression and gradient boosted trees, stacked using pystacked
ddml E[Y|X,D]: pystacked $Y $X, type(reg) methods(ols gradboost)
ddml E[D|X]: pystacked $D $X, type(class) methods(logit gradboost)

* cross-fit - stops at E[D|X] equation: source, Cross-fitting fold 1
ddml crossfit

* estimate the average treatment effect (the default)
ddml estimate
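Since the failure comes from the Python side, it may also help to confirm which scikit-learn version Stata's embedded Python is actually loading. A small diagnostic (the version threshold is my reading of the scikit-learn changelog, not anything from the ddml documentation); it can be run inside a `python:` block in Stata or in the virtual environment's own interpreter:

```python
# Check the scikit-learn version visible to this Python interpreter.
# 'deviance' was deprecated in scikit-learn 1.1 and removed in 1.3, so a
# version >= 1.3 would explain the InvalidParameterError above.
import sklearn

major, minor = (int(p) for p in sklearn.__version__.split(".")[:2])
print("scikit-learn", sklearn.__version__)
if (major, minor) >= (1, 3):
    print("'deviance' alias removed in this version; 'log_loss' is required")
```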
Would anyone have any suggestions on how to solve this, other than going to the authors? Grateful for any tips.
Kind regards,
Hannah