Last active: August 24, 2022 10:54
hgboost
# Import and initialize hgboost (pip install hgboost).
# By default it runs 250 evaluations with 5-fold CV on the top-scoring models.
from hgboost import hgboost
hgb = hgboost()

# X (feature matrix) and y (class labels) are assumed to be prepared in an earlier step.

# Fit
results = hgb.xgboost(X, y, pos_label=1, eval_metric='auc')
# results = hgb.catboost(X, y, pos_label=1, eval_metric='auc')
# results = hgb.lightboost(X, y, pos_label=1, eval_metric='auc')

# [hgboost] >Start hgboost classification.
# [hgboost] >Collecting xgb_clf parameters.
# [hgboost] >Correct for unbalanced classes using [scale_pos_weight]..
# [hgboost] >[13] hyperparameters in gridsearch space. Used loss function: [auc].
# [hgboost] >method: xgb_clf
# [hgboost] >eval_metric: auc
# [hgboost] >greater_is_better: True
# [hgboost] >*********************************************************************************
# [hgboost] >Total dataset: (891, 203)
# [hgboost] >Validation set: (179, 203)
# [hgboost] >Test-set: (278, 203)
# [hgboost] >Train-set: (434, 203)
# [hgboost] >*********************************************************************************
# [hgboost] >Searching across hyperparameter space for best performing parameters using maximum nr. evaluations: 250
# 100%|██████████| 250/250 [03:18<00:00, 1.26trial/s, best loss: -0.867519265453353]
# [hgboost] >Collecting the hyperparameters from the [250] trials.
# [hgboost] >[auc]: 0.8675 Best performing model across 250 iterations using Bayesian Optimization with Hyperopt.
# [hgboost] >*********************************************************************************
# [hgboost] >5-fold cross validation for the top 10 scoring models, Total nr. tests: 50
# [hgboost] >[auc] (average): 0.8701 Best 5-fold CV model using optimized hyperparameters.
# [hgboost] >*********************************************************************************
# [hgboost] >Evaluate best [xgb_clf] model on validation dataset (179 samples, 20%)
# [hgboost] >[auc]: -0.8443 using optimized hyperparameters on validation set.
# [hgboost] >[auc]: -0.7912 using default (not optimized) parameters on validation set.
# [hgboost] >*********************************************************************************
# [hgboost] >Retrain [xgb_clf] on the entire dataset with the optimal hyperparameters.
# [hgboost] >Fin!
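The log line about [scale_pos_weight] refers to XGBoost's standard correction for class imbalance: the ratio of negative to positive samples, used to upweight the minority class. A minimal sketch of that computation in pure Python (the helper function name is illustrative, not part of hgboost's API; the label counts mirror the Titanic-like 891-row dataset in the log):

```python
from collections import Counter

def compute_scale_pos_weight(y, pos_label=1):
    """Negative/positive sample ratio, as used by XGBoost's scale_pos_weight."""
    counts = Counter(y)
    n_pos = counts[pos_label]
    n_neg = sum(n for label, n in counts.items() if label != pos_label)
    return n_neg / n_pos

# Example: 342 positives out of 891 samples
y = [1] * 342 + [0] * 549
weight = compute_scale_pos_weight(y, pos_label=1)
print(weight)
```

A weight above 1 means positives are the minority and each positive sample is weighted up by that factor during training.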
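The search reported in the progress bar is Hyperopt's Bayesian optimization (TPE); note that hgboost minimizes the negated metric, which is why "best loss" is -0.8675 for an AUC of 0.8675. A toy random-search stand-in (pure Python, illustrative only, with a made-up objective in place of "train model, return -AUC") shows the shape of the loop: sample hyperparameters, score them, keep the lowest loss:

```python
import random

def objective(params):
    # Stand-in for model training; the (fictional) optimum is depth=6, lr=0.1.
    return -(1.0 - abs(params["max_depth"] - 6) * 0.02
                 - abs(params["learning_rate"] - 0.1))

def search(n_evals, seed=0):
    """Random search: sample params, evaluate, track the best (lowest) loss."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_evals):
        params = {"max_depth": rng.randint(3, 10),
                  "learning_rate": rng.choice([0.01, 0.05, 0.1, 0.3])}
        loss = objective(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best_params, best_loss = search(250)
print(best_params, best_loss)
```

TPE improves on this by fitting a density model over past trials and proposing promising regions, but the minimize-a-negated-metric convention is the same.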