XGBRegressor is much slower than GradientBoostingRegressor

I am new to xgboost and trying to learn how to use it by comparing it with the traditional gbm. However, I noticed that xgboost is much slower than gbm. Example:

from sklearn.model_selection import KFold, GridSearchCV
from sklearn.ensemble import GradientBoostingRegressor
from xgboost import XGBRegressor
from sklearn.datasets import load_boston
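# Note: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2,
# so running this example as-is requires an older scikit-learn version.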
import time

boston = load_boston()
X = boston.data
y = boston.target

kf = KFold(n_splits=5)
cv_params = {'cv': kf, 'scoring': 'r2', 'n_jobs': 4, 'verbose': 1}

gbm = GradientBoostingRegressor()
xgb = XGBRegressor()

grid = {'n_estimators': [100, 300, 500], 'max_depth': [3, 5]}

timer = time.time()
gbm_cv = GridSearchCV(gbm, param_grid=grid, **cv_params).fit(X, y)
print('GBM time: ', time.time() - timer)

timer = time.time()
xgb_cv = GridSearchCV(xgb, param_grid=grid, **cv_params).fit(X, y)
print('XGB time: ', time.time() - timer)

On a MacBook Pro with 8 cores, the output is:

Fitting 5 folds for each of 6 candidates, totalling 30 fits
[Parallel(n_jobs=4)]: Done  30 out of  30 | elapsed:    1.9s finished
GBM time:  2.262791872024536
Fitting 5 folds for each of 6 candidates, totalling 30 fits
[Parallel(n_jobs=4)]: Done  30 out of  30 | elapsed:   16.4s finished
XGB time:  17.902266025543213

I thought xgboost should be much faster, so I should be doing something wrong. Can someone help indicate what I'm doing wrong?

1 answer

This is the output when running on my machine without setting the n_jobs parameter in cv_params:

Fitting 5 folds for each of 6 candidates, totalling 30 fits
[Parallel(n_jobs=1)]: Done  30 out of  30 | elapsed:    4.1s finished
('GBM time: ', 4.248916864395142)
Fitting 5 folds for each of 6 candidates, totalling 30 fits
('XGB time: ', 2.934467077255249)
[Parallel(n_jobs=1)]: Done  30 out of  30 | elapsed:    2.9s finished

If n_jobs is set to 4, GBM finishes in about 2.5 s, but XGB takes a very long time.

So the problem comes from n_jobs! XGBoost is already multithreaded internally, and its parallelism clashes with the n_jobs of GridSearchCV: several grid-search workers each spawning several XGBoost threads oversubscribe the cores and slow everything down.
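
One way to act on this (a sketch, not part of the original answer) is to parallelize on one level only: give the workers to GridSearchCV and pin each XGBoost fit to a single thread through the estimator's own n_jobs parameter (nthread in older xgboost versions). Synthetic data via make_regression is used here only to keep the snippet self-contained:

from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, KFold
from xgboost import XGBRegressor

X, y = make_regression(n_samples=500, n_features=13, random_state=0)

# Pin the booster to one thread; GridSearchCV owns the 4 workers, so each
# worker runs one single-threaded fit and the cores are never oversubscribed.
xgb = XGBRegressor(n_jobs=1)
grid = {'n_estimators': [100, 300, 500], 'max_depth': [3, 5]}

xgb_cv = GridSearchCV(xgb, param_grid=grid, cv=KFold(n_splits=5),
                      scoring='r2', n_jobs=4).fit(X, y)

The inverse split also works (multithreaded XGBoost with n_jobs=1 in GridSearchCV); the point is simply not to multiply the two levels of parallelism.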


Source: https://habr.com/ru/post/1663911/

