XGBoost plot_importance does not have max_num_features property

The xgboost plotting API documentation says:

xgboost.plot_importance(booster, ax=None, height=0.2, xlim=None, ylim=None, title='Feature importance', xlabel='F score', ylabel='Features', importance_type='weight', max_num_features=None, grid=True, **kwargs)

Plot importance based on fitted trees.

Parameters:

booster (Booster, XGBModel or dict) – Booster or XGBModel instance, or dict taken by Booster.get_fscore()
...
max_num_features (int, default None) – Maximum number of top features displayed on plot. If None, all features will be displayed.

In my code, however, running:

booster_ = XGBClassifier(learning_rate=0.1, max_depth=3, n_estimators=100,
                         silent=False, objective='binary:logistic', nthread=-1,
                         gamma=0, min_child_weight=1, max_delta_step=0, subsample=1,
                         colsample_bytree=1, colsample_bylevel=1, reg_alpha=0,
                         reg_lambda=1, scale_pos_weight=1, base_score=0.5, seed=0)

booster_.fit(X_train, y_train)

from xgboost import plot_importance
plot_importance(booster_, max_num_features=10)

raises:

AttributeError: Unknown property max_num_features

When called without the max_num_features parameter, it correctly displays the entire set of features (which in my case is gigantic, ~10k features). Any ideas what's going on?

Thanks in advance.

More details:

> python -V
  Python 2.7.12 :: Anaconda custom (x86_64)

> pip freeze | grep xgboost
  xgboost==0.4a30
4 answers

Try updating the xgboost library to 0.6. That should solve the problem. To upgrade the package, try:

$ pip install -U xgboost

If you receive an error message, try the following:

$ brew install gcc@5
$ pip install -U xgboost

(see https://github.com/dmlc/xgboost/issues/1501)
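After upgrading, you can confirm whether the installed version actually exposes the parameter by inspecting the function's signature before calling it. The helper below is a generic sketch; `fake_plot_importance` is a made-up stand-in used here only so the demo runs without xgboost installed:

```python
import inspect

def supports_kwarg(func, name):
    """Return True if `func` declares a named parameter called `name`.

    Note: this checks declared parameters only; it deliberately ignores
    a catch-all **kwargs, which would silently accept any keyword.
    """
    return name in inspect.signature(func).parameters

# Stand-in with the same shape as xgboost.plot_importance (hypothetical):
def fake_plot_importance(booster, ax=None, max_num_features=None, **kwargs):
    pass

print(supports_kwarg(fake_plot_importance, "max_num_features"))  # True
print(supports_kwarg(fake_plot_importance, "no_such_option"))    # False
```

With a new enough xgboost installed, `supports_kwarg(xgboost.plot_importance, "max_num_features")` should return True.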

+5

- ( " API Python - xgboost 0.6" ), 0,6 xgboost. , , git.

The 0.6 release of xgboost was made on July 29, 2016:

This is a stable release of 0.6 version

@tqchen tqchen released this on Jul 29 2016 Β· 245 commits to master since this release

The commit that added max_num_features to plot_importance() was only made on January 16, 2017:

You can check for yourself that 0.60 does not contain it:

pushd /tmp
curl -SLO https://github.com/dmlc/xgboost/archive/v0.60.tar.gz
tar -xf v0.60.tar.gz 
grep num_features xgboost-0.60/python-package/xgboost/plotting.py
# .. silence: no matches, so plotting.py in 0.60 has no max_num_features

So to get this parameter you will need to install xgboost from its master branch (git) rather than from a release.

+2

Until you can upgrade, here is a (quick and dirty) script that does the same thing:

import matplotlib.pyplot as plt

def feat_imp(df, model, n_features):
    # Map column names to the model's importance scores.
    d = dict(zip(df.columns, model.feature_importances_))
    # Sort feature names by importance, descending, and keep the top n.
    ss = sorted(d, key=d.get, reverse=True)
    top_names = ss[0:n_features]

    plt.figure(figsize=(15, 15))
    plt.title("Feature importances")
    plt.bar(range(n_features), [d[i] for i in top_names], color="r", align="center")
    plt.xlim(-1, n_features)
    plt.xticks(range(n_features), top_names, rotation='vertical')
    plt.show()

feat_imp(filled_train_full, booster_, 20)


+1

Until you upgrade, there is a workaround. plot_importance() also accepts a dict, so you can build one yourself containing only the top features:

max_features = 50
top = dict(sorted(bst.get_fscore().items(), key=lambda x: x[1], reverse=True)[:max_features])
xgboost.plot_importance(top, ax=ax, height=0.8)

This works because plot_importance() accepts a dict and get_fscore() returns one: we sort the items by score, keep the top entries, and pass the resulting dict.
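As a toy illustration of the sort-and-slice step (the scores below are made up, not from a real model):

```python
# A made-up fscore dict of the shape get_fscore() returns: feature -> split count.
fscore = {'f0': 10, 'f1': 3, 'f2': 25, 'f3': 7}

# Sort items by score, descending, and keep only the top two entries.
top = dict(sorted(fscore.items(), key=lambda kv: kv[1], reverse=True)[:2])
print(top)  # {'f2': 25, 'f0': 10}
```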

Note also that get_fscore() only reports features that actually appear in at least one split, so certain features with zero importance will be missing from the dict entirely.

+1

Source: https://habr.com/ru/post/1670847/

