XGBoost and its integration with the sklearn feature_importances_ attribute

I use XGBoost via its sklearn wrapper.

Whenever I try to print feature_importances_, it fails with the following error:

ValueError: invalid literal for int() with base 10

Digging into the code, I found that the feature_importances_ property calls a method of the underlying booster with no arguments (get_fscore()). That method returns a dictionary similar to this:

 {'feat_name1': 5, 'feat_name2': 8, ..., 'feat_nameN': 1}

Since feature_importances_ then applies an int() conversion to those keys, the source of the error message becomes clear:

 keys = [int(k.replace('f', '')) for k in fs.keys()]  # this is the offending line of code
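To make the failure concrete, here is a minimal, self-contained sketch (the dictionary contents are invented for illustration) that reproduces the conversion error and shows one possible workaround: normalizing the raw scores from get_fscore() yourself instead of going through feature_importances_:

    # The dictionary returned by the booster, with made-up contents:
    fs = {'feat_name1': 5, 'feat_name2': 8, 'feat_nameN': 1}

    # The conversion feature_importances_ performs; it assumes every key
    # looks like 'f0', 'f1', ... and therefore chokes on real feature names:
    try:
        keys = [int(k.replace('f', '')) for k in fs.keys()]
    except ValueError as e:
        print(e)  # invalid literal for int() with base 10: 'eat_name1'

    # One possible workaround: normalize the raw scores yourself,
    # keyed by feature name rather than by index.
    total = sum(fs.values())
    importances = {name: score / total for name, score in fs.items()}
    print(importances)

On a fitted model from the sklearn wrapper, the dictionary itself comes from the underlying booster (the exact accessor has varied across XGBoost versions), so this workaround does not depend on the broken property.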

So my question here is twofold:

1. Is this a bug, and should I therefore report it (or even fix it and submit a pull request)?

2. Is there something I am missing about the get_fscore function and its fmap parameter?
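For context on the second question: the fmap parameter points get_fscore at a feature-map file that maps feature indices to names and types. Below is a minimal sketch of the usual pattern; the file name, feature names, and synthetic data are all hypothetical, and the format assumed is one whitespace-separated "<index> <name> <type>" line per feature, with type 'q' (quantitative), 'i' (indicator), or 'int' (integer):

    import numpy as np
    import xgboost as xgb

    # Hypothetical feature-map file describing three features.
    with open('featmap.txt', 'w') as f:
        f.write('0\tage\tq\n')
        f.write('1\tis_member\ti\n')
        f.write('2\tnum_visits\tint\n')

    # Tiny synthetic dataset whose columns match the declared types.
    rng = np.random.RandomState(0)
    X = np.column_stack([
        rng.rand(100),            # age: quantitative
        rng.randint(0, 2, 100),   # is_member: 0/1 indicator
        rng.randint(0, 10, 100),  # num_visits: integer counts
    ]).astype(float)
    y = (X[:, 0] > 0.5).astype(int)

    bst = xgb.train({'objective': 'binary:logistic'},
                    xgb.DMatrix(X, label=y), num_boost_round=5)

    # With fmap, the scores come back keyed by the mapped names
    # instead of the default 'f0', 'f1', 'f2'; features never used
    # in a split are omitted from the dictionary.
    print(bst.get_fscore(fmap='featmap.txt'))

If the booster already carries real feature names (for instance because it was trained with such a map), get_fscore() returns a name-keyed dictionary even without the fmap argument, which may be exactly what trips up the int() conversion above.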

1 answer

I suggest reporting this as a bug in the XGBoost GitHub issue tracker: https://github.com/dmlc/xgboost/issues

