XGBoost xgb.dump tree coefficients

I have some sample code here.

library(xgboost)

data(agaricus.train, package = 'xgboost')
train <- agaricus.train
# Train a small model with two depth-2 trees on the mushroom data
bst <- xgboost(data = train$data, label = train$label, max.depth = 2,
               eta = 1, nthread = 2, nround = 2, objective = "binary:logistic")
# Dump the trees as text, including gain and cover statistics
xgb.dump(bst, 'xgb.model.dump', with.stats = TRUE)

After creating the model, the dump looks like this:

booster[0]
0:[f28<-1.00136e-05] yes=1,no=2,missing=1,gain=4000.53,cover=1628.25
    1:[f55<-1.00136e-05] yes=3,no=4,missing=3,gain=1158.21,cover=924.5
        3:leaf=1.71218,cover=812
        4:leaf=-1.70044,cover=112.5
    2:[f108<-1.00136e-05] yes=5,no=6,missing=5,gain=198.174,cover=703.75
        5:leaf=-1.94071,cover=690.5
        6:leaf=1.85965,cover=13.25
booster[1]
0:[f59<-1.00136e-05] yes=1,no=2,missing=1,gain=832.545,cover=788.852
    1:[f28<-1.00136e-05] yes=3,no=4,missing=3,gain=569.725,cover=768.39
        3:leaf=0.784718,cover=458.937
        4:leaf=-0.96853,cover=309.453
    2:leaf=-6.23624,cover=20.4624
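
For reference, the same tree information can also be read into a data.table with xgb.model.dt.tree, which is easier to scan than the text dump. A minimal sketch, assuming the bst model trained above:

# Sketch: tabular view of the dumped trees, assuming the bst model from above
library(xgboost)
dt <- xgb.model.dt.tree(feature_names = colnames(train$data), model = bst)
print(dt)  # columns such as Tree, Node, Feature, Split, Yes, No, Missing, Quality, Cover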

I have some questions:

  • I understand that gradient boosting combines the outputs of these trees using some weight coefficients. How can I get these coefficients?

  • Just to clarify: the value predicted by each tree is the leaf = x value, isn't it?

Thanks.

1 answer

Combined answer for Q1 and Q2:

The coefficient for every tree in xgboost is 1. Just sum all the leaf scores. Let the sum be S. Then apply the logistic (2-class) function to it: Pr(label = 1) = 1 / (1 + exp(-S)).
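
A minimal sketch of how to check this in R, assuming the bst model from the question: predict() with outputmargin = TRUE returns the raw sum of leaf scores S, and applying the logistic function to it reproduces the probabilities returned by a plain predict() call.

# Sketch: verify Pr(label = 1) = 1 / (1 + exp(-S)), assuming bst and train from the question
margin <- predict(bst, train$data, outputmargin = TRUE)  # S = sum of the leaf scores
prob   <- predict(bst, train$data)                       # Pr(label = 1)
all.equal(prob, 1 / (1 + exp(-margin)))                  # should be TRUE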


Source: https://habr.com/ru/post/1607492/
