Slicing Models

Slice tree model

When booster is set to gbtree or dart, XGBoost builds a tree model, which is a list of trees and can be sliced into multiple sub-models.

import xgboost as xgb
from sklearn.datasets import make_classification

num_classes = 3
X, y = make_classification(n_samples=1000, n_informative=5, n_classes=num_classes)
dtrain = xgb.DMatrix(data=X, label=y)

num_parallel_tree = 4
num_boost_round = 16
# The total number of built trees is num_parallel_tree * num_classes * num_boost_round.
# We build a boosted random forest for classification here.
booster = xgb.train(
    {'num_parallel_tree': num_parallel_tree, 'subsample': 0.5, 'num_class': num_classes},
    num_boost_round=num_boost_round,
    dtrain=dtrain,
)

# This is the sliced model, containing the [3, 7) tree layers.
# A step is also supported, with some limitations: for example, a negative step is invalid.
sliced: xgb.Booster = booster[3:7]

# Access individual tree layers by iterating over the model.
trees = [_ for _ in booster]
assert len(trees) == num_boost_round

The sliced model is a copy of the selected trees, which means the original model is not modified during slicing. This feature is the basis of the save_best option in the early stopping callback. See Demo for prediction using individual trees and model slices for a worked example of combining prediction with sliced trees.

Note

The returned model slice doesn't contain attributes like best_iteration and best_score.