# microGBT

microGBT is a minimalistic Gradient Boosting Trees implementation.
microGBT is a minimalistic (606 LOC) Gradient Boosting Trees implementation in C++11 following xgboost's paper, i.e., the tree building process is based on the gradient and Hessian vectors (Newton-Raphson method).
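To make the Newton-Raphson tree building concrete, here is a NumPy sketch of the optimal leaf weight and split gain formulas from the xgboost paper. This is an illustration of the underlying math, not microGBT's actual C++ code; `lam` and `gamma` stand in for the regularization parameters:

```python
import numpy as np

def leaf_weight(g, h, lam):
    """Newton-Raphson optimal leaf value: w* = -G / (H + lambda),
    where G and H are the sums of per-instance gradients and Hessians."""
    return -np.sum(g) / (np.sum(h) + lam)

def split_gain(g_left, h_left, g_right, h_right, lam, gamma):
    """Gain of splitting a node into (left, right), per the xgboost paper:
    0.5 * [GL^2/(HL+lam) + GR^2/(HR+lam) - (GL+GR)^2/(HL+HR+lam)] - gamma."""
    def score(g, h):
        return np.sum(g) ** 2 / (np.sum(h) + lam)
    parent_score = score(np.concatenate([g_left, g_right]),
                         np.concatenate([h_left, h_right]))
    return 0.5 * (score(g_left, h_left) + score(g_right, h_right)
                  - parent_score) - gamma
```

A split is accepted only when its gain exceeds the `min_split_gain` threshold; `gamma` penalizes adding a leaf, which regularizes tree growth.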
A minimalist Python API is available using pybind11. To use it,

```python
import microgbtpy

params = {"gamma": 0.1, "lambda": 1.0, "max_depth": 4.0,
          "shrinkage_rate": 1.0, "min_split_gain": 0.1,
          "learning_rate": 0.1, "min_tree_size": 3,
          "num_boosting_rounds": 100.0, "metric": 0.0}

gbt = microgbtpy.GBT(params)

# Training
gbt.train(X_train, y_train, X_valid, y_valid, num_iters, early_stopping_rounds)

# Predict
y_pred = gbt.predict(x, gbt.best_iteration())
```
The main goal of the project is to be educational and provide a minimalistic codebase that allows experimentation with Gradient Boosting Trees.
Currently, the following loss functions are supported:

- Logistic loss for binary classification (`logloss.h`),
- Root Mean Squared Error (RMSE) for regression (`rmse.h`).
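For the logistic loss, the per-instance gradient and Hessian with respect to the raw (pre-sigmoid) score have standard closed forms. The following Python sketch illustrates them; the actual C++ implementation lives in `logloss.h`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logloss_grad_hess(raw_score, y):
    """Gradient and Hessian of the logistic loss with respect to
    the raw score, for label y in {0, 1}."""
    p = sigmoid(raw_score)
    grad = p - y            # first derivative
    hess = p * (1.0 - p)    # second derivative
    return grad, hess
```

These per-instance gradients and Hessians are exactly the vectors the tree building process consumes at each boosting round.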
Set the parameter `metric` to 0.0 for logistic regression and to 1.0 for RMSE, respectively.
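As an illustration, the parameter dictionary from the Python example above can be switched to regression by flipping `metric` (a sketch; parameter keys as shown earlier in this README):

```python
# Same parameters as the classification example, but metric=1.0 selects RMSE.
params = {"gamma": 0.1, "lambda": 1.0, "max_depth": 4.0,
          "shrinkage_rate": 1.0, "min_split_gain": 0.1,
          "learning_rate": 0.1, "min_tree_size": 3,
          "num_boosting_rounds": 100.0,
          "metric": 1.0}  # 0.0 = logistic loss, 1.0 = RMSE
```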
To install locally:

```shell
pip install git+https://github.com/zouzias/microgbt.git
```
Then, follow the instructions below to run the Titanic classification example. To build from source using Docker:

```shell
git clone https://github.com/zouzias/microgbt.git
cd microgbt
docker-compose build microgbt
docker-compose run microgbt ./runBuild
```
A binary classification example using the Titanic dataset. Run

```shell
cd examples/
./test-titanic.py
```
the output should include

```
              precision    recall  f1-score   support

           0       0.75      0.96      0.84        78
           1       0.91      0.55      0.69        56

   micro avg       0.79      0.79      0.79       134
   macro avg       0.83      0.76      0.77       134
weighted avg       0.82      0.79      0.78       134
```
To run the LightGBM regression example, type

```shell
cd examples/
./test-lightgbm-example.py
```
the output should end with

```
2019-05-19 22:54:04,825 - __main__ - INFO - *************[Testing]*************
2019-05-19 22:54:04,825 - __main__ - INFO - ******************************
2019-05-19 22:54:04,825 - __main__ - INFO - * [Testing]RMSE=0.447120
2019-05-19 22:54:04,826 - __main__ - INFO - * [Testing]R^2-Score=0.194094
2019-05-19 22:54:04,826 - __main__ - INFO - ******************************
```