
microGBT

microGBT is a minimalistic (606 LOC) Gradient Boosting Trees implementation in C++11 following xgboost's paper, i.e., the tree-building process is based on the gradient and Hessian vectors (Newton-Raphson method).
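The core quantities of that tree-building process are the optimal leaf weight and the split gain from the xgboost paper. A minimal sketch of the standard formulas in plain Python (not microGBT's actual code; `reg_lambda` and `gamma` play the role of the `lambda` and `gamma` parameters below):

```python
def leaf_weight(G, H, reg_lambda):
    """Optimal leaf weight w* = -G / (H + lambda) for gradient sum G, Hessian sum H."""
    return -G / (H + reg_lambda)

def split_gain(G_left, H_left, G_right, H_right, reg_lambda, gamma):
    """xgboost gain: 1/2 * [G_L^2/(H_L+l) + G_R^2/(H_R+l) - (G_L+G_R)^2/(H_L+H_R+l)] - gamma."""
    def score(G, H):
        return G * G / (H + reg_lambda)
    G, H = G_left + G_right, H_left + H_right
    return 0.5 * (score(G_left, H_left) + score(G_right, H_right) - score(G, H)) - gamma
```

A split is kept only if its gain is positive, which is where the `min_split_gain` and `gamma` parameters come into play.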

A minimalist Python API is available using pybind11. To use it:

```python
import microgbtpy

params = {
    "gamma": 0.1,
    "lambda": 1.0,
    "max_depth": 4.0,
    "shrinkage_rate": 1.0,
    "min_split_gain": 0.1,
    "learning_rate": 0.1,
    "min_tree_size": 3,
    "num_boosting_rounds": 100.0,
    "metric": 0.0,
}
gbt = microgbtpy.GBT(params)

# Training
gbt.train(X_train, y_train, X_valid, y_valid, num_iters, early_stopping_rounds)

# Predict
y_pred = gbt.predict(x, gbt.best_iteration())
```
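Conceptually, each boosting round fits a tree to the current gradient/Hessian vectors and adds it scaled by the learning rate. A minimal one-dimensional sketch of that loop in plain Python, using a one-leaf "stump" per round for squared error (an illustration of the idea, not microGBT's internals):

```python
def train_constant_boosting(y, num_rounds, learning_rate, reg_lambda=1.0):
    """Newton-style boosting where each 'tree' is a single constant leaf."""
    pred = [0.0] * len(y)
    model = []  # one (scaled) leaf weight per boosting round
    for _ in range(num_rounds):
        # Squared-error loss: gradient g_i = pred_i - y_i, Hessian h_i = 1.
        G = sum(p - t for p, t in zip(pred, y))
        H = float(len(y))
        w = -G / (H + reg_lambda)  # Newton step: optimal leaf weight
        model.append(learning_rate * w)
        pred = [p + learning_rate * w for p in pred]
    return model, pred
```

With enough rounds this constant model converges to the mean of `y`; real trees additionally split the feature space so different regions get different leaf weights.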

Goals

The main goal of the project is to be educational and provide a minimalistic codebase that allows experimentation with Gradient Boosting Trees.

Features

Currently, the following loss functions are supported:

  • Logistic loss for binary classification, logloss.h
  • Root Mean Squared Error (RMSE) for regression, rmse.h

Set the parameter metric to 0.0 and 1.0 for logistic regression and RMSE, respectively.
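For either loss, each boosting round only needs the per-sample gradient and Hessian of the loss with respect to the raw score. These are the standard textbook expressions (not copied from logloss.h or rmse.h):

```python
import math

def logistic_grad_hess(raw_score, label):
    """Gradient and Hessian of the logistic loss w.r.t. the raw score."""
    p = 1.0 / (1.0 + math.exp(-raw_score))  # sigmoid maps raw score to probability
    return p - label, p * (1.0 - p)

def rmse_grad_hess(pred, label):
    """Gradient and Hessian of squared error (up to constant factors)."""
    return pred - label, 1.0
```

The constant Hessian for squared error is why plain gradient boosting and Newton boosting coincide for RMSE, while for logistic loss the Hessian `p*(1-p)` down-weights confidently classified samples.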

Installation

To install locally

```
pip install git+https://github.com/zouzias/microgbt.git
```

Then, follow the instructions below to run the Titanic classification example.

Development (docker)

```
git clone https://github.com/zouzias/microgbt.git
cd microgbt
docker-compose build microgbt
docker-compose run microgbt ./runBuild
```

Binary Classification (Titanic)

A binary classification example using the Titanic dataset. Run

```
cd examples/
./test-titanic.py
```

the output should include

```
              precision    recall  f1-score   support

           0       0.75      0.96      0.84        78
           1       0.91      0.55      0.69        56

   micro avg       0.79      0.79      0.79       134
   macro avg       0.83      0.76      0.77       134
weighted avg       0.82      0.79      0.78       134
```

Regression Example (LightGBM)

To run the LightGBM regression example, type

```
cd examples/
./test-lightgbm-example.py
```

the output should end with

```
2019-05-19 22:54:04,825 - __main__ - INFO - *************[Testing]*************
2019-05-19 22:54:04,825 - __main__ - INFO - ******************************
2019-05-19 22:54:04,825 - __main__ - INFO - * [Testing]RMSE=0.447120
2019-05-19 22:54:04,826 - __main__ - INFO - * [Testing]R^2-Score=0.194094
2019-05-19 22:54:04,826 - __main__ - INFO - ******************************
```
