LightGBM

From Wikipedia, the free encyclopedia
Microsoft open source gradient boosting framework for machine learning
Original author: Guolin Ke[1] / Microsoft Research
Developers: Microsoft and LightGBM contributors[2]
Initial release: 2016
Stable release: v4.3.0[3] / January 15, 2024
Written in: C++, Python, R, C
Operating system: Windows, macOS, Linux
Type: Machine learning, gradient boosting framework
License: MIT License
Website: lightgbm.readthedocs.io
Repository: github.com/microsoft/LightGBM

LightGBM, short for Light Gradient-Boosting Machine, is a free and open-source distributed gradient-boosting framework for machine learning, originally developed by Microsoft.[4][5] It is based on decision tree algorithms and is used for ranking, classification and other machine learning tasks. The development focus is on performance and scalability.

Overview


The LightGBM framework supports different algorithms, including GBT, GBDT, GBRT, GBM, MART[6][7] and RF.[8] LightGBM has many of XGBoost's advantages, including sparse optimization, parallel training, multiple loss functions, regularization, bagging, and early stopping. A major difference between the two lies in the construction of trees. LightGBM does not grow a tree level-wise (row by row) as most other implementations do.[9] Instead, it grows trees leaf-wise, choosing the leaf with the maximum delta loss to grow.[10] In addition, LightGBM does not use the widely used sorted-based decision tree learning algorithm, which searches for the best split point on sorted feature values,[11] as XGBoost and other implementations do. Instead, it implements a highly optimized histogram-based decision tree learning algorithm, which yields significant advantages in both efficiency and memory consumption.[12] The LightGBM algorithm also uses two novel techniques, Gradient-Based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB), which allow it to run faster while maintaining a high level of accuracy.[13]
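The leaf-wise growth strategy described above can be illustrated with a minimal sketch in Python. This is a toy, not LightGBM's internal API: the function name is invented, and the child gains are placeholder values standing in for the gains that histogram-based split finding would compute. The point is the selection rule itself: at each step, the single leaf whose best split promises the largest loss reduction is expanded, rather than expanding an entire level at a time.

```python
def grow_leaf_wise(candidate_leaves, max_leaves):
    """Toy best-first (leaf-wise) tree growth.

    candidate_leaves: dict mapping leaf id -> estimated loss reduction
    ("delta loss") of that leaf's best split.
    Returns the order in which leaves are expanded.
    """
    expansion_order = []
    leaves = dict(candidate_leaves)
    next_id = max(leaves) + 1
    while leaves and len(expansion_order) < max_leaves:
        # Pick the leaf with the maximum delta loss.
        best = max(leaves, key=leaves.get)
        gain = leaves.pop(best)
        if gain <= 0:  # no split improves the loss; stop growing
            break
        expansion_order.append(best)
        # Splitting creates two children; their gains would come from
        # histogram-based split finding (placeholder fractions here).
        leaves[next_id] = gain * 0.5
        leaves[next_id + 1] = gain * 0.3
        next_id += 2
    return expansion_order
```

Because the loop always takes the global best leaf, a deep, unbalanced tree can emerge when one branch keeps offering the largest gains, which is why LightGBM pairs leaf-wise growth with a maximum-depth constraint in practice.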

LightGBM works on Linux, Windows, and macOS and supports C++, Python,[14] R, and C#.[15] The source code is licensed under the MIT License and available on GitHub.[16]

Gradient-based one-side sampling


When using gradient descent, one thinks of the space of possible configurations of the model as a valley, in which the lowest part of the valley is the model that most closely fits the data. In this metaphor, one walks in different directions to learn how much lower the valley becomes.

Typically, in gradient descent, one uses the whole set of data to calculate the valley's slopes. However, this commonly used method assumes that every data point is equally informative.

By contrast, Gradient-Based One-Side Sampling (GOSS), a method first developed for gradient-boosted decision trees, does not rely on the assumption that all data points are equally informative. Instead, it treats data points with smaller gradients (shallower slopes) as less informative by randomly dropping them. This is intended to filter out data which may have been influenced by noise, allowing the model to more accurately capture the underlying relationships in the data.[13]
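The sampling step can be sketched as follows. This is an illustrative implementation of the GOSS idea from the paper, not LightGBM's internal code; the function and parameter names (`top_rate`, `other_rate`) are assumptions. All instances with large absolute gradients are kept; a random subset of the remaining small-gradient instances is sampled and re-weighted so that the overall data distribution is roughly preserved.

```python
import random

def goss_sample(gradients, top_rate=0.2, other_rate=0.1, seed=0):
    """Toy Gradient-Based One-Side Sampling.

    Keeps the top_rate fraction of instances by |gradient|, randomly
    samples an other_rate fraction of the rest, and up-weights the
    sampled small-gradient instances by (1 - top_rate) / other_rate.
    Returns a dict mapping kept instance index -> weight.
    """
    rng = random.Random(seed)
    n = len(gradients)
    # Indices sorted by descending absolute gradient.
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    n_top = int(n * top_rate)
    n_other = int(n * other_rate)
    top = order[:n_top]                  # large-gradient instances: always kept
    rest = order[n_top:]                 # small-gradient instances
    sampled = rng.sample(rest, n_other)  # random subset of the rest
    weight = (1.0 - top_rate) / other_rate
    weights = {i: 1.0 for i in top}
    weights.update({i: weight for i in sampled})
    return weights
```

With `top_rate=0.2` and `other_rate=0.1`, only 30% of the data is used per iteration, yet the up-weighting keeps the sampled subset approximately unbiased as an estimate of the full-data information gain.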

Exclusive feature bundling


Exclusive feature bundling (EFB) is a near-lossless method of reducing the number of effective features. In a sparse feature space, many features are nearly exclusive, meaning they rarely take nonzero values simultaneously; one-hot encoded features are a canonical example. EFB bundles such features together, reducing dimensionality to improve efficiency while maintaining a high level of accuracy. The result of merging a set of exclusive features into a single feature is called an exclusive feature bundle.[13]
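A minimal sketch of the idea, with invented function names and a simplified greedy grouping (the paper's version tolerates a small conflict budget; this toy requires strict exclusivity): columns that are never nonzero on the same row are grouped, then merged into one column with a per-feature offset so values from different features remain distinguishable.

```python
def find_exclusive_bundles(columns):
    """Greedily group columns so that, within each bundle, no two
    columns are nonzero on the same row (strict exclusivity)."""
    bundles = []  # each entry: (member column indices, union of nonzero rows)
    for j, col in enumerate(columns):
        nz = {i for i, v in enumerate(col) if v != 0}
        for members, used in bundles:
            if not (nz & used):  # exclusive with every column already in bundle
                members.append(j)
                used.update(nz)
                break
        else:
            bundles.append(([j], nz))
    return [members for members, _ in bundles]

def merge_bundle(columns, bundle):
    """Merge one bundle into a single feature, offsetting each member's
    nonzero values so they occupy disjoint ranges (near-lossless)."""
    merged = [0] * len(columns[0])
    offset = 0
    for j in bundle:
        for i, v in enumerate(columns[j]):
            if v != 0:
                merged[i] = offset + v
        offset += max(columns[j])
    return merged
```

For three one-hot columns, the bundle collapses them into a single column whose value (1, 2, or 3 here) identifies which of the original features fired, so histogram construction touches one feature instead of three.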

References

  1. ^ "Guolin Ke". GitHub.
  2. ^ "microsoft/LightGBM". GitHub. 7 July 2022.
  3. ^ "Releases · microsoft/LightGBM". GitHub.
  4. ^ Brownlee, Jason (March 31, 2020). "Gradient Boosting with Scikit-Learn, XGBoost, LightGBM, and CatBoost".
  5. ^ Kopitar, Leon; Kocbek, Primoz; Cilar, Leona; Sheikh, Aziz; Stiglic, Gregor (July 20, 2020). "Early detection of type 2 diabetes mellitus using machine learning-based prediction models". Scientific Reports. 10 (1): 11981. Bibcode:2020NatSR..1011981K. doi:10.1038/s41598-020-68771-z. PMC 7371679. PMID 32686721.
  6. ^ "Understanding LightGBM Parameters (and How to Tune Them)". neptune.ai. May 6, 2020.
  7. ^ "An Overview of LightGBM". avanwyk. May 16, 2018.
  8. ^ "Parameters — LightGBM 3.0.0.99 documentation". lightgbm.readthedocs.io.
  9. ^ "The Gradient Boosters IV: LightGBM". Deep & Shallow.
  10. ^ "Features". LightGBM Official Documentation. Nov 3, 2024.
  11. ^ Mehta, Manish; Agrawal, Rakesh; Rissanen, Jorma (Nov 24, 2020). "SLIQ: A fast scalable classifier for data mining". International Conference on Extending Database Technology: 18–32. CiteSeerX 10.1.1.89.7734.
  12. ^ "Features — LightGBM 3.1.0.99 documentation". lightgbm.readthedocs.io.
  13. ^ a b c Ke, Guolin; Meng, Qi; Finley, Thomas; Wang, Taifeng; Chen, Wei; Ma, Weidong; Ye, Qiwei; Liu, Tie-Yan (2017). "LightGBM: A Highly Efficient Gradient Boosting Decision Tree". Advances in Neural Information Processing Systems. 30.
  14. ^ "lightgbm: LightGBM Python Package". 7 July 2022 – via PyPI.
  15. ^ "Microsoft.ML.Trainers.LightGbm Namespace". docs.microsoft.com.
  16. ^ "microsoft/LightGBM". October 6, 2020 – via GitHub.

Retrieved from "https://en.wikipedia.org/w/index.php?title=LightGBM&oldid=1327447676"