Boosting


Definition

Boosting is a family of ensemble methods [13] that produces a strong learner capable of making very accurate predictions by combining rough and moderately inaccurate learners (called base learners or weak learners). In particular, boosting sequentially trains a series of base learners with a base learning algorithm, where the training examples wrongly predicted by one base learner receive more attention from the next base learner. It then produces the final strong learner as a weighted combination of these base learners.
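The best-known realization of this idea is the AdaBoost algorithm of Freund and Schapire [3]. The following is a minimal illustrative sketch, not pseudocode from this entry: it assumes binary labels in {-1, +1} and decision stumps as base learners, and the function names and the n_rounds parameter are chosen here purely for illustration.

```python
# Minimal AdaBoost-style sketch (illustrative; assumes labels y in {-1, +1}).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Sequentially train weak learners, reweighting misclassified examples."""
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start with uniform example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)   # base (weak) learner
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])                    # weighted training error
        if err >= 0.5:                                # no better than random guessing
            break
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)         # this learner's vote weight
        w *= np.exp(-alpha * y * pred)                # up-weight wrongly predicted examples
        w /= w.sum()                                  # renormalize to a distribution
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    """Final strong learner: sign of the weighted combination of base learners."""
    agg = sum(a * h.predict(X) for a, h in zip(alphas, learners))
    return np.sign(agg)
```

In practice one would typically rely on a library implementation such as scikit-learn's sklearn.ensemble.AdaBoostClassifier, which packages the same train/reweight/combine loop behind a standard fit/predict interface.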

Historical Background

In 1989, Kearns and Valiant posed an interesting theoretical question: whether two complexity classes, the weakly learnable and the strongly learnable problems, are equal. In other words, whether a weak learning algorithm that performs just slightly better than random guessing can be boosted into an arbitrarily accurate strong learning algorithm. In 1990, Schapire [9] proved that the answer to the...

Recommended Reading

  1. Bauer E, Kohavi R. An empirical comparison of voting classification algorithms: bagging, boosting, and variants. Mach Learn. 1999;36(1–2):105–39.
  2. Breiman L. Prediction games and arcing classifiers. Neural Comput. 1999;11(7):1493–517.
  3. Freund Y, Schapire RE. A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci. 1997;55(1):119–39 (a short version appeared in the Proceedings of EuroCOLT'95).
  4. Friedman J, Hastie T, Tibshirani R. Additive logistic regression: a statistical view of boosting (with discussions). Ann Stat. 2000;28(2):337–407.
  5. Gao W, Zhou Z-H. On the doubt about margin explanation of boosting. Artif Intell. 2013;203:1–18.
  6. Meir R, Rätsch G. An introduction to boosting and leveraging. In: Mendelson S, Smola AJ, editors. Advanced lectures in machine learning. LNCS vol. 2600. Berlin: Springer; 2003. p. 118–83.
  7. Opitz D, Maclin R. Popular ensemble methods: an empirical study. J Artif Intell Res. 1999;11(1):169–98.
  8. Reyzin L, Schapire RE. How boosting the margin can also boost classifier complexity. In: Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh; 2006. p. 753–60.
  9. Schapire RE. The strength of weak learnability. Mach Learn. 1990;5(2):197–227.
  10. Schapire RE, Freund Y, Bartlett P, Lee WS. Boosting the margin: a new explanation for the effectiveness of voting methods. Ann Stat. 1998;26(5):1651–86.
  11. Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition; 2001. p. 511–8.
  12. Wang L, Sugiyama M, Yang C, Zhou Z-H, Feng J. On the margin explanation of boosting algorithm. In: Proceedings of the 21st Annual Conference on Learning Theory; 2008. p. 479–90.
  13. Zhou Z-H. Ensemble methods: foundations and algorithms. Boca Raton: CRC Press; 2012.
  14. Zhou Z-H. Large margin distribution learning. In: Proceedings of Artificial Neural Networks in Pattern Recognition; 2014.


Author information

Authors and Affiliations

  1. National Key Lab for Novel Software Technology, Nanjing University, Nanjing, China

    Zhi-Hua Zhou


Corresponding author

Correspondence to Zhi-Hua Zhou.

Editor information

Editors and Affiliations

  1. Georgia Institute of Technology College of Computing, Atlanta, GA, USA

    Ling Liu

  2. University of Waterloo School of Computer Science, Waterloo, ON, Canada

    M. Tamer Özsu

Section Editor information

  1. School of Electrical Engineering and Computer Science, Seoul National University, Seoul, Republic of Korea

    Kyuseok Shim

Copyright information

© 2018 Springer Science+Business Media, LLC, part of Springer Nature

About this entry


Cite this entry

Zhou, ZH. (2018). Boosting. In: Liu, L., Özsu, M.T. (eds) Encyclopedia of Database Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8265-9_568
