Open Access
March 2010
BART: Bayesian additive regression trees
Hugh A. Chipman, Edward I. George, Robert E. McCulloch
Ann. Appl. Stat. 4(1): 266-298 (March 2010). DOI: 10.1214/09-AOAS285

Abstract

We develop a Bayesian “sum-of-trees” model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm that generates samples from a posterior. Effectively, BART is a nonparametric Bayesian regression approach which uses dimensionally adaptive random basis elements. Motivated by ensemble methods in general, and boosting algorithms in particular, BART is defined by a statistical model: a prior and a likelihood. This approach enables full posterior inference including point and interval estimates of the unknown regression function as well as the marginal effects of potential predictors. By keeping track of predictor inclusion frequencies, BART can also be used for model-free variable selection. BART’s many features are illustrated with a bake-off against competing methods on 42 different data sets, with a simulation experiment and on a drug discovery classification problem.
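
For reference, the sum-of-trees model described in the abstract can be written as follows (a minimal sketch; the notation g(x; T_j, M_j) for a single tree's contribution and the Gaussian error term follow the standard BART formulation, and the number of trees m is a user-chosen quantity):

Y = \sum_{j=1}^{m} g(x; T_j, M_j) + \epsilon, \qquad \epsilon \sim N(0, \sigma^2)

Here T_j denotes the j-th binary regression tree, M_j the set of terminal-node values attached to T_j, and g(x; T_j, M_j) the terminal-node value that tree assigns to input x. The regularization prior mentioned in the abstract shrinks each tree's contribution so that each acts as a weak learner, and the Bayesian backfitting MCMC sampler updates one (T_j, M_j) pair at a time conditional on the remaining trees.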

Citation


Hugh A. Chipman, Edward I. George, Robert E. McCulloch. "BART: Bayesian additive regression trees." Ann. Appl. Stat. 4(1): 266-298, March 2010. https://doi.org/10.1214/09-AOAS285

Information

Published: March 2010
First available in Project Euclid: 11 May 2010

zbMATH: 1189.62066
MathSciNet: MR2758172
Digital Object Identifier: 10.1214/09-AOAS285

Keywords: Bayesian backfitting, boosting, CART, classification, ensemble, MCMC, nonparametric regression, probit model, random basis, regularization, sum-of-trees model, variable selection, weak learner

Rights: Copyright © 2010 Institute of Mathematical Statistics
