# embed

Extra recipes for predictor embeddings
License: MIT
embed has extra steps for the recipes package for embedding predictors into one or more numeric columns. Almost all of the preprocessing methods are supervised.
These steps are available here in a separate package because the step dependencies, rstanarm, lme4, and keras3, are fairly heavy.
Some steps handle categorical predictors:
- `step_lencode_glm()`, `step_lencode_bayes()`, and `step_lencode_mixed()` estimate the effect of each of the factor levels on the outcome, and these estimates are used as the new encoding. The estimates are produced by a generalized linear model. This step can be executed without pooling (via `glm`) or with partial pooling (`stan_glm` or `lmer`). Currently implemented for numeric and two-class outcomes.
- `step_embed()` uses `keras3::layer_embedding` to translate the original *C* factor levels into a set of *D* new variables (*D* < *C*). The model fitting routine optimizes which factor levels are mapped to each of the new variables as well as the corresponding regression coefficients (i.e., neural network weights) that will be used as the new encodings.
- `step_woe()` creates new variables based on weight-of-evidence encodings.
- `step_feature_hash()` can create indicator variables using feature hashing.
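As a minimal sketch of the supervised encodings above, the following uses `step_lencode_glm()` on the `ames` housing data from the modeldata package (the data set and column names are assumptions for illustration; embed, recipes, and modeldata must be installed):

```r
library(recipes)
library(embed)
data(ames, package = "modeldata")

# Replace the high-cardinality Neighborhood factor with a single numeric
# column whose values are per-level effect estimates from a GLM fit
# against the outcome Sale_Price (no pooling).
rec <- recipe(Sale_Price ~ Neighborhood + Gr_Liv_Area, data = ames) |>
  step_lencode_glm(Neighborhood, outcome = vars(Sale_Price)) |>
  prep()

# After prep(), the factor column comes back as its numeric encoding:
bake(rec, new_data = NULL) |> head()
```

Swapping in `step_lencode_bayes()` or `step_lencode_mixed()` with the same arguments would give the partially pooled versions.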
For numeric predictors:
- `step_umap()` uses a nonlinear transformation similar to t-SNE but can be used to project the transformation onto new data. Both supervised and unsupervised methods can be used.
- `step_discretize_xgb()` and `step_discretize_cart()` can make binned versions of numeric predictors using supervised tree-based models.
- `step_pca_sparse()` and `step_pca_sparse_bayes()` conduct feature extraction with sparsity of the component loadings.
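A quick sketch of the supervised use of `step_umap()`, using the built-in `iris` data for illustration (embed and its uwot dependency are assumed to be installed):

```r
library(recipes)
library(embed)

# Project the four numeric iris measurements into two UMAP components;
# supplying the outcome makes the embedding supervised.
rec <- recipe(Species ~ ., data = iris) |>
  step_umap(all_numeric_predictors(),
            outcome = vars(Species),
            num_comp = 2) |>
  prep()

# The numeric predictors are replaced by the UMAP columns:
bake(rec, new_data = NULL) |> head()
```

Omitting `outcome` gives the unsupervised embedding, and `bake()` on new data projects it through the fitted transformation.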
Some references for these methods are:
- Francois C and Allaire JJ (2018) *Deep Learning with R*, Manning.
- Guo C and Berkhahn F (2016) "Entity Embeddings of Categorical Variables."
- Micci-Barreca D (2001) "A preprocessing scheme for high-cardinality categorical attributes in classification and prediction problems," *ACM SIGKDD Explorations Newsletter*, 3(1), 27-32.
- Zumel N and Mount J (2017) "vtreat: a data.frame Processor for Predictive Modeling."
- McInnes L and Healy J (2018) "UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction."
- Good IJ (1985) "Weight of evidence: A brief survey," *Bayesian Statistics*, 2, pp. 249-270.
There are two articles that walk through how to use these embedding steps, using generalized linear models and neural networks built via TensorFlow.
## Installation

To install the package:
```r
install.packages("embed")
```

Note that to use some steps, you will also have to install other packages such as rstanarm and lme4. For all of the steps to work, you may want to use:
```r
install.packages(c("rpart", "xgboost", "rstanarm", "lme4"))
```
To get a bug fix or to use a feature from the development version, youcan install the development version of this package from GitHub.
```r
# install.packages("pak")
pak::pak("tidymodels/embed")
```
## Contributing

This project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.
For questions and discussions about tidymodels packages, modeling, and machine learning, please post on RStudio Community.

If you think you have encountered a bug, please submit an issue.

Either way, learn how to create and share a reprex (a minimal, reproducible example), to clearly communicate about your code.

Check out further details on contributing guidelines for tidymodels packages and how to get help.