A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation
Frank Wood, Yee Whye Teh; Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:607-614, 2009.
Abstract
In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the “adaptation” of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.
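The building block of the model described above is the Pitman-Yor process, whose predictive distribution has a Chinese-restaurant-process form: probability mass for a word comes partly from how often it has been seen in the current context and partly from a discounted back-off to a parent (base) distribution, which is how the hierarchy shares statistical strength. The sketch below illustrates that predictive rule for a toy two-level bigram model. It is not the paper's model or inference scheme: the one-table-per-word simplification, the fixed hyperparameters (discount 0.5, concentration 1.0), and the naive "add every observation to the parent" update are assumptions made for brevity.

```python
# Illustrative Chinese-restaurant-process sketch of Pitman-Yor back-off
# (toy simplification; not the paper's graphical Pitman-Yor process).
from collections import defaultdict

class PitmanYorCRP:
    """Predictive distribution of a Pitman-Yor process with discount d
    and concentration theta, backing off to a base distribution."""
    def __init__(self, d, theta, base):
        self.d, self.theta, self.base = d, theta, base
        self.counts = defaultdict(int)   # customers (observations) per word
        self.tables = defaultdict(int)   # tables per word (simplified: at most 1)
        self.total = 0                   # total customers
        self.total_tables = 0            # total tables

    def prob(self, w):
        # p(w) = (c_w - d*t_w)/(theta + c) + (theta + d*t)/(theta + c) * base(w)
        seated = max(self.counts[w] - self.d * self.tables[w], 0.0)
        backoff = (self.theta + self.d * self.total_tables) * self.base(w)
        return (seated + backoff) / (self.theta + self.total)

    def add(self, w):
        # Seat a new customer; open a table only for unseen words (simplified).
        if self.counts[w] == 0:
            self.tables[w] += 1
            self.total_tables += 1
        self.counts[w] += 1
        self.total += 1

vocab = ["the", "cat", "sat"]
uniform = lambda w: 1.0 / len(vocab)           # base of the shared level
unigram = PitmanYorCRP(0.5, 1.0, uniform)      # shared (parent) restaurant
bigram = defaultdict(lambda: PitmanYorCRP(0.5, 1.0, unigram.prob))

for prev, w in [("the", "cat"), ("the", "cat"), ("cat", "sat")]:
    bigram[prev].add(w)   # context-specific restaurant
    unigram.add(w)        # naive update of the shared parent

p_cat = bigram["the"].prob("cat")   # boosted by direct counts in context "the"
p_sat = bigram["the"].prob("sat")   # seen only via back-off to the parent
```

Because each level's base distribution is the predictive distribution of its parent, stacking such restaurants yields the hierarchical sharing the abstract refers to; the paper's top layer generalizes this tree-structured back-off to a graph coupling multiple domain-specific language models.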
Cite this Paper
BibTeX
@InProceedings{pmlr-v5-wood09a,
  title = {A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation},
  author = {Wood, Frank and Teh, Yee Whye},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages = {607--614},
  year = {2009},
  editor = {van Dyk, David and Welling, Max},
  volume = {5},
  series = {Proceedings of Machine Learning Research},
  address = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month = {16--18 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v5/wood09a/wood09a.pdf},
  url = {https://proceedings.mlr.press/v5/wood09a.html},
  abstract = {In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the “adaptation” of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.}
}
Endnote
%0 Conference Paper
%T A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation
%A Frank Wood
%A Yee Whye Teh
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-wood09a
%I PMLR
%P 607--614
%U https://proceedings.mlr.press/v5/wood09a.html
%V 5
%X In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the “adaptation” of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.
RIS
TY  - CPAPER
TI  - A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation
AU  - Frank Wood
AU  - Yee Whye Teh
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-wood09a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 607
EP  - 614
L1  - http://proceedings.mlr.press/v5/wood09a/wood09a.pdf
UR  - https://proceedings.mlr.press/v5/wood09a.html
AB  - In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the “adaptation” of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.
ER  -
APA
Wood, F. & Teh, Y.W. (2009). A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:607-614. Available from https://proceedings.mlr.press/v5/wood09a.html.