Hierarchical Dirichlet scaling process

Cited 4 times in Web of Science; cited 0 times in Scopus
We present the hierarchical Dirichlet scaling process (HDSP), a Bayesian nonparametric mixed-membership model. The HDSP generalizes the hierarchical Dirichlet process to model the correlation structure between metadata in the corpus and the mixture components. We construct the HDSP from the normalized gamma representation of the Dirichlet process; this construction allows us to incorporate a scaling function that controls the membership probabilities of the mixture components. We develop two scaling methods to demonstrate that different modeling assumptions can be expressed in the HDSP, and derive the corresponding approximate posterior inference algorithms using variational Bayes. Through experiments on datasets of newswire, medical journal articles, conference proceedings, and product reviews, we show that the HDSP achieves better predictive performance than labeled LDA, partially labeled LDA, and the author-topic model, and better negative-review classification performance than the supervised topic model and SVM.
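The abstract's central construction — per-document mixture weights built from normalized gamma draws, with a scaling function modulating each component's membership probability — can be sketched generatively. This is a minimal illustration, not the paper's implementation: the truncation level, the log-normal scaling values standing in for a metadata-driven scaling function, and all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(concentration, K):
    """Truncated stick-breaking draw of the top-level DP weights beta."""
    v = rng.beta(1.0, concentration, size=K)
    v[-1] = 1.0  # close the stick at the truncation level K
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

def doc_weights(beta, alpha, scale):
    """Per-document weights via the normalized-gamma construction:
    draw a gamma variable per component, modulated by a per-component
    scaling value (here standing in for a metadata-driven scaling
    function), then normalize to a probability vector."""
    g = rng.gamma(shape=alpha * beta, scale=scale)
    return g / g.sum()

K = 10
beta = stick_breaking(concentration=1.0, K=K)
# hypothetical scaling values, e.g. computed from document metadata
scale = np.exp(rng.normal(0.0, 0.5, size=K))
pi = doc_weights(beta, alpha=5.0, scale=scale)
print(pi.sum())  # weights form a probability vector
```

With `scale` fixed at 1 for every component, the draw reduces to the ordinary hierarchical Dirichlet process; larger scaling values inflate a component's expected membership probability, which is how the model ties metadata to topic proportions.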
Publisher
SPRINGER
Issue Date
2017-03
Language
English
Article Type
Article
Keywords

VARIATIONAL INFERENCE; MIXTURES; MODELS

Citation

Machine Learning, vol. 106, no. 3, pp. 387-418

ISSN
0885-6125
DOI
10.1007/s10994-016-5621-5
URI
http://hdl.handle.net/10203/222684
Appears in Collection
CS-Journal Papers (Journal Papers)
Files in This Item
000394355700003.pdf (2.75 MB)