Title: Efficient Training of LDA on a GPU by Mean-for-Mode Estimation
Abstract: We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler (and unlike an uncollapsed one), it has good statistical performance and can use sampling-complexity reduction techniques such as sparsity. Like an uncollapsed Gibbs sampler (and unlike a collapsed one), it is embarrassingly parallel and can use approximate counters.
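To make the idea in the abstract concrete, here is a minimal, hypothetical sketch of an uncollapsed Gibbs-style pass for LDA in which the topic parameters are set to the Dirichlet-posterior *mean* of the current counts rather than sampled (the "mean for mode" substitution the title alludes to). All function and variable names are illustrative assumptions, not the paper's implementation; note that, with the parameters held fixed, the per-token resampling loop is embarrassingly parallel, which is what makes the approach GPU-friendly.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_for_mode_pass(docs, K, V, alpha=0.1, beta=0.01, iters=20):
    """Hypothetical sketch: uncollapsed Gibbs for LDA with mean-for-mode.

    docs: list of documents, each a list of word ids in [0, V).
    K: number of topics, V: vocabulary size.
    """
    D = len(docs)
    # random initial topic assignment for every token
    z = [rng.integers(K, size=len(doc)) for doc in docs]
    for _ in range(iters):
        # accumulate topic-word and document-topic counts
        nkw = np.zeros((K, V))
        ndk = np.zeros((D, K))
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                nkw[z[d][i], w] += 1
                ndk[d, z[d][i]] += 1
        # mean-for-mode: plug in the Dirichlet-posterior MEAN as the
        # parameters instead of sampling them (or taking the mode)
        phi = (nkw + beta) / (nkw.sum(axis=1, keepdims=True) + V * beta)
        theta = (ndk + alpha) / (ndk.sum(axis=1, keepdims=True) + K * alpha)
        # resample each token's topic given the fixed parameters;
        # every token is independent here, so this loop parallelizes
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                p = theta[d] * phi[:, w]
                z[d][i] = rng.choice(K, p=p / p.sum())
    return theta, phi

docs = [[0, 1, 2, 1], [2, 3, 3, 0], [1, 1, 2, 0]]
theta, phi = mean_for_mode_pass(docs, K=2, V=4)
print(theta.shape, phi.shape)
```

Because the parameter update is a closed-form normalization of counts and the token loop has no cross-token dependencies, each phase maps naturally onto GPU threads; sparsity in `nkw` could further shrink the per-token sampling work, as the abstract notes.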
Publication Year: 2015
Publication Date: 2015-07-06
Language: en
Type: article
Access and Citation
Cited By Count: 15