Although mixed membership models have achieved great success in unsupervised learning, they have not been applied as widely to classification problems. In this chapter, we discuss a family of discriminative mixed membership (DMM) models. By combining unsupervised mixed membership models with multi-class logistic regression, DMM models can be used for classification. In particular, we discuss discriminative latent Dirichlet allocation (DLDA) for text classification and discriminative mixed membership naive Bayes (DMNB) for classification on general feature vectors. Two variational inference algorithms are considered for learning the models, including a fast algorithm that uses fewer variational parameters and is substantially more efficient than the standard mean-field variational approximation. The efficacy of the models is demonstrated by extensive experiments on multiple datasets.
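The sketch below illustrates the general idea behind DLDA in a simplified, two-stage form: infer mixed membership (topic) proportions for each document, then apply multi-class logistic regression to those proportions. This is only an approximation for illustration; the chapter's DMM models couple the two components and learn them jointly via variational inference, whereas this pipeline fits them separately. The toy corpus and labels are hypothetical.

```python
# Minimal two-stage sketch of the DLDA idea (not the chapter's joint model):
# 1) unsupervised mixed membership step (LDA topic proportions),
# 2) discriminative step (multinomial logistic regression on proportions).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

# Hypothetical toy documents and class labels.
docs = [
    "the team won the match with a late goal",
    "stocks fell as the market reacted to earnings",
    "the striker scored twice in the final game",
    "investors worried about rising interest rates",
]
labels = ["sports", "finance", "sports", "finance"]

# Bag-of-words counts, the representation assumed by LDA-style models.
X_counts = CountVectorizer().fit_transform(docs)

# Per-document topic proportions (mixed memberships).
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X_counts)  # shape: (n_docs, n_topics)

# Multi-class logistic regression (softmax over classes) on proportions.
clf = LogisticRegression().fit(theta, labels)
print(clf.predict(lda.transform(X_counts)))
```

In the actual DMM/DLDA and DMNB models described in the chapter, the label information influences the inferred memberships during learning, which this separate two-stage fit does not capture.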