Eisenstein et al 2011: Sparse Additive Generative Models of Text

Citation

Sparse Additive Generative Models of Text. Jacob Eisenstein, Amr Ahmed, and Eric P. Xing. In Proceedings of ICML 2011.

Online version

Eisenstein et al 2011

Summary

This paper presents a sparse, additive generative modeling approach to topic modeling. It offers an important alternative to Latent Dirichlet Allocation (LDA), which incorporates neither sparsity nor additive modeling in log-space.

Brief Description of the method

The paper first identifies three major drawbacks of Latent Dirichlet Allocation: high inference cost, overparameterization, and a lack of sparsity. It then introduces SAGE, an additive generative model that avoids re-learning a full word distribution for every topic: each topic is represented as a sparse deviation that is added to a shared background distribution in log-space, as sketched below.
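
A minimal sketch of this core idea (variable and function names are assumed here, not taken from the paper): a topic's word distribution is the softmax of a shared background log-frequency vector plus a sparse topic-specific deviation.

<pre>
import numpy as np

# Hypothetical names, not the paper's code: each topic's word
# distribution is the softmax of a shared background log-frequency
# vector m plus a sparse topic-specific deviation eta.
def sage_word_distribution(m, eta):
    logits = m + eta
    logits = logits - logits.max()   # shift for numerical stability
    p = np.exp(logits)
    return p / p.sum()

# Toy example with a 5-word vocabulary: the topic boosts word 0,
# suppresses word 3, and leaves every other deviation at exactly zero.
m = np.log(np.array([0.4, 0.3, 0.15, 0.1, 0.05]))  # background distribution
eta = np.array([1.0, 0.0, 0.0, -1.0, 0.0])         # sparse deviation
print(sage_word_distribution(m, eta))
</pre>

Because most entries of the deviation are exactly zero, a topic only needs to store the handful of words on which it differs from the background.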

The Generative Story
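
The topic-model instantiation of SAGE keeps LDA's document-level machinery and changes only the word distributions. A rough sketch of the generative story, with notation that follows the paper only loosely (m is the background log-frequency vector, and the Exponential-Gaussian compound prior on each deviation η induces sparsity):

<math>
\begin{align}
\tau_{k,i} &\sim \mathrm{Exponential}(\lambda) \\
\eta_{k,i} &\sim \mathcal{N}(0, \tau_{k,i}) \\
\beta_k &\propto \exp(m + \eta_k) \\
\theta_d &\sim \mathrm{Dirichlet}(\alpha) \\
z_{d,n} &\sim \mathrm{Categorical}(\theta_d) \\
w_{d,n} &\sim \mathrm{Categorical}(\beta_{z_{d,n}})
\end{align}
</math>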

Parameter Estimation
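
The paper derives its own estimation procedure for the sparse deviations under the compound prior above. The sketch below substitutes a much simpler stand-in, proximal gradient ascent with a plain L1 penalty, just to illustrate how a sparse MAP-style estimate of a single topic's η can be obtained; the function name, step size, and penalty strength are all illustrative assumptions.

<pre>
import numpy as np

# Hypothetical helper, not the paper's procedure: MAP-style estimate of
# one topic's sparse deviation eta by proximal gradient ascent with a
# plain L1 penalty (soft-thresholding).
def fit_eta(counts, m, lam=1.0, lr=0.5, n_iters=500):
    # counts: word counts assigned to this topic, shape (V,)
    # m:      background log-frequencies, shape (V,)
    N = counts.sum()
    eta = np.zeros_like(m)
    for _ in range(n_iters):
        logits = m + eta
        p = np.exp(logits - logits.max())
        p = p / p.sum()
        grad = counts - N * p        # gradient of the multinomial log-likelihood
        eta = eta + (lr / N) * grad  # gradient ascent step
        shrink = lr * lam / N        # proximal step for lam * ||eta||_1
        eta = np.sign(eta) * np.maximum(np.abs(eta) - shrink, 0.0)
    return eta

m = np.log(np.array([0.4, 0.3, 0.15, 0.1, 0.05]))
counts = np.array([50.0, 20.0, 10.0, 2.0, 3.0])
print(fit_eta(counts, m))
</pre>

The soft-thresholding step drives most coordinates of eta exactly to zero, which is the sparsity property the paper argues for; the paper's own optimizer under the compound prior is more involved than this sketch.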

Dataset and Experiment Settings


Experimental Results

The authors performed three major experiments.

Related Papers

This paper is related to prior work along three dimensions: (1) sparse learning; (2) additive modeling of text in log-space; and (3) topic models such as Latent Dirichlet Allocation.