Eisenstein et al 2011: Sparse Additive Generative Models of Text


Citation

Sparse Additive Generative Models of Text. Eisenstein, Ahmed and Xing. Proceedings of ICML 2011.

Online version

Eisenstein et al 2011

Summary

This paper presents a sparse learning and additive generative modeling approach to Topic modeling. It is an important alternative to Latent Dirichlet Allocation (LDA), which neither induces sparsity nor combines effects additively in log-space.

Brief Description of the method

This paper first describes three major disadvantages of Latent Dirichlet Allocation: high inference cost, overparameterization, and lack of sparsity. It then introduces SAGE, an additive generative model that avoids relearning the same background distribution for every topic: each topic is instead represented as a sparse deviation that is added to a shared background in log-space.

The Generative Story

In contrast to the traditional multinomial modeling of words in LDA, SAGE works with log frequencies: the generative distribution of a word w in a document d of class k is

<math>P(w \mid y_d = k) = \frac{\exp(m_w + \eta_{k,w})}{\sum_i \exp(m_i + \eta_{k,i})}</math>

where <math>m</math> is the background log-frequency distribution and <math>\eta_k</math> is the log-frequency deviation that represents topic k. By doing this, the authors argue, SAGE can take advantage of sparsity-inducing priors on <math>\eta</math> to obtain additional robustness for the model.
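A minimal numpy sketch of this computation, assuming a vocabulary-length background vector m and deviation vector eta_k (the names are illustrative, not taken from any released implementation):

<syntaxhighlight lang="python">
import numpy as np

def sage_word_distribution(m, eta_k):
    """Topic-k word distribution: softmax of background plus deviation.

    m and eta_k are length-V vectors over the vocabulary; the names are
    illustrative assumptions, not from the paper's code.
    """
    logits = m + eta_k              # addition in log-space
    logits = logits - logits.max()  # subtract the max for numerical stability
    p = np.exp(logits)
    return p / p.sum()              # P(w | y_d = k)
</syntaxhighlight>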

The generative story of SAGE can be described as follows (a toy sampler is sketched after the list):

  • Draw the background distribution <math>m</math> from an uninformative prior
  • For each class k:
     -- For each term i:
          1. Draw the variance <math>\tau_{k,i} \sim \mathrm{Exponential}(\lambda)</math>
          2. Draw the deviation <math>\eta_{k,i} \sim \mathcal{N}(0, \tau_{k,i})</math>
     -- Set <math>\beta_k \propto \exp(m + \eta_k)</math>
  • For each document d:
     -- Draw a class <math>y_d</math> from a uniform distribution
     -- For each word n, draw <math>w_n^{(d)} \sim \mathrm{Multinomial}(\beta_{y_d})</math>
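Below is a toy numpy sampler that follows this generative story; the corpus sizes and the hyperparameter lam are illustrative assumptions, not values from the paper.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def sample_corpus(V=1000, K=10, D=50, N=100, lam=1.0):
    """Sample a toy corpus following the generative story above.

    V, K, D, N and lam are illustrative assumptions, not paper settings.
    """
    m = rng.normal(size=V)                               # background from a vague prior
    tau = rng.exponential(scale=1.0 / lam, size=(K, V))  # tau_{k,i} ~ Exponential(lam)
    eta = rng.normal(loc=0.0, scale=np.sqrt(tau))        # eta_{k,i} ~ Normal(0, tau_{k,i})
    beta = np.exp(m + eta)                               # combine in log-space ...
    beta /= beta.sum(axis=1, keepdims=True)              # ... and normalize per class

    docs = []
    for _ in range(D):
        y = int(rng.integers(K))                   # class y_d ~ Uniform
        words = rng.choice(V, size=N, p=beta[y])   # w_n ~ Multinomial(beta_{y_d})
        docs.append((y, words))
    return docs
</syntaxhighlight>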

Here, <math>\mathrm{Exponential}(\lambda)</math> denotes the exponential distribution; placing it on the variances <math>\tau_{k,i}</math> is what induces sparsity in the deviations <math>\eta_k</math>. If we fit a variational distribution <math>Q(\tau_k)</math> over the latent variances and optimize the bound, we get the following likelihood equation:

<math>\ell' = \sum_d \sum_n^{N_d} \log P(w_n^{(d)} \mid m, \eta_{y_d}) + \sum_k \left( \big\langle \log P(\eta_k \mid \tau_k) \big\rangle_{Q(\tau_k)} + \big\langle \log P(\tau_k \mid \lambda) \big\rangle_{Q(\tau_k)} + H\big[Q(\tau_k)\big] \right)</math>
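As a rough sketch, assuming per-document word-count vectors and illustrative variable names (this is not the paper's code), the data term of this bound, i.e. the double sum over documents and tokens, can be computed as:

<syntaxhighlight lang="python">
import numpy as np

def data_term(counts, m, eta, y):
    """First term of the bound: sum_d sum_n log P(w_n^(d) | m, eta_{y_d}).

    counts[d] is a length-V vector of word counts for document d and y[d]
    its class label; this representation is an illustrative assumption.
    """
    total = 0.0
    for c_d, y_d in zip(counts, y):
        logits = m + eta[y_d]
        log_p = logits - np.logaddexp.reduce(logits)  # log-softmax over the vocabulary
        total += c_d @ log_p                          # counts weight repeated tokens
    return total
</syntaxhighlight>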

Parameter Estimation

Dataset and Experiment Settings


Experimental Results

The authors performed three major experiments.

