Reisinger et al 2010: Spherical Topic Models

From Cohen Courses

Citation

Joseph Reisinger, Austin Waters, Bryan Silverthorn, and Raymond J. Mooney, "Spherical Topic Models", in Proceedings of the 27th International Conference on Machine Learning (ICML 2010), 2010.

Online version

Reisinger et al 2010

Summary

This paper presents the Spherical Admixture Model (SAM), a Bayesian generative model for topic modeling, together with a variational inference method, as an alternative to Latent Dirichlet Allocation (LDA). The highlight of this paper is that it models documents as points on a high-dimensional spherical manifold. As with cosine similarity, the model assumes the data are directional, and can be parameterized by cosine distance and other similarity measures from directional statistics. The authors claim that the spherical topic modeling approach outperforms existing models such as LDA.

Motivations

Traditional topic modeling methods, such as Latent Dirichlet Allocation (LDA), fail to model the presence and absence of words in a document, because they assume a multinomial distribution for the document likelihood. To overcome this issue, the authors propose the Spherical Admixture Model, which models both word frequency and word presence/absence. In addition, by assuming a von Mises-Fisher distribution, they aim to improve accuracy when applying high-dimensional spherical modeling to sparse text data.

Brief Description of the Method

This paper first introduces the advantages of the von Mises-Fisher distribution for text, then presents the Spherical Admixture Model and a variational inference method for approximating the posterior. In this section, we first summarize the relevant properties of the von Mises-Fisher distribution, then describe the proposed model and the variational inference method.

von Mises-Fisher Distribution

In LDA, the multinomial distribution over words assigns probabilities to integer vectors of event counts in $\mathbb{N}^{|V|}$, i.e., the raw count of each word in a document. In contrast to the multinomial distribution, the von Mises-Fisher (vMF) distribution is a probability distribution on the $(d-1)$-dimensional sphere in $\mathbb{R}^d$, whose density function is

$f(x; \mu, \kappa) = c_d(\kappa)\, e^{\kappa \mu^\top x}$

where $\mu$ is the mean direction with $\|\mu\| = 1$, and $\kappa \geq 0$ is the concentration parameter. In addition,

$c_d(\kappa) = \frac{\kappa^{d/2 - 1}}{(2\pi)^{d/2}\, I_{d/2 - 1}(\kappa)}$

is the normalization factor, where $I_{d/2 - 1}$ is the modified Bessel function of the first kind and order $d/2 - 1$.
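As a concrete illustration (a minimal sketch, not code from the paper), the vMF log-density can be evaluated with SciPy's modified Bessel functions; the exponentially scaled variant ive is used so that large values of $\kappa$ do not overflow:

```python
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function of the first kind

def vmf_log_density(x, mu, kappa):
    """log f(x; mu, kappa) for unit vectors x, mu on the sphere in R^d."""
    d = mu.shape[0]
    # log c_d(kappa), using log I_v(kappa) = log ive(v, kappa) + kappa for stability
    log_c = ((d / 2 - 1) * np.log(kappa)
             - (d / 2) * np.log(2 * np.pi)
             - (np.log(ive(d / 2 - 1, kappa)) + kappa))
    return log_c + kappa * (mu @ x)
```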

Intuitively, the vMF distribution can be thought of as a multivariate Gaussian with spherical covariance, parameterized by cosine distance rather than Euclidean distance. Cosine distance is commonly used in directional statistics: it compares the directions of $L_2$-normalized feature vectors and corresponds to the normalized correlation coefficient.

In this paper, the authors also argue that vMF is sensitive to the absence/presence of words, whereas the multinomial distribution is not. They give an example: if document D1 has count vector [1,1,1] and document D2 has count vector [3,0,0], then in the multinomial scenario with uniform topic proportions $\theta = (1/3, 1/3, 1/3)$, the two documents are equivalent: each token carries probability 1/3 regardless of which word it is, so both documents have likelihood $(1/3)^3$. In contrast, vMF is parameterized by cosine distance, which differs between the two normalized count vectors.
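This can be checked numerically. The sketch below (illustrative, not from the paper; the unit topic direction mu is a hypothetical choice) contrasts the two measures:

```python
import numpy as np

d1 = np.array([1.0, 1.0, 1.0])     # three distinct words, one occurrence each
d2 = np.array([3.0, 0.0, 0.0])     # a single word occurring three times

theta = np.array([1/3, 1/3, 1/3])  # uniform topic proportions

# Token-by-token multinomial likelihood (as in LDA): identical for both documents.
print(np.prod(theta ** d1), np.prod(theta ** d2))   # both (1/3)^3 ~ 0.037

mu = np.ones(3) / np.sqrt(3)       # hypothetical unit-norm topic direction
for doc in (d1, d2):               # cosine similarities differ: 1.0 vs ~0.577
    print(doc @ mu / np.linalg.norm(doc))
```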

The Spherical Admixture Model

[Figure: comparison of LDA and the Spherical Admixture Model]

The Spherical Admixture Model (SAM) is very different from LDA in that it does not generate each word from a per-word topic assignment. Instead, it models the document as a whole, as a single direction on the unit sphere, and uses a weighted directional average to combine topics. A simple generative story for SAM can be given by:

  • Draw a set of $T$ topics $\phi_1, \dots, \phi_T$ on the unit hypersphere.
  • For each document $d$, draw topic weights $\theta_d$ from a Dirichlet distribution $\mathrm{Dir}(\alpha)$.
  • Draw a document vector $v_d$ from a vMF distribution with mean $\bar{\mu}_d = \mathrm{avg}(\phi, \theta_d)$, the weighted directional average of the topics.

The complete model can be represented as follows (a code sketch of the generative process appears after the list):

  • $\mu \sim \mathrm{vMF}(m, \kappa_0)$ (corpus mean)
  • $\phi_t \sim \mathrm{vMF}(\mu, \xi)$ for each topic $t = 1, \dots, T$ (topics)
  • $\theta_d \sim \mathrm{Dir}(\alpha)$ for each document $d$ (topic proportions)
  • $\bar{\mu}_d = \mathrm{avg}(\phi, \theta_d) = \Phi \theta_d / \|\Phi \theta_d\|$, where $\Phi$ is the matrix of topic directions (spherical average)
  • $v_d \sim \mathrm{vMF}(\bar{\mu}_d, \kappa)$ (documents)
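The following sketch forward-samples this generative process. It is a minimal illustration, assuming the model structure above: the hyperparameter values are invented for the example, and the vMF draws use Wood's (1994) rejection sampler, a standard choice rather than anything specified in the paper.

```python
import numpy as np

def sample_vmf(mu, kappa, rng):
    """Draw one sample from vMF(mu, kappa) via Wood's (1994) rejection algorithm."""
    d = mu.shape[0]
    b = (d - 1) / (2 * kappa + np.sqrt(4 * kappa**2 + (d - 1)**2))
    x0 = (1 - b) / (1 + b)
    c = kappa * x0 + (d - 1) * np.log(1 - x0**2)
    while True:  # rejection-sample w, the cosine of the angle between the draw and mu
        z = rng.beta((d - 1) / 2, (d - 1) / 2)
        w = (1 - (1 + b) * z) / (1 - (1 - b) * z)
        if kappa * w + (d - 1) * np.log(1 - x0 * w) - c >= np.log(rng.uniform()):
            break
    v = rng.normal(size=d)            # random direction orthogonal to mu
    v -= (v @ mu) * mu
    v /= np.linalg.norm(v)
    return w * mu + np.sqrt(1 - w**2) * v

def generate_corpus(m, kappa0, xi, alpha, kappa, T, D, rng):
    """Forward-sample SAM: corpus mean -> topics -> proportions -> document vectors."""
    mu = sample_vmf(m, kappa0, rng)                                      # corpus mean
    Phi = np.stack([sample_vmf(mu, xi, rng) for _ in range(T)], axis=1)  # topics
    docs = []
    for _ in range(D):
        theta = rng.dirichlet(alpha)          # topic proportions
        mean = Phi @ theta
        mean /= np.linalg.norm(mean)          # spherical average of topics
        docs.append(sample_vmf(mean, kappa, rng))
    return Phi, docs

rng = np.random.default_rng(0)
V = 50                              # vocabulary size (illustrative)
m = np.ones(V) / np.sqrt(V)         # prior mean direction
Phi, docs = generate_corpus(m, kappa0=10.0, xi=50.0, alpha=np.ones(5),
                            kappa=100.0, T=5, D=3, rng=rng)
```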

Variational Inference

Exact posterior inference in SAM is intractable, so the authors derive a variational inference procedure that approximates the posterior over the topics and topic proportions by optimizing a lower bound on the likelihood of the observed document vectors.

Dataset and Experiment Settings


Experimental Results

The authors performed three major experiments.

