Relational topic model
The relational topic model (RTM) extends the latent Dirichlet allocation (LDA) topic model by also modeling the presence of (observed) links between documents. The probability of a link between two documents depends on the similarity of their topic proportions. A weight vector eta allows the model to capture the fact that some topics are more important than others in predicting links. For example, if there is a separate topic for spam and another for general conversation, documents about spam may be more likely to link to each other than documents about general conversation. Link prediction is thus a logistic regression whose input is a similarity measure between the two documents, given by the Hadamard (element-wise) product of their topic proportions, weighted by eta.
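The link-prediction step described above can be sketched in NumPy. This is a minimal illustration, not the full RTM: the function name, the intercept term `nu`, and the example topic proportions are all assumptions made here for clarity.

```python
import numpy as np

def link_probability(z_d, z_dprime, eta, nu=0.0):
    """Logistic link model: sigma(eta^T (z_d * z_dprime) + nu),
    where * is the Hadamard (element-wise) product of the two
    documents' topic proportions. The intercept nu is illustrative."""
    score = eta @ (z_d * z_dprime) + nu
    return 1.0 / (1.0 + np.exp(-score))

# Hypothetical 3-topic example: eta weights topics 0 and 1 heavily.
eta = np.array([2.0, 2.0, 0.5])
doc_a = np.array([0.8, 0.1, 0.1])   # mostly topic 0
doc_b = np.array([0.7, 0.2, 0.1])   # also mostly topic 0
doc_c = np.array([0.1, 0.1, 0.8])   # mostly topic 2

p_ab = link_probability(doc_a, doc_b, eta)
p_ac = link_probability(doc_a, doc_c, eta)
```

Because `doc_a` and `doc_b` concentrate on the same heavily weighted topic, their Hadamard product yields a larger score, so `p_ab` exceeds `p_ac`.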
The RTM uses variational inference to approximate the posterior over the hidden variables, and variational EM to learn the model parameters. It models only undirected links, and its inference cost scales with the number of observed links rather than the number of document pairs, enabling efficient inference when the citation graph is sparse.