Heckerman, JMLR 2000

Citation

Dependency Networks for Inference, Collaborative Filtering, and Data Visualization. David Heckerman, David Maxwell Chickering, Christopher Meek, Robert Rounthwaite, Carl Kadie; in JMLR, 1(Oct):49-75, 2000

Online version

[1]

Summary

In this paper, the authors describe a graphical model for probabilistic relationships, called a dependency network, as an alternative to the Bayesian network. Unlike a Bayesian network, a dependency network is potentially cyclic. Dependency networks are well suited to tasks such as predicting preferences in collaborative filtering, but they are not well suited to encoding causal relationships.

Brief description of the method

  • Consistent Dependency Network

Given a domain of interest with a set of finite variables $\mathbf{X} = (X_1, \ldots, X_n)$ and a positive joint distribution $p(\mathbf{x})$, a consistent dependency network for $\mathbf{X}$ is a pair $(\mathcal{G}, \mathcal{P})$ where $\mathcal{G}$ is a cyclic directed graph and $\mathcal{P}$ is a set of conditional probability distributions. The parents of node $X_i$, denoted $\mathbf{Pa}_i$, correspond to those variables that satisfy

$p(x_i \mid \mathbf{pa}_i) = p(x_i \mid x_1, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n) = p(x_i \mid \mathbf{x} \setminus x_i)$

The dependency network is consistent in the sense that each local distribution $p(x_i \mid \mathbf{pa}_i)$ can be obtained via inference from the joint distribution $p(\mathbf{x})$.

Inference can be performed by converting the dependency network to a Markov network, triangulating that network, and then applying one of the standard inference algorithms.
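To make the consistency condition concrete, here is a minimal sketch (not from the paper) that recovers each local conditional $p(x_i \mid \mathbf{x} \setminus x_i)$ by brute-force inference from a small, hand-specified positive joint distribution; the three binary variables and the randomly generated joint are illustrative assumptions.

```python
# Minimal sketch: recover the local conditionals p(x_i | x \ x_i) of a
# consistent dependency network by summing out a small positive joint
# distribution over three binary variables. The joint itself is random
# and purely illustrative.
import itertools

import numpy as np

rng = np.random.default_rng(0)

n = 3                                   # three binary variables X1, X2, X3
states = list(itertools.product([0, 1], repeat=n))
joint = rng.random(2 ** n) + 0.05       # strictly positive joint distribution
joint /= joint.sum()
p = dict(zip(states, joint))

def local_conditional(i, others):
    """p(X_i = 1 | X_{-i} = others), computed by inference from the joint."""
    num = sum(p[s] for s in states
              if s[i] == 1 and tuple(v for j, v in enumerate(s) if j != i) == others)
    den = sum(p[s] for s in states
              if tuple(v for j, v in enumerate(s) if j != i) == others)
    return num / den

# Each local distribution of a consistent dependency network equals this
# quantity; the parents of X_i are the variables the conditional actually
# depends on.
for i in range(n):
    for others in itertools.product([0, 1], repeat=n - 1):
        print(f"p(X{i+1}=1 | rest={others}) = {local_conditional(i, others):.3f}")
```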

  • General Dependency Network

Given a domain of interest with a set of finite variables $\mathbf{X} = (X_1, \ldots, X_n)$, let $\mathcal{P}$ be a set of conditional distributions, one for each variable in $\mathbf{X}$. A dependency network for $\mathbf{X}$ and $\mathcal{P}$ is a pair $(\mathcal{G}, \mathcal{P}')$ where $\mathcal{G}$ is a (usually cyclic) directed graph and $\mathcal{P}'$ is a set of conditional probability distributions satisfying

$p_i(x_i \mid \mathbf{pa}_i) = p_i(x_i \mid \mathbf{x} \setminus x_i)$

Unlike the consistent case, these local distributions need not be consistent with any single joint distribution.
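For a general dependency network, the paper recovers a joint distribution from the (possibly inconsistent) local conditionals with an ordered Gibbs sampler that resamples each variable, in a fixed order, from its own local distribution. The sketch below illustrates that procedure under illustrative assumptions: three binary variables and hand-written local conditionals that are not taken from the paper.

```python
# Minimal sketch of an ordered Gibbs sampler over a small dependency
# network: variables are resampled in a fixed order, each from its own
# local conditional p_i(x_i | x \ x_i). The local conditionals below are
# toy assumptions, not the paper's learned models.
from collections import Counter

import numpy as np

rng = np.random.default_rng(0)

# Illustrative local conditionals p_i(X_i = 1 | rest); each sees the full
# current state and may ignore non-parents.
local = [
    lambda x: 0.9 if x[1] == 1 else 0.2,        # X1 depends on X2
    lambda x: 0.7 if x[0] == x[2] else 0.3,     # X2 depends on X1 and X3
    lambda x: 0.6,                              # X3 has no parents
]

def ordered_gibbs(n_sweeps=50_000, burn_in=1_000):
    """Run the ordered Gibbs sampler and tally the visited states."""
    x = [0, 0, 0]
    counts = Counter()
    for sweep in range(n_sweeps):
        for i, p_i in enumerate(local):         # fixed variable order
            x[i] = int(rng.random() < p_i(x))
        if sweep >= burn_in:
            counts[tuple(x)] += 1
    total = sum(counts.values())
    return {state: c / total for state, c in sorted(counts.items())}

# The empirical distribution over visited states approximates the joint
# distribution defined by the dependency network.
for state, prob in ordered_gibbs().items():
    print(state, round(prob, 3))
```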

Experimental Results

They evaluated the method on the problem of collaborative filtering using three datasets: Nielsen, MS.COM, and MSNBC. Overall, Bayesian networks are slightly more accurate, but dependency networks are significantly faster at prediction.
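As a rough illustration of why prediction is fast, the sketch below mimics dependency-network-style collaborative filtering: one local model per item predicts that item from a user's ratings of all other items, so scoring an item needs only a single local-model evaluation rather than joint inference. The paper learns probabilistic decision trees for the local distributions; the scikit-learn DecisionTreeClassifier and the toy 0/1 preference matrix here are stand-in assumptions.

```python
# Minimal sketch of collaborative filtering with per-item local models,
# in the spirit of a dependency network. The binary "liked or not" data
# is randomly generated and purely illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_users, n_items = 500, 6
X = (rng.random((n_users, n_items)) < 0.4).astype(int)   # toy preference matrix

# Learn one local model p(item_i | all other items) per item.
models = []
for i in range(n_items):
    others = np.delete(X, i, axis=1)
    models.append(DecisionTreeClassifier(max_depth=3).fit(others, X[:, i]))

def predict_scores(user_row):
    """Score every item for one user using only that item's local model."""
    scores = []
    for i, clf in enumerate(models):
        others = np.delete(user_row, i).reshape(1, -1)
        proba = clf.predict_proba(others)[0]
        # Probability of the positive class, if the tree saw both classes.
        scores.append(proba[-1] if len(proba) > 1 else float(clf.classes_[0]))
    return scores

print(predict_scores(X[0]))
```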