Stoyanov et al 2011: Empirical Risk Minimization of Graphical Model Parameters Given Approximate Inference, Decoding, and Model Structure

From Cohen Courses

Citation

Veselin Stoyanov and Alexander Ropson and Jason Eisner, "Empirical Risk Minimization of Graphical Model Parameters Given Approximate Inference, Decoding, and Model Structure", in Proceedings of AISTATS, 2011.

Online version

Stoyanov et al 2011

Summary

This paper presents a method that combines loopy Belief Propagation with Back-Propagation to train graphical model parameters by Empirical Risk Minimization (ERM), an alternative training criterion for general problems in Probabilistic Graphical Models (possible applications include Named Entity Recognition, Word Alignment, Shallow Parsing, and Constituent Parsing). The paper formulates the approximate learning problem as an ERM problem rather than as MAP estimation. The authors show that replacing MAP estimation with ERM-based parameter estimation significantly reduces loss on the test set, sometimes by an order of magnitude.
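The contrast between the two training criteria can be sketched as follows (the notation here is a generic illustration, not the paper's exact formulation): ERM directly minimizes the task loss incurred by the full test-time pipeline, while MAP estimation maximizes regularized conditional likelihood without reference to the loss function or the approximations used at test time.

```latex
% ERM: minimize task loss of the full pipeline (approximate inference + decoding)
\hat{\theta}_{\mathrm{ERM}} = \arg\min_{\theta} \sum_{i=1}^{n} L\bigl(y_i,\; \mathrm{decode}(x_i; \theta)\bigr)

% MAP estimation: maximize regularized conditional likelihood
\hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} \sum_{i=1}^{n} \log p_{\theta}(y_i \mid x_i) + \log p(\theta)
```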

Brief Description of the method

The paper first formulates parameter estimation as training and decoding on Markov random fields (MRFs), then discusses using Belief Propagation to do inference on MRFs and using Back-Propagation through the inference procedure to compute the gradient of the empirical risk. In this section, we first summarize how they formulate the problem, then briefly describe how Back-Propagation is applied to this task. For the details of Belief Propagation and Empirical Risk Minimization for general probabilistic graphical models, please refer to the corresponding method pages.
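The pipeline can be illustrated on a toy model. The sketch below (a hypothetical illustration, not the paper's implementation) runs sum-product message passing on a two-node binary MRF, measures the empirical risk as the squared error between the resulting beliefs and gold labels, and descends the risk gradient. Finite differences stand in for the paper's Back-Propagation through inference; the potentials, gold labels, and learning rate are all made up for the example.

```python
import numpy as np

def logsumexp(a):
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

def beliefs(theta):
    """Sum-product beliefs for a toy 2-node binary MRF.

    theta = [u0, u1, p]: unary log-potentials favouring state 1 at each
    node, plus a pairwise log-potential rewarding agreement. On a loopy
    graph the messages would be iterated; two nodes need one pass.
    Returns P(x_i = 1) for i = 0, 1.
    """
    u0, u1, p = theta
    unary = [np.array([0.0, u0]), np.array([0.0, u1])]
    pair = np.array([[p, 0.0], [0.0, p]])
    probs = []
    for i in (0, 1):
        j = 1 - i
        # message into node i: marginalise the neighbour out of the factor
        msg = np.array([logsumexp(unary[j] + pair[:, xi]) for xi in (0, 1)])
        score = unary[i] + msg
        probs.append(np.exp(score[1] - logsumexp(score)))
    return np.array(probs)

def empirical_risk(theta, gold=np.array([1.0, 0.0])):
    # squared loss between approximate marginals and gold labels
    b = beliefs(theta)
    return float(((b - gold) ** 2).sum())

def fd_grad(f, theta, eps=1e-5):
    # finite-difference gradient, standing in for back-propagation
    g = np.zeros_like(theta)
    for k in range(theta.size):
        e = np.zeros_like(theta)
        e[k] = eps
        g[k] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return g

theta = np.zeros(3)
risk_before = empirical_risk(theta)
for _ in range(300):
    theta -= 0.5 * fd_grad(empirical_risk, theta)
risk_after = empirical_risk(theta)
```

The point of the sketch is that the loss is defined on the *output of approximate inference*, so the gradient steps tune the parameters to compensate for the inference procedure itself, which is the core idea behind training by ERM rather than MAP estimation.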


Dataset

Experimental Results

Related Papers

This paper is related to several lines of prior work along three dimensions.