Stoyanov et al 2011: Empirical Risk Minimization of Graphical Model Parameters Given Approximate Inference, Decoding, and Model Structure
Revision as of 00:10, 3 November 2011
Citation
Veselin Stoyanov and Alexander Ropson and Jason Eisner, "Empirical Risk Minimization of Graphical Model Parameters Given Approximate Inference, Decoding, and Model Structure", in Proceedings of AISTATS, 2011.
Online version
Summary
This paper presents a training method that combines loopy Belief Propagation with Back Propagation to perform Empirical Risk Minimization (ERM), an alternative to standard training for general Probabilistic Graphical Models (possible applications include Named Entity Recognition, Word Alignment, Shallow Parsing, and Constituent Parsing). The paper formulates approximate learning as an ERM problem rather than as MAP estimation. The authors show that replacing MAP estimation with ERM-based parameter estimation significantly reduces loss on the test set, in some cases by an order of magnitude.
Brief Description of the Method
This paper first formulates the learning problem as training and decoding on Markov random fields, then discusses the use of loopy Belief Propagation for approximate inference during training and decoding. In this section, we first summarize how the problem is formulated, then briefly describe how Belief Propagation is used for this task. For the detailed ERM method for general probabilistic graphical models, please refer to the method page: Empirical Risk Minimization.
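The core idea above can be illustrated with a toy sketch (not from the paper): treat approximate inference as a function of the model parameters, define the empirical risk as a loss on its output, and minimize that risk directly. This hypothetical example uses a 2-variable binary MRF with a fixed pairwise potential, sum-product message passing as the inference routine, and a finite-difference gradient as a simple stand-in for the back-propagation-through-inference that the paper actually uses.

```python
import math

# Fixed pairwise potential favoring agreement between the two variables
# (an assumption for this toy example, not a setting from the paper).
COUPLING = [[math.exp(0.5), 1.0],
            [1.0, math.exp(0.5)]]

def inference(theta):
    """Sum-product message passing on a 2-variable binary MRF.

    theta parameterizes the unary potentials; returns the (approximate)
    marginal P(x0 = 0). For two nodes a single message pass suffices.
    """
    unary = [[math.exp(t), math.exp(-t)] for t in theta]
    # Message from variable 1 into variable 0.
    msg = [sum(unary[1][y] * COUPLING[x][y] for y in (0, 1)) for x in (0, 1)]
    scores = [unary[0][x] * msg[x] for x in (0, 1)]
    return scores[0] / (scores[0] + scores[1])

def risk(theta, target=0.7):
    # Empirical risk: squared error between the inferred marginal and a
    # target marginal standing in for supervised training data.
    return (inference(theta) - target) ** 2

def grad(theta, eps=1e-5):
    # Finite-difference gradient: a simple stand-in for back-propagating
    # the loss through the inference procedure.
    g = []
    for i in range(len(theta)):
        hi = list(theta); hi[i] += eps
        lo = list(theta); lo[i] -= eps
        g.append((risk(hi) - risk(lo)) / (2 * eps))
    return g

# Gradient descent on the empirical risk.
theta = [0.0, 0.0]
for _ in range(200):
    theta = [t - 1.0 * g for t, g in zip(theta, grad(theta))]

print(round(inference(theta), 2))  # marginal driven toward the 0.7 target
```

The point of the sketch is that the parameters are chosen to minimize the loss of the full inference-plus-decoding pipeline, rather than to maximize data likelihood as in MAP estimation.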
Dataset
Experimental Results
Related Papers
This paper is related to prior work along three dimensions.