Mnduong writeup of Dan Roth's Talk
This is a review of Dan_Roth_talk by user:mnduong.
- The talk introduced Constrained Conditional Models (CCMs), a framework for learning and inference that allows global constraints to be incorporated. The motivation is that many tasks are naturally solved as a pipeline of subtasks, and running each stage separately lets errors propagate from one stage to the next. The idea is to impose global constraints so that information from one subtask can be used to help another.
- The model is formulated as an Integer Linear Program (ILP), where the objective function is a linear combination of local model scores and the constraints are Boolean functions defined over partial assignments of the output variables (a sketch of the objective is given after these notes).
- Constraints are mathematically equivalent to features, but they offer more flexibility at design time. Constraints can also be modeled as soft constraints, in which case the objective penalizes the distance from the predicted assignment to the nearest "legal" (constraint-satisfying) assignment, rather than ruling out violations entirely.
- The speaker illustrated the model with applications to Semantic Role Labeling and Relation Extraction, where it helped combine the results of different subtasks to achieve good global performance.
- Overall, I thought the talk was easy to follow and the problems were clearly motivated. The use of constraints is quite elegant, and it doesn't seem to add much complexity.
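- As a rough sketch of the formulation discussed above (my own notation, not taken verbatim from the talk): CCM inference picks the output assignment that maximizes the local models' score minus a penalty for violated constraints,

<math> \hat{y} = \arg\max_{y} \; \mathbf{w}^{\top} \phi(x, y) \; - \; \sum_{k} \rho_{k} \, d_{C_k}(x, y) </math>

where the first term is the linear combination of local model scores, <math>d_{C_k}(x, y)</math> measures how far the assignment <math>y</math> is from satisfying the Boolean constraint <math>C_k</math>, and <math>\rho_k</math> is its violation penalty. Setting <math>\rho_k = \infty</math> gives a hard constraint (it must hold exactly), while a finite <math>\rho_k</math> gives the soft-constraint version mentioned above. With linear scores and Boolean constraints, this maximization can be encoded as an Integer Linear Program over 0/1 indicator variables.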