Class Meeting for 10-707 9/23/2009
This is one of the class meetings on the schedule for the course Information Extraction 10-707 in Fall 2009.
Linear-chain CRFs
Required Readings
- Shallow parsing with conditional random fields, by F. Sha, F. Pereira. In Proceedings of HLT-NAACL, 2003.
- Conditional structure versus conditional estimation in NLP models, by D. Klein and C. D. Manning. In Proceedings of EMNLP, 2002.
Optional Readings
- Hidden Markov Models for Labeled Sequences, Krogh 1994. The method of this paper appears to be equivalent to linear-chain CRFs - so why didn't it catch on?
- Gradient tree boosting for training CRFs, Dietterich et al., ICML 2004. A very different training method for CRFs, based on regression trees.
- Semi-Supervised Conditional Random Fields for Improved Sequence Segmentation and Labeling, Jiao et al., ACL 2006. A very nice paper from the UofA group on semi-supervised CRF learning.
- Accelerated Training of Conditional Random Fields with Stochastic Gradient Methods, Vishwanathan et al., ICML 2006. CRF learning methods seem complicated - this paper shows that stochastic gradient methods, a class of very simple on-line methods, can be competitive; a toy sketch of such an update follows this list.
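To make the stochastic gradient idea concrete, here is a minimal sketch (not code from any of the papers above): a toy linear-chain CRF whose feature expectations are computed by brute-force enumeration over label sequences, so the per-example gradient update is easy to see. The label set and feature templates are illustrative assumptions only; a real implementation would compute expectations with forward-backward, as in the Sha and Pereira paper.

 import itertools
 import math
 import random
 from collections import defaultdict
 
 LABELS = ["B", "I", "O"]
 
 def features(x, y):
     """Emission (label, token) and transition (previous label, label) counts."""
     f = defaultdict(float)
     prev = "<s>"
     for tok, lab in zip(x, y):
         f[("emit", lab, tok)] += 1.0
         f[("trans", prev, lab)] += 1.0
         prev = lab
     return f
 
 def score(w, x, y):
     return sum(w.get(k, 0.0) * v for k, v in features(x, y).items())
 
 def expected_features(w, x):
     """E_{p_w(y|x)}[f(x,y)] by enumerating all label sequences (toy sizes only)."""
     ys = list(itertools.product(LABELS, repeat=len(x)))
     scores = [score(w, x, y) for y in ys]
     m = max(scores)
     unnorm = [math.exp(s - m) for s in scores]
     Z = sum(unnorm)
     exp_f = defaultdict(float)
     for y, u in zip(ys, unnorm):
         for k, v in features(x, y).items():
             exp_f[k] += (u / Z) * v
     return exp_f
 
 def sgd_train(data, epochs=20, eta=0.1, l2=0.01):
     """Stochastic gradient ascent on the L2-regularized conditional log-likelihood."""
     w = defaultdict(float)
     for _ in range(epochs):
         random.shuffle(data)
         for x, y in data:
             grad = features(x, y)                 # observed feature counts
             for k, v in expected_features(w, x).items():
                 grad[k] -= v                      # minus model expectations
             for k, g in grad.items():
                 w[k] += eta * (g - l2 * w[k])     # one online update per example
     return w
 
 # Toy usage: two hand-labeled sentences.
 data = [(["John", "Smith", "spoke"], ["B", "I", "O"]),
         (["Mary", "ran"], ["B", "O"])]
 weights = sgd_train(data)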
Background
- Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data, Lafferty et al., 2001. The original CRF paper; the linear-chain form of the model is restated after this list.
- An Introduction to Conditional Random Fields for Relational Learning, Sutton and McCallum. A longish tutorial overview of CRFs.
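For reference, and with the caveat that notation varies slightly across these readings, a linear-chain CRF defines a conditional distribution over a label sequence y = (y_1, ..., y_T) given an observation sequence x as:

 p(y \mid x) = \frac{1}{Z(x)} \exp\Big( \sum_{t=1}^{T} \sum_{k} \lambda_k f_k(y_{t-1}, y_t, x, t) \Big)
 
 Z(x) = \sum_{y'} \exp\Big( \sum_{t=1}^{T} \sum_{k} \lambda_k f_k(y'_{t-1}, y'_t, x, t) \Big)

Training maximizes the conditional log-likelihood of labeled sequences with respect to the weights \lambda_k, which is what the gradient-based methods in the optional readings (gradient tree boosting, stochastic gradient) compute in different ways.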