Empirical Risk Minimization
From Cohen Courses
This is a method proposed in Bahl et al. 1988, "A new algorithm for the estimation of hidden Markov model parameters".
In graphical models, Empirical Risk Minimization is an interesting training method that does not aim to maximize the likelihood of the training data. It has the following advantages (a small sketch contrasting the two objectives appears after this list):
- It may help prevent overfitting to the training data.
- Summing the local conditional likelihoods can be more resilient to errors than taking the product of the conditional likelihoods.
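As a rough illustration of the contrast between the two objectives, the following Python sketch fits a toy logistic model twice: once by standard maximum likelihood, i.e. maximizing the product of the conditional likelihoods, and once by empirical risk minimization with a per-example loss of one minus the local conditional likelihood, which amounts to summing the local likelihoods instead of multiplying them. The toy data, model, and optimizer here are illustrative assumptions, not taken from Bahl et al. 1988.

import numpy as np

# Toy binary classification data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                      # 20 examples, 3 features
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

def conditional_likelihoods(w):
    # p(y_i | x_i) under a logistic model with weights w.
    p = np.clip(sigmoid(X @ w), 1e-9, 1 - 1e-9)
    return np.where(y == 1, p, 1 - p)

def neg_log_likelihood(w):
    # Maximum-likelihood objective: the (log of the) product of the
    # conditional likelihoods, negated so it can be minimized.
    return -np.sum(np.log(conditional_likelihoods(w)))

def empirical_risk(w):
    # Empirical risk: average per-example loss, here 1 - p(y_i | x_i),
    # so minimizing it sums local likelihoods rather than multiplying them.
    return np.mean(1.0 - conditional_likelihoods(w))

def minimize(objective, w, lr=0.1, steps=300, eps=1e-5):
    # Crude finite-difference gradient descent; enough for a sketch.
    for _ in range(steps):
        grad = np.array([
            (objective(w + eps * e) - objective(w - eps * e)) / (2 * eps)
            for e in np.eye(len(w))
        ])
        w = w - lr * grad
    return w

w_mle = minimize(neg_log_likelihood, np.zeros(3))
w_erm = minimize(empirical_risk, np.zeros(3))
print("MLE weights:", w_mle)
print("ERM weights:", w_erm)

The bounded per-example loss used in the empirical risk is less dominated by a few very unlikely examples than the log-likelihood, which is one way to read the resilience claim above.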