Stochastic Gradient Descent

== Summary ==
  
This is an optimization [[Category::method]], used in many learning algorithms such as training a [[AddressesProblem::Conditional Random Field]], to efficiently optimize the objective function, especially in the online setting.
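
Concretely, instead of computing the gradient of the full objective on every step, SGD updates the parameters after each training example (or small batch) using only that example's gradient, which is what makes it attractive in the online setting. Below is a minimal Python sketch of the idea on a toy least-squares problem; the function and variable names are illustrative and not part of the original page.

<pre>
# A minimal sketch of stochastic gradient descent (illustrative names only;
# assumes a per-example gradient function for the objective being minimized).
import random

def sgd(grad, w, data, lr=0.05, epochs=50):
    """Update parameters w using the gradient of one example at a time."""
    for _ in range(epochs):
        random.shuffle(data)                # visit examples in random order
        for example in data:
            g = grad(w, example)            # gradient on a single example
            w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w

# Toy usage: fit y = w * x by least squares on three points.
points = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]
grad_fn = lambda w, xy: [2.0 * (w[0] * xy[0] - xy[1]) * xy[0]]
print(sgd(grad_fn, [0.0], points))          # w[0] converges to roughly 2
</pre>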

== Problem formulation ==

== Forward-backward ==

== Related Concepts ==