== Summary ==

Stochastic gradient descent (SGD) is an optimization [[Category::method]] that updates the model parameters using the gradient computed on a single training example (or a small batch) rather than on the entire training set. Because each update is cheap, it is well suited to the online setting and is used in many algorithms, for example to efficiently optimize the objective function of a [[AddressesMethod::Conditional Random Field]].
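
A minimal sketch of how such a per-example update works, assuming a loss whose gradient can be computed from one training example at a time; the function names (<code>sgd</code>, <code>gradient</code>) and the toy least-squares data below are illustrative assumptions, not part of this page.

<pre>
import random

def sgd(gradient, init_w, data, learning_rate=0.1, num_epochs=10):
    """Minimize an objective of the form sum_i loss(w, x_i) by taking a
    small gradient step on one randomly chosen example at a time."""
    w = list(init_w)
    for _ in range(num_epochs):
        random.shuffle(data)          # visit the examples in random order
        for example in data:
            g = gradient(w, example)  # gradient of the loss on this example only
            w = [wj - learning_rate * gj for wj, gj in zip(w, g)]
    return w

# Toy usage: one-dimensional least squares on (x, y) pairs.
# Per-example loss 0.5 * (w*x - y)^2 has gradient (w*x - y) * x.
data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]
grad = lambda w, xy: [(w[0] * xy[0] - xy[1]) * xy[0]]
print(sgd(grad, [0.0], data, learning_rate=0.05, num_epochs=50))  # w close to 2
</pre>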
== Problem formulation ==
== Forward-backward ==
== Related Concepts ==
