Gradient Descent


Gradient Descent is a method for finding a local minimum (or maximum) of a function, in which steps are taken in the direction of, and proportional to, the negative (or positive) of the gradient of the function being optimized. To find a minimum, each iteration updates the current point x via x ← x − γ∇f(x), where γ > 0 is the step size.
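
The following is a minimal sketch of this update rule in Python; the objective function, its gradient, the step size, and the stopping rule are illustrative assumptions and not part of the original page.

def gradient_descent(grad, x0, step_size=0.1, tol=1e-8, max_iters=1000):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(max_iters):
        g = grad(x)
        if abs(g) < tol:       # gradient near zero: a (local) minimum has been reached
            break
        x = x - step_size * g  # update rule: x <- x - step_size * grad f(x)
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # approximately 3.0

For maximization, the sign of the step is flipped so that the point moves in the direction of the positive gradient.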

Wiki Link

Relevant Papers