Gradient Descent
From Cohen Courses
Gradient Descent is a method for finding a local minimum or maximum of a function, in which steps are taken in the direction of, and proportional to, the negative (for minimization) or positive (for maximization) of the gradient of the function being optimized. Wiki Link
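The standard minimization update is x_{n+1} = x_n - γ ∇F(x_n), where γ > 0 is the step size. As an illustration (not part of the original page), here is a minimal Python sketch of that rule; the step size, tolerance, and example function are assumptions chosen for the example, not prescribed by this page.

<pre>
# Minimal gradient-descent sketch (illustrative; the step size and
# stopping rule below are assumptions, not part of the original page).

def gradient_descent(grad, x0, step_size=0.1, tol=1e-6, max_iters=1000):
    """Minimize a function given its gradient `grad`, starting from `x0`.

    Each iteration moves in the direction of the negative gradient,
    proportional to `step_size`. For maximization, one would move along
    the positive gradient instead.
    """
    x = x0
    for _ in range(max_iters):
        x_new = x - step_size * grad(x)
        if abs(x_new - x) < tol:  # stop when updates become negligible
            return x_new
        x = x_new
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # approximately 3.0
</pre>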