Gradient Descent
Gradient Descent is a method for finding a local minimum (or maximum) of a function: at each iteration, a step is taken in the direction of, and proportional in size to, the negative (for minimization) or positive (for maximization) of the gradient of the function being optimized.
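Concretely, for minimization the update at each iteration is <math>x_{k+1} = x_k - \gamma \nabla f(x_k)</math>, where <math>\gamma > 0</math> is the step size. Below is a minimal Python sketch of this update for a one-dimensional objective; the step size, tolerance, stopping rule, and example function are illustrative assumptions, not taken from any particular reference implementation.

<syntaxhighlight lang="python">
# A minimal sketch of gradient descent for minimization, assuming we can
# evaluate the gradient of the objective. The step size, tolerance, and
# example function below are illustrative choices.

def gradient_descent(grad, x0, step_size=0.1, tol=1e-6, max_iters=1000):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(max_iters):
        x_next = x - step_size * grad(x)  # move in the negative-gradient direction
        if abs(x_next - x) < tol:         # stop once the updates become negligible
            return x_next
        x = x_next
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges to approximately 3.0
</syntaxhighlight>

Choosing the step size is the main practical difficulty: too large and the iterates can overshoot or diverge, too small and convergence is needlessly slow.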
| + | |||
[http://en.wikipedia.org/wiki/Gradient_descent Wiki Link] | [http://en.wikipedia.org/wiki/Gradient_descent Wiki Link] | ||
| + | |||
| + | == Relevant Papers == | ||
| + | |||
{{#ask: [[UsesMethod::Gradient Descent]]
| ?AddressesProblem
| ?UsesDataset
}}