Gradient Descent

From Cohen Courses

Gradient Descent is an iterative method for finding a local minimum (or maximum) of a function, in which steps are taken in the direction of, and proportional to, the negative (or positive) of the gradient of the function being optimized.
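The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not from the original page: the objective f(x) = (x - 3)^2, its gradient 2(x - 3), and the step size 0.1 are all assumed for the example.

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3). The learning rate and
# iteration count are illustrative choices, not fixed constants.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step in the direction of the negative gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move opposite the gradient to descend
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this convex objective the iterates converge to the minimizer x = 3; to ascend instead (gradient ascent), the update would add the gradient rather than subtract it.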

Wiki Link

Relevant Papers