Subgradient
Summary
The subgradient method is an iterative method for solving convex optimization problems. Subgradient methods converge even when the objective function is non-differentiable. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of steepest descent.
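A minimal sketch of the method in Python follows; the objective f(x) = ||Ax - b||_1, the random data, the diminishing step-size rule, and the iteration count are illustrative assumptions, not taken from the article. The iterate moves along the negative of a subgradient, and the best point found so far is tracked because an individual step need not decrease the objective.

    import numpy as np

    def subgradient_method(A, b, num_iters=500):
        # Minimize the non-differentiable convex objective f(x) = ||Ax - b||_1 (illustrative choice).
        f = lambda x: np.sum(np.abs(A @ x - b))
        x = np.zeros(A.shape[1])
        x_best, f_best = x.copy(), f(x)
        for k in range(num_iters):
            g = A.T @ np.sign(A @ x - b)      # a subgradient of f at x
            alpha = 1.0 / np.sqrt(k + 1)      # diminishing step size
            x = x - alpha * g                 # subgradient step (not necessarily a descent step)
            if f(x) < f_best:                 # so keep the best iterate seen so far
                x_best, f_best = x.copy(), f(x)
        return x_best, f_best

    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    x_star, f_star = subgradient_method(A, b)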
Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions. However, Newton's method fails to converge on problems that have non-differentiable kinks.
In recent years, some interior-point methods have been suggested for convex minimization problems, but subgradient projection methods and related bundle methods of descent remain competitive. For convex minimization problems with a very large number of dimensions, subgradient-projection methods are well suited because they require little storage.
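The low storage cost is visible in a sketch of the projected subgradient method: only the current iterate, the best iterate, and one subgradient are held in memory. The box constraint 0 <= x <= 1, the objective, and the step-size rule below are illustrative assumptions.

    import numpy as np

    def projected_subgradient(A, b, lo=0.0, hi=1.0, num_iters=500):
        # Minimize f(x) = ||Ax - b||_1 subject to the box constraint lo <= x <= hi (illustrative choice).
        f = lambda x: np.sum(np.abs(A @ x - b))
        x = np.clip(np.zeros(A.shape[1]), lo, hi)
        x_best, f_best = x.copy(), f(x)
        for k in range(num_iters):
            g = A.T @ np.sign(A @ x - b)          # a subgradient of f at x
            alpha = 1.0 / np.sqrt(k + 1)          # diminishing step size
            x = np.clip(x - alpha * g, lo, hi)    # project the step back onto the box
            if f(x) < f_best:
                x_best, f_best = x.copy(), f(x)
        return x_best, f_best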
Subgradient projection methods are often applied to large-scale problems in combination with decomposition techniques. Such decomposition methods often yield a simple distributed algorithm for the problem, as sketched below.
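One common pattern is dual decomposition: a subgradient step is taken on the dual variable while each subproblem is solved locally. The toy problem below (minimize sum_i 0.5*(x_i - a_i)^2 subject to sum_i x_i = c) and the step size are illustrative assumptions; each x_i update uses only local data and the shared dual variable, so it could run on a separate machine.

    import numpy as np

    def dual_decomposition(a, c, num_iters=200, alpha=0.1):
        # Minimize sum_i 0.5*(x_i - a_i)^2 subject to the coupling constraint sum_i x_i = c (illustrative problem).
        lam = 0.0                          # dual variable (price) on the coupling constraint
        for _ in range(num_iters):
            x = a - lam                    # each agent minimizes 0.5*(x_i - a_i)^2 + lam * x_i locally
            residual = x.sum() - c         # a subgradient of the concave dual function at lam
            lam = lam + alpha * residual   # subgradient ascent on the dual
        return x, lam

    a = np.array([1.0, 2.0, 3.0, 4.0])
    x_opt, price = dual_decomposition(a, c=6.0)   # x_opt sums (approximately) to c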
References / Links
- Wikipedia article on Subgradient Methods: http://en.wikipedia.org/wiki/Subgradient_method