---
res:
bibo_abstract:
- "In this article, we show how second-order derivative information can be incorporated into gradient sampling methods for nonsmooth optimization. The second-order information we consider is essentially the set of coefficients of all second-order Taylor expansions of the objective in a closed ball around a given point. Based on this concept, we define a model of the objective as the maximum of these Taylor expansions. Iteratively minimizing this model (constrained to the closed ball) results in a simple descent method, for which we prove convergence to minimal points when the objective is convex. To obtain an implementable method, we construct an approximation scheme for the second-order information based on sampling objective values, gradients and Hessian matrices at finitely many points. Using a set of test problems, we compare the resulting method to five other available solvers. Considering the number of function evaluations, the results suggest that the method we propose is superior to the standard gradient sampling method and competitive with the other methods.@eng"
bibo_authorlist:
- foaf_Person:
foaf_givenName: Bennet
foaf_name: Gebken, Bennet
foaf_surname: Gebken
foaf_workInfoHomepage: http://www.librecat.org/personId=32643
dct_date: 2022^xs_gYear
dct_language: eng
dct_title: Using second-order information in gradient sampling methods for nonsmooth optimization@eng
...