Abstract
We leverage path differentiability and a recent result on nonsmooth implicit differentiation calculus to give sufficient conditions ensuring that the solution to a parametric monotone inclusion problem is path differentiable, together with formulas for computing its generalized gradient. A direct consequence of our result is that these solutions are differentiable almost everywhere. Our approach is fully compatible with automatic differentiation and relies on assumptions that are, roughly speaking, easy to check: semialgebraicity and strong monotonicity. We illustrate the scope of our results on three fundamental composite problem settings: strongly convex problems, dual solutions to convex minimization problems, and primal-dual solutions to min-max problems.
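As a minimal illustration of the implicit differentiation idea in the smooth special case (this is a sketch, not code from the paper; the quadratic objective, the solver, and all names below are illustrative assumptions), one can differentiate the solution map of a strongly convex quadratic problem in JAX and check the implicit vector-Jacobian product against the closed-form Jacobian:

```python
# Illustrative sketch: implicit differentiation of the solution map of a
# parametric strongly convex problem,
#   x*(theta) = argmin_x 0.5 * x @ A @ x - theta @ x,  A symmetric positive definite,
# whose optimality condition is A x*(theta) - theta = 0. The implicit function
# theorem gives dx*/dtheta = A^{-1}; the paper extends such formulas to
# nonsmooth, path-differentiable settings via conservative Jacobians.
import jax
import jax.numpy as jnp

A = jnp.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite

def solve(theta):
    # Inner solver: plain gradient descent on the strongly convex objective.
    x = jnp.zeros_like(theta)
    for _ in range(200):
        x = x - 0.1 * (A @ x - theta)
    return x

@jax.custom_vjp
def xstar(theta):
    return solve(theta)

def xstar_fwd(theta):
    # The optimality condition is linear in theta, so the backward pass
    # needs no residuals here.
    return solve(theta), None

def xstar_bwd(_, v):
    # Implicit VJP: solve A^T u = v instead of backpropagating through the
    # 200 solver iterations; exact at the fixed point.
    return (jnp.linalg.solve(A.T, v),)

xstar.defvjp(xstar_fwd, xstar_bwd)

theta = jnp.array([1.0, -2.0])
print(jax.jacrev(xstar)(theta))  # matches A^{-1}
print(jnp.linalg.inv(A))
```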
Replaces
Jérôme Bolte, Tam Le, Edouard Pauwels and Antonio Silveti-Falls, "Nonsmooth Implicit Differentiation for Machine Learning and Optimization", TSE Working Paper, no. 22-1314, March 2022.
Reference
Jérôme Bolte, Edouard Pauwels and Antonio Silveti-Falls, "Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems", SIAM Journal on Optimization, vol. 34, no. 1, 2024.
Published in
SIAM Journal on Optimization, vol. 34, no. 1, 2024