
PhD Students' Seminar

Alternate gradient schemes for nonconvex-strongly concave min-max problems

Guido Samuel TAPIA RIERA

(IMB - IOP team)

Conference room

8 April 2026 at 17:00

We present a new algorithm for min-max optimization problems of the form min_x max_y ϕ(x, y) − h(y), where ϕ has a Lipschitz-continuous gradient, is weakly convex in x and concave in y, and either ϕ or h is strongly concave in y. Problems of this type arise in several applications, including generative adversarial networks (GANs), online learning, and deep learning.

State-of-the-art optimization methods for such problems typically rely on alternating explicit gradient descent-ascent steps applied to the coupling term ϕ. In this work, we prove convergence of these alternating strategies under relaxed step-size conditions compared to prior work. We also introduce a proximal alternating algorithm and establish similar guarantees. This is particularly useful for minimizing prox-friendly objective functions, arising from explicit regularization or Plug-and-Play applications.
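To fix ideas, here is a minimal sketch (not the speaker's algorithm) of an alternating proximal gradient descent-ascent scheme on a toy instance of min_x max_y ϕ(x, y) − h(y). The choices ϕ(x, y) = x·y and h(y) = (μ/2)·y², the step sizes, and the iteration count are all illustrative assumptions; the objective is strongly concave in y through −h, and the saddle point is (0, 0).

```python
# Toy instance of min_x max_y phi(x, y) - h(y) (illustrative choices):
#   phi(x, y) = x * y        -- smooth coupling term, linear in each variable
#   h(y) = (mu / 2) * y**2   -- strongly convex, so -h is strongly concave in y
# Saddle point: (x, y) = (0, 0).
mu = 1.0

def grad_phi_x(x, y):
    return y  # partial derivative of x*y with respect to x

def grad_phi_y(x, y):
    return x  # partial derivative of x*y with respect to y

def prox_h(v, tau):
    # prox of tau*h for h(y) = (mu/2) y^2 has the closed form v / (1 + tau*mu),
    # which is what makes h "prox-friendly" here
    return v / (1.0 + tau * mu)

def alternating_prox_gda(x0, y0, sigma=0.1, tau=0.1, iters=2000):
    """Alternating steps: explicit gradient ascent-descent on the coupling
    term phi, with a proximal step handling h."""
    x, y = x0, y0
    for _ in range(iters):
        # ascent step in y on phi, followed by the proximal step on h
        y = prox_h(y + tau * grad_phi_y(x, y), tau)
        # descent step in x on phi (alternating: uses the freshly updated y)
        x = x - sigma * grad_phi_x(x, y)
    return x, y

x, y = alternating_prox_gda(2.0, -1.0)
print(x, y)  # both iterates approach the saddle point (0, 0)
```

The alternating structure, in which the x-update sees the already-updated y, is exactly the kind of scheme whose step-size conditions the talk addresses; the closed-form prox above illustrates why prox-friendly h (e.g. explicit regularizers) makes the proximal variant attractive.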