
Séminaire Images Optimisation et Probabilités

Stochastic Online Convex Optimization: Fast Convergence Rates via Self-Normalized Martingale Inequalities

Olivier Wintenberger

( LPSM (Sorbonne Université) )

Conference room

March 26, 2026 at 11:15 AM

This talk presents an extension of the Online Convex Optimization framework to a stochastic adversarial setting.

Under the Stochastic Directional Derivative condition, we prove that the Online Newton Step algorithm achieves fast convergence rates. 
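For readers unfamiliar with the Online Newton Step, below is a minimal sketch of the generic textbook algorithm (Hazan, Agarwal and Kale, 2007), not the variant analyzed in the talk: the learner accumulates rank-one gradient outer products into a matrix A and takes a Newton-like step preconditioned by A⁻¹. The function names and parameters (`grad_fn`, `gamma`, `eps`) are illustrative, and the projection onto a bounded convex domain is omitted for brevity.

```python
import numpy as np

def online_newton_step(grad_fn, x0, T, gamma=1.0, eps=1.0):
    """Minimal unconstrained Online Newton Step sketch.

    grad_fn(t, x) returns the gradient of the round-t loss at x.
    eps initializes A = eps * I so that A stays invertible.
    """
    x = np.array(x0, dtype=float)
    d = x.size
    A = eps * np.eye(d)          # running Gram matrix of observed gradients
    iterates = [x.copy()]
    for t in range(T):
        g = grad_fn(t, x)
        A += np.outer(g, g)      # rank-one curvature update
        # Newton-like step: solve A p = g instead of forming A^{-1}
        x = x - (1.0 / gamma) * np.linalg.solve(A, g)
        iterates.append(x.copy())
    return iterates
```

On a fixed quadratic loss the iterates drift toward the minimizer with steps that shrink as curvature accumulates, which is the mechanism behind the fast (logarithmic-regret) rates for exp-concave losses.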

A distinguishing feature of our approach is its applicability to non-convex loss functions, significantly broadening its scope. 

We illustrate this with applications to non-convex losses, such as the likelihood of volatility models.

The analysis hinges on a fundamental self-normalized martingale inequality due to Bercu and Touati (2008).
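To give a flavor of this family of results (this is a representative statement from that line of work, sketched from memory; the precise form used in the talk is in Bercu and Touati (2008)): for a locally square-integrable martingale $(M_n)$ with predictable quadratic variation $\langle M\rangle_n$ and total quadratic variation $[M]_n$,
$$
\mathbb{P}\big(|M_n| \ge x,\; [M]_n + \langle M\rangle_n \le y\big) \;\le\; 2\exp\!\Big(-\frac{x^2}{2y}\Big), \qquad x, y > 0.
$$
The self-normalization by the quadratic variations is what yields exponential concentration without boundedness assumptions on the increments.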


This work builds upon Wintenberger (2024), "Stochastic Online Convex Optimization; Application to Probabilistic Time Series Forecasting", Electronic Journal of Statistics, 18, 429–464.