VA & Opt Webinar: Akiko Takeda (University of Tokyo)

Title: Deterministic and Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization

Speaker: Akiko Takeda (University of Tokyo)

Date and Time: July 29th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Our work focuses on deterministic and stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on stochastic gradient methods for this class of problems is quite limited, and until recently no non-asymptotic convergence results had been reported. After presenting a deterministic approach, we introduce simple stochastic gradient algorithms for finite-sum and general stochastic optimization problems, with superior convergence complexities compared to the current state of the art. We also compare the algorithms' practical performance on empirical risk minimization problems.
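
For readers unfamiliar with the setting, the problem class described in the abstract can be written as (our notation, not taken from the talk):

    min_{x in R^d}  F(x) := f(x) + r(x),

where f is a smooth, possibly non-convex loss (in the finite-sum case f(x) = (1/n) * sum_{i=1}^n f_i(x)) and r is a non-smooth non-convex regularizer such as MCP or SCAD.

Below is a minimal sketch of one generic algorithm of this type: a proximal stochastic gradient iteration with the MCP penalty, whose proximal map has a closed form. This is purely illustrative and is not the speaker's algorithm; the names mcp_prox and prox_sgd, the least-squares loss, and all parameter values are our assumptions.

    import numpy as np

    def mcp_prox(v, step, lam=0.1, gamma=3.0):
        """Closed-form proximal map of the (non-convex) MCP penalty.

        Valid when gamma > step, so each scalar subproblem stays strongly convex.
        """
        absv = np.abs(v)
        out = np.where(absv <= step * lam, 0.0, v)         # small entries -> 0
        mid = (absv > step * lam) & (absv <= gamma * lam)  # shrinkage region
        out = np.where(mid, np.sign(v) * (absv - step * lam) / (1.0 - step / gamma), out)
        return out                                         # large entries left untouched

    def prox_sgd(A, b, n_iters=2000, step=0.01, batch=8, seed=0):
        """Proximal SGD on (1/2n) * ||A x - b||^2 + MCP(x)."""
        rng = np.random.default_rng(seed)
        n, d = A.shape
        x = np.zeros(d)
        for _ in range(n_iters):
            idx = rng.integers(0, n, size=batch)             # sample a mini-batch
            grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic gradient of the loss
            x = mcp_prox(x - step * grad, step)              # proximal step on the regularizer
        return x

    # Toy usage: recover a sparse vector from noisy linear measurements.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    print(prox_sgd(A, b)[:8])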

This talk is based on joint work with Tianxiang Liu, Ting Kei Pong, and Michael R. Metel.