VA & Opt Webinar: Xiaoqi Yang (Hong Kong PolyU)

Title:  On error bound moduli for locally Lipschitz and regular functions

Speaker: Xiaoqi Yang (Hong Kong PolyU)

Date and Time: August 12th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We first introduce, for a closed and convex set, two classes of subsets: the near and far ends relative to a point, and give full characterizations of these end sets by means of the face theory of closed and convex sets. We provide some connections between the closedness of the far (near) end and the relative continuity of the gauge (cogauge) for closed and convex sets. We illustrate that the distance from 0 to the outer limiting subdifferential of the support function of the subdifferential set, which is essentially the distance from 0 to the end set of the subdifferential set, is an upper estimate of the local error bound modulus. This upper estimate becomes tight for a convex function under some regularity conditions. We show that the distance from 0 to the outer limiting subdifferential set of a lower C^1 function is equal to the local error bound modulus.
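For orientation, one common formulation of the central quantity (the talk's exact definitions are in the references below, and conventions vary slightly across the literature): with S the sublevel set of f at x̄, the local error bound modulus can be written as

```latex
% One common convention; notation may differ from the talk.
% S := \{ x : f(x) \le f(\bar{x}) \}
\operatorname{Er} f(\bar{x})
  \;=\; \liminf_{\substack{x \to \bar{x} \\ f(x) > f(\bar{x})}}
        \frac{f(x) - f(\bar{x})}{d(x, S)},
```

so that, roughly, f admits a local error bound d(x, S) ≤ τ^{-1} [f(x) − f(x̄)]_+ near x̄ for every τ below this modulus.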


References:
Li, M.H., Meng, K.W. and Yang, X.Q., On far and near ends of closed and convex sets, Journal of Convex Analysis, 27 (2020), 407–421.
Li, M.H., Meng, K.W. and Yang, X.Q., On error bound moduli for locally Lipschitz and regular functions, Mathematical Programming, 171 (2018), 463–487.

VA & Opt Webinar: Evgeni Nurminski (FEFU)

Title: Practical Projection with Applications

Speaker: Evgeni Nurminski
(FEFU, Vladivostok)

Date and Time: August 5th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Projection of a point onto a given set is a very common computational operation in countless algorithms and applications. However, with the exception of the simplest sets, it is by itself a nontrivial operation, often complicated by large dimension, computational degeneracy, nonuniqueness (even, in certain situations, for orthogonal projection onto convex sets), and so on. This talk aims to present some practical solutions, i.e. finite algorithms, for projection onto polyhedral sets, among them simplices, polytopes, polyhedra, and finitely generated cones, with some discussion of “nonlinearities”, decomposition and parallel computation. We also consider applications of the projection operation in linear optimization and an epi-projection algorithm for convex optimization.
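As a concrete instance of a finite projection algorithm for one of the simplest polyhedral sets, here is a sketch of the well-known sort-based Euclidean projection onto the probability simplex (a standard textbook algorithm, not necessarily the method presented in the talk):

```python
def project_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}.

    Sort-based finite algorithm: find the shift theta such that
    max(v_i - theta, 0) sums to 1, then clip.
    """
    u = sorted(v, reverse=True)
    cumsum = 0.0
    theta = 0.0
    for i, ui in enumerate(u, start=1):
        cumsum += ui
        t = (cumsum - 1.0) / i
        if ui - t > 0:       # component i is still active at shift t
            theta = t
    return [max(vi - theta, 0.0) for vi in v]
```

The sort makes this O(n log n); the output is exact (up to floating point), not iterative.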

VA & Opt Webinar: Akiko Takeda (University of Tokyo)

Title: Deterministic and Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization

Speaker: Akiko Takeda (University of Tokyo)

Date and Time: July 29th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Our work focuses on deterministic and stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on stochastic gradient methods for this setting is quite limited, and until recently no non-asymptotic convergence results had been reported. After presenting a deterministic approach, we describe simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, which have superior convergence complexities compared to the current state of the art. We also compare our algorithms’ performance in practice for empirical risk minimization.
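To make the problem class concrete, here is a minimal proximal-gradient sketch for a smooth least-squares loss plus the non-smooth non-convex ℓ0 penalty, whose proximal map is hard thresholding. This is an illustrative textbook scheme, not the algorithms proposed in the talk:

```python
def prox_grad_l0(A, b, lam, step, iters=200):
    """Proximal gradient for 0.5*||Ax - b||^2 + lam*||x||_0.

    The prox of lam*||.||_0 with step t keeps a component z_j
    only if |z_j| > sqrt(2*t*lam) (hard thresholding).
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n
    thr = (2.0 * step * lam) ** 0.5
    for _ in range(iters):
        # residual r = Ax - b and gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        z = [x[j] - step * g[j] for j in range(n)]
        x = [zj if abs(zj) > thr else 0.0 for zj in z]
    return x
```

With a nonconvex penalty like this, only convergence to stationary points can be expected; quantifying that rate (deterministically and stochastically) is the subject of the talk.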


This is based on joint work with Tianxiang Liu, Ting Kei Pong and Michael R. Metel.

VA & Opt Webinar: Oliver Stein (KIT)

Title: A general branch-and-bound framework for global multiobjective optimization

Speaker: Oliver Stein (KIT)

Date and Time: July 22nd, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We develop a general framework for branch-and-bound methods in multiobjective optimization. Our focus is on natural generalizations of notions and techniques from the single objective case. In particular, after the notions of upper and lower bounds on the globally optimal value from the single objective case have been transferred to upper and lower bounding sets on the set of nondominated points for multiobjective programs, we discuss several possibilities for discarding tests. They compare local upper bounds of the provisional nondominated sets with relaxations of partial upper image sets, where the latter can stem from ideal point estimates, from convex relaxations, or from relaxations by a reformulation-linearization technique.
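A minimal sketch of the ideal-point flavour of such a discarding test (a deliberate simplification, not the authors' full framework: a subproblem is represented only by an ideal-point lower bound vector, and the provisional nondominated set only by its list of local upper bounds):

```python
def can_discard(ideal, local_upper_bounds):
    """A subproblem can still contain nondominated points only if its
    ideal-point lower bound lies strictly below some local upper bound
    in every component; otherwise the subproblem is discarded."""
    return not any(
        all(l < u for l, u in zip(ideal, ub))
        for ub in local_upper_bounds
    )
```

Tighter relaxations of the partial upper image set (e.g. convex relaxations instead of the ideal point) plug into the same comparison and discard more subproblems earlier.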

The discussion of approximation properties of the provisional nondominated set leads to the suggestion for a natural selection rule along with a natural termination criterion. Finally we discuss some issues which do not occur in the single objective case and which impede some desirable convergence properties, thus also motivating a natural generalization of the convergence concept.

This is joint work with Gabriele Eichfelder, Peter Kirst, and Laura Meng.

VA & Opt Webinar: James Saunderson (Monash)

Title: Lifting for simplicity: concise descriptions of convex sets

Speaker: James Saunderson (Monash University)

Date and Time: July 15th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: This talk will give a selective tour through the theory and applications of lifts of convex sets. A lift of a convex set is a higher-dimensional convex set that projects onto the original set. Many interesting convex sets have lifts that are dramatically simpler to describe than the original set. Finding such simple lifts has significant algorithmic implications, particularly for associated optimization problems. We will consider both the classical case of polyhedral lifts, which are described by linear inequalities, as well as spectrahedral lifts, which are defined by linear matrix inequalities. The tour will include discussion of ways to construct lifts, ways to find obstructions to the existence of lifts, and a number of interesting examples from a variety of mathematical contexts. (Based on joint work with H. Fawzi, J. Gouveia, P. Parrilo, and R. Thomas).
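A classical toy illustration of the savings a lift can give, consistent with the talk's theme: the unit ℓ1 ball in R^n has 2^n facet inequalities, but a lift with n extra variables needs only O(n) linear constraints. The sketch below checks numerically that the two descriptions agree:

```python
from itertools import product

def in_l1_ball_direct(x):
    """Direct description: 2^n facet inequalities s.x <= 1,
    one per sign vector s."""
    return all(
        sum(s * xi for s, xi in zip(signs, x)) <= 1.0
        for signs in product((-1.0, 1.0), repeat=len(x))
    )

def in_l1_ball_lifted(x):
    """Lifted description: exists y with -y_i <= x_i <= y_i and
    sum(y) <= 1, i.e. 2n + 1 constraints. The minimal certificate
    is y_i = |x_i|; projecting out y recovers the ℓ1 ball."""
    y = [abs(xi) for xi in x]
    return sum(y) <= 1.0
```

The exponential-versus-linear gap here is exactly the phenomenon the talk studies in general: when does a convex set admit a projection description that is dramatically smaller than the set itself?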