VA & Opt Webinar: Jein-Shan Chen (NTNU)

Title: Two approaches for absolute value equation by using smoothing functions

Speaker: Jein-Shan Chen (NTNU)

Date and Time: August 26th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk, we present two approaches for solving the absolute value equation. These two approaches are based on smoothing functions; in particular, there are several systematic ways of constructing such functions. Numerical experiments with comparisons are reported, which suggest what kinds of smoothing functions work well with the proposed approaches.
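For readers unfamiliar with the topic: the absolute value equation is Ax − |x| = b (componentwise absolute value). One standard smoothing idea, offered here only as a toy sketch and not as the speaker's method, replaces |t| by φ_μ(t) = √(t² + μ²) and applies Newton's method while driving μ to zero. The 2×2 matrix A and right-hand side b below are made-up data chosen so a unique solution exists:

```python
import math

def phi(t, mu):            # smooth approximation of |t|
    return math.sqrt(t * t + mu * mu)

def dphi(t, mu):           # its derivative, always in (-1, 1)
    return t / math.sqrt(t * t + mu * mu)

def solve2(J, r):
    # solve the 2x2 linear system J d = r by the explicit inverse
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    return [( J[1][1] * r[0] - J[0][1] * r[1]) / det,
            (-J[1][0] * r[0] + J[0][0] * r[1]) / det]

def ave_smoothing_newton(A, b, mu=1.0, iters=60):
    # Newton's method on F(x) = A x - phi_mu(x) - b, shrinking mu each step
    x = [0.0, 0.0]
    for _ in range(iters):
        F = [A[i][0] * x[0] + A[i][1] * x[1] - phi(x[i], mu) - b[i]
             for i in range(2)]
        J = [[A[i][j] - (dphi(x[i], mu) if i == j else 0.0)
              for j in range(2)] for i in range(2)]
        d = solve2(J, [-F[0], -F[1]])
        x = [x[0] + d[0], x[1] + d[1]]
        mu = max(mu * 0.5, 1e-12)   # drive the smoothing parameter to zero
    return x

# hypothetical data: singular values of A exceed 1, so the AVE has a unique solution
A = [[3.0, 1.0], [1.0, 3.0]]
b = [0.0, -7.0]               # constructed from the known solution x* = (1, -2)
x = ave_smoothing_newton(A, b)
```

Because every |dφ_μ| < 1 while the smallest singular value of A is 2, the Newton systems stay nonsingular throughout.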

VA & Opt Webinar: Hieu Thao Nguyen (TU Delft)

Title: Projection Algorithms for Phase Retrieval with High Numerical Aperture

Speaker: Hieu Thao Nguyen (TU Delft)

Date and Time: August 19th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We develop the mathematical framework in which the class of projection algorithms can be applied to high numerical aperture (NA) phase retrieval. Within this framework we first analyze the basic steps of solving this problem by projection algorithms and establish the closed forms of all the relevant prox-operators. We then study the geometry of the high-NA phase retrieval problem and the obtained results are subsequently used to establish convergence criteria of projection algorithms. Making use of the vectorial point-spread-function (PSF) is, on the one hand, the key difference between this work and the literature of phase retrieval mathematics which mostly deals with the scalar PSF. The results of this paper, on the other hand, can be viewed as extensions of those concerning projection methods for low-NA phase retrieval. Importantly, the improved performance of projection methods over the other classes of phase retrieval algorithms in the low-NA setting now also becomes applicable to the high-NA case. This is demonstrated by the accompanying numerical results which show that all available solution approaches for high-NA phase retrieval are outperformed by projection methods.
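The scalar-PSF, low-NA setting that the abstract contrasts with can be illustrated by the classical error-reduction iteration, which alternates a projection onto the measured Fourier-magnitude set with a projection onto a support constraint. The signal, support, and initial guess below are hypothetical toy data; this is a sketch of the generic projection scheme, not the high-NA algorithm of the talk:

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
            for k in range(n)]

def idft(y):
    n = len(y)
    return [sum(y[k] * cmath.exp(2j * math.pi * k * j / n) for k in range(n)) / n
            for j in range(n)]

def project_magnitude(y, mags):
    # keep each Fourier coefficient's phase, impose the measured modulus
    return [m * (c / abs(c)) if abs(c) > 1e-14 else complex(m)
            for c, m in zip(y, mags)]

def project_support(x, support):
    # keep the real part on the support, zero elsewhere
    return [x[j].real if support[j] else 0.0 for j in range(len(x))]

# hypothetical ground truth: a real signal supported on the first 4 of 8 samples
true = [1.0, -2.0, 0.5, 3.0, 0.0, 0.0, 0.0, 0.0]
support = [True] * 4 + [False] * 4
mags = [abs(c) for c in dft(true)]        # "measured" Fourier magnitudes

def residual(x):                          # Fourier-magnitude mismatch
    return math.sqrt(sum((abs(c) - m) ** 2 for c, m in zip(dft(x), mags)))

x = [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]   # initial guess
err0 = residual(x)
for _ in range(200):
    x = project_support(idft(project_magnitude(dft(x), mags)), support)
err = residual(x)
```

Error reduction never increases the magnitude residual, which is one reason projection methods are attractive baselines in this problem class.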

VA & Opt Webinar: Xiaoqi Yang (Hong Kong PolyU)

Title: On error bound moduli for locally Lipschitz and regular functions

Speaker: Xiaoqi Yang (Hong Kong PolyU)

Date and Time: August 12th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We first introduce for a closed and convex set two classes of subsets: the near and far ends relative to a point, and give some full characterizations for these end sets by virtue of the face theory of closed and convex sets. We provide some connections between closedness of the far (near) end and the relative continuity of the gauge (cogauge) for closed and convex sets. We illustrate that the distance from 0 to the outer limiting subdifferential of the support function of the subdifferential set, which is essentially the distance from 0 to the end set of the subdifferential set, is an upper estimate of the local error bound modulus. This upper estimate becomes tight for a convex function under some regularity conditions. We show that the distance from 0 to the outer limiting subdifferential set of a lower C^1 function is equal to the local error bound modulus.


References:
Li, M.H., Meng, K.W. and Yang, X.Q., On far and near ends of closed and convex sets, Journal of Convex Analysis 27 (2020), 407–421.
Li, M.H., Meng, K.W. and Yang, X.Q., On error bound moduli for locally Lipschitz and regular functions, Mathematical Programming 171 (2018), 463–487.
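For orientation, one common formulation of the central quantity in the abstract is the following (our notation, which may differ from the papers above): with S := {x : f(x) ≤ f(x̄)},

```latex
\operatorname{Er} f(\bar x)
\;:=\;
\liminf_{\substack{x \to \bar x \\ f(x) > f(\bar x)}}
\frac{f(x) - f(\bar x)}{d(x, S)} ,
```

and f is said to have a local error bound at x̄ precisely when Er f(x̄) > 0, i.e. when d(x, S) ≤ c (f(x) − f(x̄))₊ holds near x̄ for some c > 0. The talk's upper estimate bounds this modulus by a distance to an outer limiting subdifferential set.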

VA & Opt Webinar: Evgeni Nurminski (FEFU)

Title: Practical Projection with Applications

Speaker: Evgeni Nurminski (FEFU, Vladivostok)

Date and Time: August 5th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Projection of a point onto a given set is a very common computational operation in an endless number of algorithms and applications. However, with the exception of the simplest sets, it is in itself a nontrivial operation, often complicated by large dimension, computational degeneracy, nonuniqueness (which can occur even for orthogonal projection onto convex sets in certain situations), and so on. This talk aims to present some practical solutions, i.e. finite algorithms, for projection onto polyhedral sets, among them simplices, polytopes, polyhedra, and finitely generated cones, with some discussion of “nonlinearities”, decomposition, and parallel computation. We also consider the application of the projection operation in linear optimization and an epi-projection algorithm for convex optimization.
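For the simplest polyhedral set mentioned, the unit simplex, a well-known finite algorithm computes the Euclidean projection exactly: sort, find the correct shift, then clip. A minimal sketch, not the speaker's implementation:

```python
def project_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}
    via the classical finite sort-based algorithm."""
    u = sorted(v, reverse=True)
    cumsum, theta = 0.0, 0.0
    for j, uj in enumerate(u, start=1):
        cumsum += uj
        t = (cumsum - 1.0) / j
        if uj - t > 0:          # index j is still "active"
            theta = t           # last active t is the correct shift
    return [max(vi - theta, 0.0) for vi in v]
```

For example, `project_simplex([0.5, 0.5, 0.5])` returns the barycentre (1/3, 1/3, 1/3), and a point already on the simplex is returned unchanged.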

VA & Opt Webinar: Akiko Takeda (University of Tokyo)

Title: Deterministic and Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization

Speaker: Akiko Takeda (University of Tokyo)

Date and Time: July 29th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Our work focuses on deterministic/stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on stochastic gradient methods is quite limited, and until recently no non-asymptotic convergence results have been reported. After showing a deterministic approach, we present simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, which have superior convergence complexities compared to the current state-of-the-art. We also compare our algorithms’ performance in practice for empirical risk minimization.


This is based on joint work with Tianxiang Liu, Ting Kei Pong and Michael R. Metel.
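As a toy illustration of the problem class (not the algorithms of the talk): the non-convex ℓ0 penalty admits a closed-form proximal map, hard thresholding, so a deterministic proximal gradient loop is easy to sketch. The data A, b and the parameters below are made up, with the step size eta chosen below 1/L for the smooth part:

```python
def grad(A, b, x):
    # gradient of the smooth part f(x) = 0.5 * ||A x - b||^2, i.e. A^T (A x - b)
    r = [sum(A[i][j] * x[j] for j in range(len(x))) - b[i] for i in range(len(b))]
    return [sum(A[i][j] * r[i] for i in range(len(b))) for j in range(len(x))]

def hard_threshold(z, t):
    # proximal map of x -> t * ||x||_0: keep z_i only if z_i^2 > 2 t
    return [zi if zi * zi > 2.0 * t else 0.0 for zi in z]

def prox_gradient(A, b, lam=0.1, eta=0.9, iters=300):
    x = [0.0] * len(A[0])
    for _ in range(iters):
        g = grad(A, b, x)
        x = hard_threshold([x[j] - eta * g[j] for j in range(len(x))], eta * lam)
    return x

def objective(A, b, x, lam=0.1):
    r = [sum(A[i][j] * x[j] for j in range(len(x))) - b[i] for i in range(len(b))]
    return 0.5 * sum(ri * ri for ri in r) + lam * sum(1 for xj in x if xj != 0.0)

# hypothetical data: identity A makes the problem separable and easy to inspect
A = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
b = [1.0, 0.01, -2.0]
x = prox_gradient(A, b)
```

The tiny middle entry of b is zeroed out by the penalty while the two large entries are recovered, which is exactly the sparsity-inducing behaviour such regularizers are used for.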

VA & Opt Webinar: Oliver Stein (KIT)

Title: A general branch-and-bound framework for global multiobjective optimization

Speaker: Oliver Stein (KIT)

Date and Time: July 22nd, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We develop a general framework for branch-and-bound methods in multiobjective optimization. Our focus is on natural generalizations of notions and techniques from the single objective case. In particular, after the notions of upper and lower bounds on the globally optimal value from the single objective case have been transferred to upper and lower bounding sets on the set of nondominated points for multiobjective programs, we discuss several possibilities for discarding tests. They compare local upper bounds of the provisional nondominated sets with relaxations of partial upper image sets, where the latter can stem from ideal point estimates, from convex relaxations, or from relaxations by a reformulation-linearization technique.

The discussion of approximation properties of the provisional nondominated set leads to the suggestion for a natural selection rule along with a natural termination criterion. Finally we discuss some issues which do not occur in the single objective case and which impede some desirable convergence properties, thus also motivating a natural generalization of the convergence concept.

This is joint work with Gabriele Eichfelder, Peter Kirst, and Laura Meng.
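The provisional nondominated set mentioned in the abstract can be maintained with a simple dominance filter; a minimal sketch for minimization problems, with made-up objective vectors:

```python
def dominates(p, q):
    # p dominates q (minimization): no worse in every objective, better in one
    return all(pi <= qi for pi, qi in zip(p, q)) and \
           any(pi < qi for pi, qi in zip(p, q))

def nondominated(points):
    # keep only the points not dominated by any other point
    return [p for p in points if not any(dominates(q, p) for q in points)]

front = nondominated([(1, 3), (2, 2), (3, 1), (2, 3), (3, 3)])
```

In a branch-and-bound method, lower bounds on a partial upper image set (e.g. from an ideal-point estimate of a box) are compared against such a provisional front in the discarding tests described above.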

VA & Opt Webinar: James Saunderson (Monash)

Title: Lifting for simplicity: concise descriptions of convex sets

Speaker: James Saunderson (Monash University)

Date and Time: July 15th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: This talk will give a selective tour through the theory and applications of lifts of convex sets. A lift of a convex set is a higher-dimensional convex set that projects onto the original set. Many interesting convex sets have lifts that are dramatically simpler to describe than the original set. Finding such simple lifts has significant algorithmic implications, particularly for associated optimization problems. We will consider both the classical case of polyhedral lifts, which are described by linear inequalities, as well as spectrahedral lifts, which are defined by linear matrix inequalities. The tour will include discussion of ways to construct lifts, ways to find obstructions to the existence of lifts, and a number of interesting examples from a variety of mathematical contexts. (Based on joint work with H. Fawzi, J. Gouveia, P. Parrilo, and R. Thomas).
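A classical example of the phenomenon described above: the cross-polytope (the ℓ1 unit ball) needs exponentially many facet inequalities, yet has a small polyhedral lift,

```latex
B_1^n \;=\; \Bigl\{x \in \mathbb{R}^n : \sigma^{\mathsf T} x \le 1
\ \text{for all } \sigma \in \{-1,+1\}^n\Bigr\}
\;=\; \pi_x\Bigl\{(x,y) : -y_i \le x_i \le y_i \ (i=1,\dots,n),\
\textstyle\sum_{i=1}^n y_i \le 1\Bigr\},
```

where π_x denotes projection onto the x-coordinates: 2^n inequalities in the original description versus 2n + 1 in the lift.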

UNSW Seminar: Tiangang Cui (Monash)

Title: Tensorised Rosenblatt Transport for High-Dimensional Stochastic Computation

Speaker: Tiangang Cui (Monash University)

Date: Tue, 07/07/2020 – 11:05am

Venue: Zoom meeting (connection details here)

Abstract: 

Characterising intractable high-dimensional random variables is one of the fundamental challenges in stochastic computation. It has broad applications in statistical physics, machine learning, uncertainty quantification, econometrics, and beyond. The recent surge of transport maps offers a mathematical foundation and new insights for tackling this challenge.

In this talk, we present a functional tensor-train (FTT) based monotonicity-preserving construction of inverse Rosenblatt transport in high dimensions. It characterises intractable random variables via couplings with tractable reference random variables. By integrating our FTT-based approach into a nested approximation framework inspired by deep neural networks, we are able to significantly expand its capability to random variables with complicated nonlinear interactions and concentrated density functions. We demonstrate the efficacy of the FTT-based inverse Rosenblatt transport on a range of applications in statistical learning and uncertainty quantification, including parameter estimation for dynamical systems, PDE-constrained inverse problems, and Bayesian filtering.

This is joint work with Dr. Sergey Dolgov (Bath) and Mr. Yiran Zhao (Monash).
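In low dimensions the inverse Rosenblatt transport can be written in closed form, which may help fix ideas before the tensorised high-dimensional construction. For the hypothetical density p(x1, x2) = x1 + x2 on the unit square (which integrates to 1), pushing uniform variables through the inverse marginal and conditional CDFs gives samples from p:

```python
import math

def rosenblatt_inverse(u1, u2):
    """Inverse Rosenblatt transport for p(x1, x2) = x1 + x2 on [0, 1]^2.
    Maps (u1, u2) in [0, 1]^2, e.g. independent uniforms, to a sample of p."""
    # marginal: p1(x1) = x1 + 1/2, CDF F1(x) = x^2/2 + x/2; invert the quadratic
    x1 = (-1.0 + math.sqrt(1.0 + 8.0 * u1)) / 2.0
    # conditional CDF F(x2 | x1) = (x1*x2 + x2^2/2) / (x1 + 1/2); invert likewise
    x2 = -x1 + math.sqrt(x1 * x1 + 2.0 * u2 * (x1 + 0.5))
    return x1, x2
```

Feeding independent Uniform(0, 1) draws through `rosenblatt_inverse` produces samples with the target density; in higher dimensions this triangular structure is exactly what the FTT approximation makes tractable.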

VA & Opt Webinar: Hoa Bui (Curtin University)

Title: Zero Duality Gap Conditions via Abstract Convexity.

Speaker: Hoa Bui (Curtin University)

Date and Time: July 8th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Using tools provided by the theory of abstract convexity, we extend conditions for zero duality gap to the context of nonconvex and nonsmooth optimization. In place of the classical setting, an abstract convex function is defined as the upper envelope of a subset of a family of abstract affine functions (conventional vertical translations of the abstract linear functions). We establish new characterizations of the zero duality gap under no assumptions on the topology on the space of abstract linear functions. Endowing the latter space with the topology of pointwise convergence, we extend several fundamental facts of conventional convex analysis. In particular, we prove that the zero duality gap property can be stated in terms of an inclusion involving 𝜀-subdifferentials, which are shown to possess a sum rule. These conditions are new even in conventional convex cases. The Banach-Alaoglu-Bourbaki theorem is extended to the space of abstract linear functions. The latter result extends a fact recently established by Borwein, Burachik and Yao in the conventional convex case.

This talk is based on a joint work with Regina Burachik, Alex Kruger and David Yost.
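For orientation, the basic notion from abstract convexity used above can be stated as follows (a standard formulation, in our paraphrase): given a fixed family L of "abstract linear" functions on X, with abstract affine functions being the translates l + c,

```latex
f \text{ is abstract convex w.r.t.\ } L
\iff
f(x) \;=\; \sup\bigl\{\, l(x) + c \;:\; l \in L,\ c \in \mathbb{R},\ l + c \le f \,\bigr\}
\quad \text{for all } x \in X .
```

Taking L to be the continuous linear functionals recovers the classical fact that a proper lower semicontinuous convex function is the upper envelope of its affine minorants.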

VA & Opt Webinar: Marián Fabian (Czech Academy of Sciences, Prague)

Title: Can Pourciau’s open mapping theorem be derived from Clarke’s inverse mapping theorem?

Speaker: Marián Fabian (Math Institute of Czech Academy of Sciences, Prague)

Date and Time: July 1st, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We discuss the possibility of deriving Pourciau’s open mapping theorem from Clarke’s inverse mapping theorem. These theorems work with the Clarke generalized Jacobian. In our journey, we will face several interesting phenomena and pitfalls in the world of (just) 2 by 3 matrices.
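For context, the two theorems being compared can be stated roughly as follows (standard formulations; see the original papers for precise hypotheses). For a locally Lipschitz f, the Clarke generalized Jacobian at x̄ is

```latex
\partial f(\bar x) \;=\; \operatorname{conv}\bigl\{\, \lim_{k\to\infty} \nabla f(x_k)
\;:\; x_k \to \bar x,\ f \text{ differentiable at } x_k \,\bigr\}.
```

Clarke's inverse mapping theorem (f : ℝⁿ → ℝⁿ): if every matrix in ∂f(x̄) is nonsingular, then f is a local Lipschitz homeomorphism around x̄. Pourciau's open mapping theorem (f : ℝⁿ → ℝᵐ, n ≥ m): if every matrix in ∂f(x̄) has full rank m, then f is open at x̄. The question of the talk is whether the second can be obtained from the first, already nontrivial for 2 × 3 matrices.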
