VA & Opt Webinar: Oliver Stein (KIT)

Title: A general branch-and-bound framework for global multiobjective optimization

Speaker: Oliver Stein (KIT)

Date and Time: July 22nd, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We develop a general framework for branch-and-bound methods in multiobjective optimization. Our focus is on natural generalizations of notions and techniques from the single objective case. In particular, after the notions of upper and lower bounds on the globally optimal value from the single objective case have been transferred to upper and lower bounding sets on the set of nondominated points for multiobjective programs, we discuss several possibilities for discarding tests. They compare local upper bounds of the provisional nondominated sets with relaxations of partial upper image sets, where the latter can stem from ideal point estimates, from convex relaxations, or from relaxations by a reformulation-linearization technique.

The discussion of approximation properties of the provisional nondominated set leads to the suggestion of a natural selection rule along with a natural termination criterion. Finally, we discuss some issues which do not occur in the single objective case and which impede some desirable convergence properties, thus also motivating a natural generalization of the convergence concept.

This is joint work with Gabriele Eichfelder, Peter Kirst, and Laura Meng.
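
For readers who want a concrete picture of such a framework, the following Python sketch (not the authors' algorithm) mimics its basic ingredients on a toy biobjective box-constrained problem: boxes are bisected, a crude ideal-point estimate over each box serves as a lower bound, and a provisional nondominated set of objective vectors acts as the upper bounding set used in the discarding test. The objective function, the sampling-based bound and all tolerances are placeholders for illustration only.

    import numpy as np

    def f(x):  # toy biobjective objective, chosen only for illustration
        return np.array([x[0]**2 + x[1]**2, (x[0] - 1.0)**2 + (x[1] + 1.0)**2])

    def ideal_point_estimate(lo, hi, samples=64):
        # crude component-wise lower estimate of f over the box [lo, hi];
        # a rigorous implementation would use interval arithmetic or convex relaxations
        rng = np.random.default_rng(0)
        pts = rng.uniform(lo, hi, size=(samples, len(lo)))
        return np.array([f(p) for p in pts]).min(axis=0) - 0.1 * np.linalg.norm(hi - lo)

    def dominated(y, archive):
        # True if some archived objective vector is component-wise <= y
        return any(np.all(a <= y) for a in archive)

    def branch_and_bound(lo, hi, max_width=0.05):
        archive = []                              # provisional nondominated set (upper bounding set)
        work = [(np.asarray(lo, float), np.asarray(hi, float))]
        while work:
            lo_b, hi_b = work.pop()
            if dominated(ideal_point_estimate(lo_b, hi_b), archive):
                continue                          # discarding test
            mid = 0.5 * (lo_b + hi_b)
            y = f(mid)                            # feasible point: candidate for the upper bounding set
            if not dominated(y, archive):
                archive = [a for a in archive if not np.all(y <= a)] + [y]
            if np.max(hi_b - lo_b) > max_width:   # selection/termination governed by box width
                k = int(np.argmax(hi_b - lo_b))   # bisect the longest edge
                left_hi, right_lo = hi_b.copy(), lo_b.copy()
                left_hi[k] = mid[k]
                right_lo[k] = mid[k]
                work += [(lo_b, left_hi), (right_lo, hi_b)]
        return archive

    print(len(branch_and_bound([-2.0, -2.0], [2.0, 2.0])), "provisionally nondominated points")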

VA & Opt Webinar: James Saunderson (Monash)

Title: Lifting for simplicity: concise descriptions of convex sets

Speaker: James Saunderson (Monash University)

Date and Time: July 15th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: This talk will give a selective tour through the theory and applications of lifts of convex sets. A lift of a convex set is a higher-dimensional convex set that projects onto the original set. Many interesting convex sets have lifts that are dramatically simpler to describe than the original set. Finding such simple lifts has significant algorithmic implications, particularly for associated optimization problems. We will consider both the classical case of polyhedral lifts, which are described by linear inequalities, as well as spectrahedral lifts, which are defined by linear matrix inequalities. The tour will include discussion of ways to construct lifts, ways to find obstructions to the existence of lifts, and a number of interesting examples from a variety of mathematical contexts. (Based on joint work with H. Fawzi, J. Gouveia, P. Parrilo, and R. Thomas).
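
To fix ideas, here is a standard textbook example of a polyhedral lift (it may or may not appear in the talk). The cross-polytope
\[
B_1^n=\Bigl\{x\in\mathbb{R}^n:\ \textstyle\sum_{i=1}^n|x_i|\le 1\Bigr\}
\]
needs $2^n$ linear inequalities when described in the variables $x$ alone, but it is the projection onto the $x$-coordinates of
\[
Q=\Bigl\{(x,y)\in\mathbb{R}^n\times\mathbb{R}^n:\ -y_i\le x_i\le y_i\ (i=1,\dots,n),\ \textstyle\sum_{i=1}^n y_i\le 1\Bigr\},
\]
which uses only $2n+1$ inequalities; linear optimization over $B_1^n$ can therefore be carried out with a linear program whose size grows linearly in $n$.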

UNSW Seminar: Tiangang Cui (Monash)

Title: Tensorised Rosenblatt Transport for High-Dimensional Stochastic Computation

Speaker: Tiangang Cui (Monash University)

Date: Tue, 07/07/2020 – 11:05am

Venue: Zoom meeting (connection details here)

Abstract: 

Characterising intractable high-dimensional random variables is one of the fundamental challenges in stochastic computation. It has broad applications in statistical physics, machine learning, uncertainty quantification, econometrics, and beyond. The recent surge of interest in transport maps offers a mathematical foundation and new insights for tackling this challenge.

In this talk, we present a functional tensor-train (FTT) based monotonicity-preserving construction of inverse Rosenblatt transport in high dimensions. It characterises intractable random variables via couplings with tractable reference random variables. By integrating our FTT-based approach into a nested approximation framework inspired by deep neural networks, we are able to significantly expand its capability to random variables with complicated nonlinear interactions and concentrated density functions. We demonstrate the efficacy of the FTT-based inverse Rosenblatt transport on a range of applications in statistical learning and uncertainty quantification, including parameter estimation for dynamical systems, PDE-constrained inverse problems, and Bayesian filtering.

This is joint work with Dr. Sergey Dolgov (Bath) and Mr. Yiran Zhao (Monash).
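
As a rough illustration of the object being approximated (not of the functional tensor-train construction itself), the sketch below builds a plain inverse Rosenblatt transport on a two-dimensional grid: uniform reference samples are pushed forward to samples of an unnormalised target density by inverting, first, the marginal CDF of the first coordinate and, then, the conditional CDF of the second. The banana-shaped density and all grid sizes are placeholders.

    # Minimal sketch of an inverse Rosenblatt transport on a 2-D tensor grid.
    # This is NOT the functional tensor-train construction of the talk; it only
    # illustrates the map being approximated: uniform reference samples are
    # pushed to samples of an unnormalised target density by inverting a
    # marginal CDF and then a conditional CDF.
    import numpy as np

    def inverse_rosenblatt_2d(unnorm_pdf, grid, uniforms):
        P = unnorm_pdf(grid[:, None], grid[None, :])      # density values on the grid
        cdf1 = np.cumsum(P.sum(axis=1))                   # discrete marginal CDF of the 1st coordinate
        cdf1 = cdf1 / cdf1[-1]
        out = np.empty_like(uniforms)
        last = len(grid) - 1
        for k, (u1, u2) in enumerate(uniforms):
            i = min(np.searchsorted(cdf1, u1), last)      # invert the marginal CDF
            cdf2 = np.cumsum(P[i, :])                     # conditional CDF of the 2nd coordinate given grid[i]
            cdf2 = cdf2 / cdf2[-1]
            j = min(np.searchsorted(cdf2, u2), last)      # invert the conditional CDF
            out[k] = (grid[i], grid[j])
        return out

    # usage with a banana-shaped target density (placeholder example)
    banana = lambda a, b: np.exp(-0.5 * (a**2 + (b - a**2)**2))
    grid = np.linspace(-4.0, 10.0, 500)
    u = np.random.default_rng(1).uniform(size=(2000, 2))
    samples = inverse_rosenblatt_2d(banana, grid, u)
    print("sample mean:", samples.mean(axis=0))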

VA & Opt Webinar: Hoa Bui (Curtin University)

Title: Zero Duality Gap Conditions via Abstract Convexity.

Speaker: Hoa Bui (Curtin University)

Date and Time: July 8th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Using tools provided by the theory of abstract convexity, we extend conditions for zero duality gap to the context of nonconvex and nonsmooth optimization. In place of the classical setting, an abstract convex function is the upper envelope of a subset of a family of abstract affine functions (conventional vertical translations of the abstract linear functions). We establish new characterizations of the zero duality gap under no assumptions on the topology of the space of abstract linear functions. Endowing the latter space with the topology of pointwise convergence, we extend several fundamental facts of conventional convex analysis. In particular, we prove that the zero duality gap property can be stated in terms of an inclusion involving 𝜀-subdifferentials, which are shown to possess a sum rule. These conditions are new even in the conventional convex case. The Banach-Alaoglu-Bourbaki theorem is extended to the space of abstract linear functions, extending a fact recently established by Borwein, Burachik and Yao in the conventional convex case.

This talk is based on a joint work with Regina Burachik, Alex Kruger and David Yost.
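
For orientation, recall the standard abstract-convexity vocabulary behind the abstract above (notation may differ from the speaker's). Given a set $X$ and a family $L$ of abstract linear functions $l:X\to\mathbb{R}$, the $L$-conjugate, second conjugate and $\varepsilon$-subdifferential of $f:X\to\mathbb{R}\cup\{+\infty\}$ are
\[
f^{L}(l)=\sup_{x\in X}\bigl(l(x)-f(x)\bigr),\qquad
f^{LL}(x)=\sup_{l\in L}\bigl(l(x)-f^{L}(l)\bigr),
\]
\[
\partial^{L}_{\varepsilon}f(x)=\bigl\{\,l\in L:\ f(y)-f(x)\ge l(y)-l(x)-\varepsilon\ \ \text{for all }y\in X\,\bigr\},
\]
and $f$ is abstract convex exactly when it is the upper envelope of its abstract affine minorants, equivalently $f=f^{LL}$. The characterizations of the zero duality gap mentioned above are inclusions formulated in terms of these $\varepsilon$-subdifferentials.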

VA & Opt Webinar: Marián Fabian (Czech Academy of Sciences, Prague)

Title: Can Pourciau’s open mapping theorem be derived from Clarke’s inverse mapping theorem?

Speaker: Marián Fabian (Math Institute of Czech Academy of Sciences, Prague)

Date and Time: July 1st, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We discuss the possibility of deriving Pourciau’s open mapping theorem from Clarke’s inverse mapping theorem. These theorems work with the Clarke generalized Jacobian. In our journey, we will face several interesting phenomena and pitfalls in the world of (just) 2 by 3 matrices.
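
For readers less familiar with the objects involved, the statements in question are, roughly (one common formulation; see the talk for precise hypotheses): for a map $F:\mathbb{R}^n\to\mathbb{R}^m$ that is Lipschitz near $\bar x$, the Clarke generalized Jacobian is
\[
\partial F(\bar x)=\operatorname{conv}\Bigl\{\lim_{k\to\infty}F'(x_k):\ x_k\to\bar x,\ F\text{ differentiable at }x_k\Bigr\}\subset\mathbb{R}^{m\times n}.
\]
Clarke's inverse mapping theorem ($m=n$) says that if every matrix in $\partial F(\bar x)$ is nonsingular, then $F$ is locally invertible around $\bar x$ with a Lipschitz inverse; Pourciau's open mapping theorem ($m\le n$) says that if every matrix in $\partial F(\bar x)$ has full rank $m$, then $F$ is open at $\bar x$. The "2 by 3 matrices" of the abstract correspond to the smallest genuinely nonsquare case, $m=2$, $n=3$.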

VA & Opt Webinar: Marco A. López-Cerdá (Alicante)

Title: Optimality conditions in convex semi-infinite optimization. An approach based on the subdifferential of the supremum function.

Speaker: Marco A. López-Cerdá (Alicante University)

Date and Time: June 24th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We present a survey of optimality conditions (of Fritz-John and KKT type) for semi-infinite convex optimization problems. The methodology is based on the subdifferential of the supremum of the infinite family of constraint functions. Our approach aims to establish weak constraint qualifications and, in the last step, to drop the usual continuity/closedness assumptions which are standard in the literature. The material in this survey is extracted from the following papers:

R. Correa, A. Hantoute, M. A. López, Weaker conditions for subdifferential calculus of convex functions. J. Funct. Anal. 271 (2016), 1177-1212.

R. Correa, A. Hantoute, M. A. López, Moreau-Rockafellar type formulas for the subdifferential of the supremum function. SIAM J. Optim. 29 (2019), 1106-1130.

R. Correa, A. Hantoute, M. A. López, Valadier-like formulas for the supremum function II: the compactly indexed case. J. Convex Anal. 26 (2019), 299-324.

R. Correa, A. Hantoute, M. A. López, Subdifferential of the supremum via compactification of the index set. To appear in Vietnam J. Math. (2020).
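
Schematically, the setting of the survey above is the convex semi-infinite program
\[
\min_{x\in\mathbb{R}^n} f_0(x)\quad\text{subject to}\quad f_t(x)\le 0\ \ \text{for all }t\in T,
\]
with $T$ an arbitrary (possibly infinite) index set and all functions convex, together with the supremum function $f(x)=\sup_{t\in T}f_t(x)$, so that the feasible set is $\{x:f(x)\le 0\}$. KKT-type conditions at a feasible point $\bar x$ are then typically of the form
\[
0\in\partial f_0(\bar x)+\operatorname{cone}\Bigl(\textstyle\bigcup_{t\in T(\bar x)}\partial f_t(\bar x)\Bigr),\qquad T(\bar x)=\{t\in T:\ f_t(\bar x)=0\},
\]
possibly with closures added, and their validity under weak constraint qualifications rests on formulas for $\partial f(\bar x)$ of the kind developed in the papers listed above. (This is only a schematic reminder, not a precise statement from those papers.)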

VA & Opt Webinar: Michel Théra (Limoges & Fed Uni)

Title: Old and new results on equilibrium and quasi-equilibrium problems

Speaker: Michel Théra (Professeur Emérite, Université de Limoges, France and Adjunct Professor Federation University Australia)

Dates and Time: June 17th, 2020, 17:00 AEST. (Register here for remote connection via Zoom)

Abstract: In this talk I will briefly survey some old results going back to Ky Fan and to Brézis, Nirenberg and Stampacchia. Then I will give some new results related to the existence of solutions to equilibrium and quasi-equilibrium problems without any convexity assumption. Coverage includes some equivalences to the Ekeland variational principle for bifunctions and basic facts about transfer lower continuity. An application is given to systems of quasi-equilibrium problems.
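
For reference, the problems in question are usually formulated as follows (standard formulation, not taken verbatim from the talk). Given a nonempty set $C$ and a bifunction $f:C\times C\to\mathbb{R}$, the equilibrium problem in the sense of Ky Fan is
\[
\text{find }\bar x\in C\ \text{ such that }\ f(\bar x,y)\ge 0\quad\text{for all }y\in C,
\]
while in the quasi-equilibrium problem the constraint set depends on the point itself through a set-valued map $K:C\rightrightarrows C$:
\[
\text{find }\bar x\in K(\bar x)\ \text{ such that }\ f(\bar x,y)\ge 0\quad\text{for all }y\in K(\bar x).
\]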

VA & Opt Webinar: Tien-Son Pham (Uni of Dalat)

Title: Openness, Hölder metric regularity and Hölder continuity properties of semialgebraic set-valued maps

Speaker: Tiến-Sơn Phạm (Department of Mathematics, University of Dalat, Vietnam)

Date and Time: June 3rd, 2020, 17:00 AEST. (Register here for remote connection via Zoom)

Abstract: Given a semialgebraic set-valued map with closed graph, we show that it is Hölder metrically subregular and that the following conditions are equivalent:
(i) the map is an open map from its domain into its range and the range of the map is locally closed;
(ii) the map is Hölder metrically regular;
(iii) the inverse map is pseudo-Hölder continuous;
(iv) the inverse map is lower pseudo-Hölder continuous.
An application, via Robinson’s normal map formulation, leads to the following result in the context of semialgebraic variational inequalities: if the solution map (as a map of the parameter vector) is lower semicontinuous, then the solution map is finite and pseudo-Hölder continuous. In particular, we obtain a negative answer to a question mentioned in the paper of Dontchev and Rockafellar [Characterizations of strong regularity for variational inequalities over polyhedral convex sets. SIAM J. Optim., 6(4):1087–1105, 1996]. As a byproduct, we show that for a (not necessarily semialgebraic) continuous single-valued map, openness and non-extremality are equivalent. This fact improves the main result of Pühn [Convexity and openness with linear rate. J. Math. Anal. Appl., 227:382–395, 1998], which requires the convexity of the map in question.
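
Recall the definitions involved (one common convention; the talk's normalisation of exponents may differ). For a set-valued map $F:\mathbb{R}^n\rightrightarrows\mathbb{R}^m$, a point $(\bar x,\bar y)$ in its graph and an exponent $\alpha\in(0,1]$, $F$ is Hölder metrically subregular at $(\bar x,\bar y)$ if there are $\kappa>0$ and a neighbourhood $U$ of $\bar x$ such that
\[
d\bigl(x,F^{-1}(\bar y)\bigr)\le\kappa\,d\bigl(\bar y,F(x)\bigr)^{\alpha}\qquad\text{for all }x\in U,
\]
and Hölder metrically regular at $(\bar x,\bar y)$ if the stronger estimate
\[
d\bigl(x,F^{-1}(y)\bigr)\le\kappa\,d\bigl(y,F(x)\bigr)^{\alpha}\qquad\text{for all }(x,y)\text{ near }(\bar x,\bar y)
\]
holds. For $\alpha=1$ these reduce to the usual metric subregularity and metric regularity.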

Monash Colloquium: Jon Chapman (Oxford)

Title: Asymptotics beyond all orders: the devil’s invention?

Speaker: Prof. S. Jon Chapman (Oxford)

Date and Time: 8:30 pm – 10:00 pm AEST, Thu., 14 May 2020.

Venue: Zoom (register here for connection details)

Abstract: The lecture will introduce the concept of an asymptotic series, showing how useful divergent series can be, despite Abel’s reservations. We will then discuss Stokes’ phenomenon, whereby the coefficients in the series appear to change discontinuously. We will show how understanding Stokes’ phenomenon is the key that allows us to determine the qualitative and quantitative behaviour of the solution in many practical problems. Examples will be drawn from the areas of surface waves on fluids, crystal growth, dislocation dynamics, localised pattern formation, and Hele-Shaw flow.
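
A standard illustration of the first part of the talk (not necessarily one of the lecture's own examples) is the Stieltjes integral
\[
I(x)=\int_0^{\infty}\frac{e^{-t}}{1+xt}\,dt\ \sim\ \sum_{n=0}^{\infty}(-1)^n\,n!\,x^{n}\qquad(x\to 0^{+}),
\]
whose asymptotic series has zero radius of convergence, yet truncating it near its smallest term, around $n\approx 1/x$, approximates $I(x)$ with an error that is exponentially small, of order $e^{-1/x}$. Stokes' phenomenon concerns precisely how such exponentially small contributions switch on and off as one crosses certain rays in the complex plane.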

UNSW Seminar: Matthew K. Tam (UniMelb)

Title: Splitting Algorithms for Training GANs

Speaker: Matthew Tam (University of Melbourne)

Date: Thu, 14/05/2020 – 11:05am

Venue: Zoom meeting (connection details here)

Abstract: Generative adversarial networks (GANs) are an approach to fitting generative models over complex structured spaces. Within this framework, the fitting problem is posed as a zero-sum game between two competing neural networks which are trained simultaneously. Mathematically, this problem takes the form of a saddle-point problem, a well-known example of the type of problem where the usual (stochastic) gradient descent-type approaches used for training neural networks fail. In this talk, we rectify this shortcoming by proposing a new method for training GANs that (i) has theoretical guarantees of convergence and (ii) does not increase the algorithm’s per-iteration complexity compared with gradient descent. The theoretical analysis is performed within the framework of monotone operator splitting.
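
The failure mode alluded to above can be seen on a two-line example; the script below (a toy sketch, not the method proposed in the talk) compares simultaneous gradient descent-ascent with the extragradient method, one of the simplest schemes coming out of the monotone operator splitting viewpoint, on the bilinear saddle-point problem min_x max_y xy.

    # Toy saddle-point problem: min over x, max over y of x*y (solution x = y = 0).
    # Simultaneous gradient descent-ascent spirals outward, while the extragradient
    # method converges; step sizes and iteration counts are arbitrary choices.
    import numpy as np

    def field(z):
        x, y = z
        return np.array([y, -x])   # (gradient in x, minus gradient in y) for f(x, y) = x*y

    def simultaneous_gda(z, step=0.1, iters=2000):
        for _ in range(iters):
            z = z - step * field(z)            # plain simultaneous update
        return z

    def extragradient(z, step=0.1, iters=2000):
        for _ in range(iters):
            z_half = z - step * field(z)       # prediction step
            z = z - step * field(z_half)       # correction evaluated at the predicted point
        return z

    z0 = np.array([1.0, 1.0])
    print("GDA distance to solution          :", np.linalg.norm(simultaneous_gda(z0)))
    print("extragradient distance to solution:", np.linalg.norm(extragradient(z0)))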
