VA & Opt Webinar: David Bartl

Title: Every compact convex subset of matrices is the Clarke Jacobian of some Lipschitzian mapping

Speaker: David Bartl (Silesian University in Opava)

Date and Time: March 17th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Given a non-empty compact convex subset P of m×n matrices, we show constructively that there exists a Lipschitzian mapping g : R^n → R^m such that its Clarke Jacobian satisfies ∂g(0) = P.
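
For orientation, a classical one-dimensional instance of this result (an illustration, not taken from the talk): with m = n = 1 and P = [-1, 1], the Lipschitzian function g(x) = |x| already realizes P, since the Clarke Jacobian (here, the Clarke subdifferential) at the origin is the convex hull of limits of nearby derivatives:

```latex
\partial g(0) \;=\; \operatorname{conv}\Bigl\{\, \lim_{k \to \infty} g'(x_k) \;:\; x_k \to 0,\ g \text{ differentiable at } x_k \,\Bigr\}
\;=\; \operatorname{conv}\{-1,\, 1\} \;=\; [-1,\, 1] \;=\; P .
```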

VA & Opt Webinar: Alexander Kruger

Title: Error bounds revisited

Speaker: Alexander Kruger (Federation University Australia)

Date and Time: March 10th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: We propose a unifying general framework of quantitative primal and dual sufficient error bound conditions covering linear and nonlinear, local and global settings. We expose the roles of the assumptions involved in the error bound assertions, in particular those on the underlying space: general metric, Banach or Asplund. Employing special collections of slope operators, we introduce a succinct form of sufficient error bound conditions, which allows one to combine in a single statement several different assertions: nonlocal and local primal space conditions in complete metric spaces, and subdifferential conditions in Banach and Asplund spaces. In the nonlinear setting, we cover both the conventional and the ‘alternative’ error bound conditions.

This is joint work with Nguyen Duy Cuong (Federation University). The talk is based on the paper: N. D. Cuong and A. Y. Kruger, Error bounds revisited, arXiv:2012.03941 (2020).
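
As a simple illustration of the linear case (a standard example, not from the paper): for f(x) = |x| on the real line, the solution set is S = {x : f(x) ≤ 0} = {0}, the global error bound d(x, S) ≤ c [f(x)]_+ holds with c = 1, and the slope of f equals 1 at every nonzero x, which is the primal sufficient condition at work in this setting:

```latex
d(x, S) = |x| = [f(x)]_+ \quad \text{for all } x \in \mathbb{R},
\qquad |\nabla f|(x) = 1 \;\ge\; \frac{1}{c} \quad \text{for all } x \neq 0 .
```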

VA & Opt Webinar: Javier Peña

Title: The condition number of a function relative to a set

Speaker: Javier Peña (Carnegie Mellon University)

Date and Time: March 3rd, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: The condition number of a differentiable convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is the square of the aspect ratio of a canonical ellipsoid associated to the function. Furthermore, the condition number of a function bounds the linear rate of convergence of the gradient descent algorithm for unconstrained convex minimization.
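
The quadratic case can be made concrete in a few lines of code. Below is a minimal sketch (the matrix, step size, and iteration count are illustrative choices, not from the talk): for f(x) = ½xᵀQx with Q positive definite, the smoothness and strong convexity constants are the extreme eigenvalues of Q, and gradient descent with step size 1/L contracts the error at the linear rate 1 − 1/κ.

```python
import numpy as np

# Minimal sketch (illustrative numbers): condition number of a convex
# quadratic f(x) = 0.5 * x^T Q x and the linear rate it predicts for
# gradient descent on unconstrained convex minimization.
Q = np.diag([1.0, 10.0])            # strong convexity mu = 1, smoothness L = 10
eigs = np.linalg.eigvalsh(Q)
mu, L = eigs[0], eigs[-1]
kappa = L / mu                      # condition number kappa = 10

x = np.array([1.0, 1.0])
for _ in range(50):
    x = x - (1.0 / L) * (Q @ x)     # gradient step with step size 1/L

# For this quadratic, ||x_k|| <= (1 - 1/kappa)**k * ||x_0||.
print(kappa, np.linalg.norm(x), (1.0 - 1.0 / kappa) ** 50 * np.sqrt(2.0))
```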

We propose a condition number of a differentiable convex function relative to a reference set and distance function pair. This relative condition number is defined as the ratio of a relative smoothness constant to a relative strong convexity constant. We show that the relative condition number extends the main properties of the traditional condition number, both in terms of its geometric insight and in terms of its role in characterizing the linear convergence of first-order methods for constrained convex minimization.

This is joint work with David H. Gutman at Texas Tech University.

VA & Opt Webinar: Nguyen Duy Cuong

Title: Necessary conditions for transversality properties

Speaker: Nguyen Duy Cuong (Federation University)

Date and Time: February 24th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Transversality properties of collections of sets play an important role in optimization and variational analysis, e.g., as constraint qualifications, qualification conditions in subdifferential, normal cone and coderivative calculus, and convergence analysis of computational algorithms. In this talk, we present some new results on primal (geometric, metric, slope) and dual (subdifferential, normal cone) necessary (in some cases also sufficient) conditions for transversality properties in both linear and nonlinear settings. Quantitative relations between transversality properties and the corresponding regularity properties of set-valued mappings are also discussed.
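
For orientation, one standard metric property in this family (a known definition, not one of the new results): subtransversality of a pair of closed sets {A, B} at a point x̄ ∈ A ∩ B asks for a constant c > 0 such that

```latex
d\bigl(x,\, A \cap B\bigr) \;\le\; c \,\max\bigl\{\, d(x, A),\; d(x, B) \,\bigr\}
\quad \text{for all } x \text{ near } \bar{x} .
```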

VA & Opt Webinar: Alexander J. Zaslavski

Title: Subgradient Projection Algorithm with Computational Errors

Speaker: Alexander J. Zaslavski (The Technion – Israel Institute of Technology)

Date and Time: February 17th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: We study the subgradient projection algorithm for minimization of convex nonsmooth functions in the presence of computational errors. We show that the algorithm generates a good approximate solution if the computational errors are bounded from above by a small positive constant. Moreover, for a known bound on the computational error, we determine what approximate solution can be obtained and how many iterates are needed to obtain it.
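
A minimal numerical sketch of the setting (a toy instance, not Zaslavski's construction): the projected subgradient method for the nonsmooth convex function f(x) = ‖x‖₁ over the Euclidean unit ball, with an additive error of norm eps injected into every subgradient evaluation.

```python
import numpy as np

# Minimal sketch (toy instance): projected subgradient method for
# f(x) = ||x||_1 over the unit ball, with bounded computational errors.
rng = np.random.default_rng(0)
eps = 1e-3                                   # bound on the computational error

def proj(x):                                 # projection onto {x : ||x|| <= 1}
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x = proj(np.array([3.0, -2.0]))
for k in range(1, 2001):
    g = np.sign(x)                           # a subgradient of ||.||_1 at x
    d = rng.standard_normal(x.shape)
    e = eps * d / np.linalg.norm(d)          # computational error, ||e|| = eps
    x = proj(x - (g + e) / np.sqrt(k))       # diminishing step sizes 1/sqrt(k)

# f(x) is driven toward the minimum f* = 0, with the achievable accuracy
# limited by eps and by the remaining step size.
print(np.abs(x).sum())
```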

VA & Opt Webinar: Nam Ho-Nguyen

Title: Coordinate Descent Without Coordinates: Tangent Subspace Descent on Riemannian Manifolds

Speaker: Nam Ho-Nguyen (University of Sydney)

Date and Time: February 10th, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: We consider an extension of the coordinate descent algorithm to manifold domains, and provide convergence analyses for geodesically convex and non-convex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. Hence, our method is called tangent subspace descent (TSD). The core principle behind ensuring convergence of TSD is the appropriate choice of subspace at each iteration. To this end, we propose two novel conditions: the gap ensuring and C-randomized norm conditions on deterministic and randomized modes of subspace selection respectively. These ensure convergence for smooth functions, and are satisfied in practical contexts. We propose two subspace selection rules of particular practical interest that satisfy these conditions: a deterministic one for the manifold of square orthogonal matrices, and a randomized one for the more general Stiefel manifold. (This is joint work with David Huckleberry Gutman, Texas Tech University.)
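
A minimal sketch of the idea on the simplest curved manifold (a simplification for illustration, not the TSD method of the paper): minimizing f(x) = xᵀAx on the unit sphere, where at each iteration the Riemannian gradient is restricted to a randomly drawn one-dimensional tangent subspace, the analogue of a coordinate block, and the step is retracted back to the sphere.

```python
import numpy as np

# Minimal sketch (illustrative simplification): random tangent-subspace
# descent for f(x) = x^T A x on the unit sphere; the minimum value is the
# smallest eigenvalue of A.
rng = np.random.default_rng(1)
A = np.diag([1.0, 2.0, 5.0])
x = np.ones(3) / np.sqrt(3.0)

for _ in range(500):
    egrad = 2.0 * A @ x
    rgrad = egrad - (x @ egrad) * x          # project onto tangent space at x
    u = rng.standard_normal(3)
    u -= (x @ u) * x                         # random tangent direction at x
    u /= np.linalg.norm(u)
    v = (rgrad @ u) * u                      # restrict gradient to span{u}
    y = x - 0.1 * v                          # step within the subspace
    x = y / np.linalg.norm(y)                # retract back to the sphere

# f(x) should approach the smallest eigenvalue of A (here 1.0).
print(x @ A @ x)
```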

Seminar Series on Computational Mathematics

Dear MoCaO members,

We want to inform you about the launch of the Australian Seminar on Computational Mathematics.

You can visit the seminar webpage (main menu, CM Webinar), where you can find details of the scheduled talks and instructions for:

1) livestream access to the talks, and

2) subscription to the mailing list that will be used to announce new talks.

The inaugural talk will take place on Tuesday, February 9th, at 5pm AEDT, and will be delivered by Alexandre Ern (Université Paris-Est, CERMICS, ENPC).

VA & Opt Webinar: Ernest Ryu

Title: Scaled Relative Graph: Nonexpansive operators via 2D Euclidean Geometry

Speaker: Ernest Ryu (Seoul National University)

Date and Time: November 25th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Many iterative methods in applied mathematics can be thought of as fixed-point iterations, and such algorithms are usually analyzed analytically, with inequalities. In this work, we present a geometric approach to analyzing contractive and nonexpansive fixed point iterations with a new tool called the scaled relative graph (SRG). The SRG provides a rigorous correspondence between nonlinear operators and subsets of the 2D plane. Under this framework, a geometric argument in the 2D plane becomes a rigorous proof of contractiveness of the corresponding operator.
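
The correspondence can be visualized numerically. Below is a minimal sketch (the operator and sample sizes are illustrative choices): sample pairs (x, y) and, for each pair, record the complex number whose modulus is ‖Ax − Ay‖/‖x − y‖ and whose argument is the angle between Ax − Ay and x − y; for a nonexpansive A, every sample lies in the closed unit disk.

```python
import numpy as np

# Minimal sketch (illustrative operator): sampling points of the scaled
# relative graph (SRG). A is nonexpansive, being the 1/2-averaged map built
# from the projection onto the box [-1, 1]^3, so all samples satisfy |z| <= 1.
rng = np.random.default_rng(2)

def A(x):
    return 0.5 * x + 0.5 * np.clip(x, -1.0, 1.0)

pts = []
for _ in range(2000):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    dx, dA = x - y, A(x) - A(y)
    r = np.linalg.norm(dA) / np.linalg.norm(dx)
    c = dA @ dx / (np.linalg.norm(dA) * np.linalg.norm(dx) + 1e-16)
    theta = np.arccos(np.clip(c, -1.0, 1.0))
    pts.append(r * np.exp(1j * theta))       # one SRG sample per pair (x, y)

print(max(abs(z) for z in pts))              # <= 1 for a nonexpansive operator
```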

VA & Opt Webinar: Aram Arutyunov & S.E. Zhukovskiy

Title: Local and Global Inverse and Implicit Function Theorems

Speaker: Aram Arutyunov (Moscow State University) & S.E. Zhukovskiy (V. A. Trapeznikov Institute of Control Sciences of RAS)

Date and Time: November 18th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In the talk, we present a local inverse function theorem on a cone in a neighbourhood of an abnormal point. We present a global inverse function theorem in the form of a theorem on a trivial bundle, guaranteeing that if a smooth mapping of finite-dimensional spaces is uniformly nonsingular, then it has a smooth right inverse satisfying an a priori estimate. We also present a global implicit function theorem guaranteeing the existence and continuity of a global implicit function under the condition that the mappings in question are uniformly nonsingular. The generalization of these results to mappings of Hilbert and Banach spaces is also discussed.
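
A classical local example of the implicit function mechanism (an illustration, not from the talk): for F(x, y) = x² + y² − 1 at the point (0, 1), the nonsingularity condition holds and the theorem delivers the implicit function explicitly,

```latex
F(0, 1) = 0, \qquad \frac{\partial F}{\partial y}(0, 1) = 2 \neq 0
\;\Longrightarrow\; y(x) = \sqrt{1 - x^2} \quad \text{near } x = 0 ;
```

the global theorems of the talk instead require nonsingularity to hold uniformly, not merely at a single point.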

VA & Opt Webinar: Vinesha Peiris

Title: The extension of the linear inequality method for generalised rational Chebyshev approximation

Speaker: Vinesha Peiris (Swinburne)

Date and Time: November 11th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In this talk we will demonstrate the correspondence between the linear inequality method developed for rational Chebyshev approximation and the bisection method used in quasiconvex optimisation. It naturally connects rational and generalised rational Chebyshev approximation problems with modern developments in the area of quasiconvex functions. Moreover, the linear inequality method can be extended to a broader class of Chebyshev approximation problems, where the corresponding objective functions remain quasiconvex.
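
The mechanism behind the correspondence can be sketched in a few lines (a toy instance; the target function, degrees, and grid are arbitrary choices): for a fixed level t, the constraint |f(x_i) − p(x_i)/q(x_i)| ≤ t with q(x_i) > 0 is equivalent to linear inequalities in the coefficients of p and q, so bisection on t, exactly as in quasiconvex optimisation, reduces the Chebyshev problem to a sequence of linear feasibility checks.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal sketch (toy instance): rational Chebyshev approximation of
# f(x) = exp(x) on a grid by p/q with deg p = 2, deg q = 1, via bisection
# on the deviation level t and LP feasibility of the linear inequalities.
xs = np.linspace(-1.0, 1.0, 40)
f = np.exp(xs)
P = np.vander(xs, 3)               # columns of p: x^2, x, 1
Q = np.vander(xs, 2)               # columns of q: x, 1

def feasible(t):
    # Variables z = (p-coeffs, q-coeffs), in decreasing powers. Constraints:
    #   (f_i - t) q_i - p_i <= 0,  p_i - (f_i + t) q_i <= 0,  q_i >= 1,
    # the last being a harmless normalization keeping q positive on the grid.
    A1 = np.hstack([-P, (f - t)[:, None] * Q])
    A2 = np.hstack([P, (-f - t)[:, None] * Q])
    A3 = np.hstack([np.zeros_like(P), -Q])
    A = np.vstack([A1, A2, A3])
    b = np.concatenate([np.zeros(2 * len(xs)), -np.ones(len(xs))])
    res = linprog(np.zeros(5), A_ub=A, b_ub=b, bounds=[(None, None)] * 5)
    return res.status == 0         # 0: feasible point found, 2: infeasible

lo, hi = 0.0, float(np.max(np.abs(f)))       # bracket of the optimal deviation
for _ in range(40):
    mid = 0.5 * (lo + hi)
    lo, hi = (lo, mid) if feasible(mid) else (mid, hi)
print(hi)                                    # near-optimal maximum deviation
```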
