VA & Opt Webinar: Shawn Wang

Title: Roots of the identity operator and proximal mappings: (classical and phantom) cycles and gap vectors

Speaker: Shawn Wang (The University of British Columbia)

Date and Time: Wed Mar 23 2022, 11:00 AEST (Register here for remote connection via Zoom)

Abstract:

Recently, Simons provided a lemma for the support function of a closed convex set in a general Hilbert space and used it to prove the geometry conjecture on cycles of projections. We extend Simons’s lemma to closed convex functions, show its connections to Attouch-Théra duality, and use it to characterize (classical and phantom) cycles and gap vectors of proximal mappings. Joint work with H. Bauschke.
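
To make the notions of cycle and gap vector concrete, here is a minimal Python sketch (a toy illustration of my own, not the Hilbert-space setting or the results of the talk): for two disjoint discs in the plane, alternating projections settle into a cycle (a, b) with a = P_A(b) and b = P_B(a), and the difference b - a is the gap vector. The function name proj_disc and the particular discs are purely illustrative.

```python
import numpy as np

# Toy example: two disjoint discs in R^2. Alternating projections converge to
# a cycle (a, b); the difference g = b - a is the gap vector between the sets.

def proj_disc(x, centre, radius):
    """Projection onto the closed disc with the given centre and radius."""
    d = x - centre
    n = np.linalg.norm(d)
    return x if n <= radius else centre + radius * d / n

A = dict(centre=np.array([0.0, 0.0]), radius=1.0)
B = dict(centre=np.array([4.0, 3.0]), radius=1.0)

b = np.array([10.0, -5.0])          # arbitrary starting point
for _ in range(200):
    a = proj_disc(b, **A)           # a = P_A(b)
    b = proj_disc(a, **B)           # b = P_B(a)

gap = b - a
print("cycle:", a, b, "gap vector:", gap, "gap length:", np.linalg.norm(gap))
```

Here the centres are distance 5 apart and each radius is 1, so the printed gap length approaches 3, as expected.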

VA & Opt Webinar: Janosch Rieger

Title: Generalized Gearhart-Koshy acceleration for the Kaczmarz method

Speaker: Janosch Rieger (Monash University)

Date and Time: Wed Mar 16 2022, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

The Kaczmarz method is an iterative numerical method for solving large and sparse rectangular systems of linear equations. Gearhart and Koshy have developed an acceleration technique for the Kaczmarz method for homogeneous linear systems that minimises the distance to the desired solution in the direction of a full Kaczmarz step. Matthew Tam has recently generalised this acceleration technique to inhomogeneous linear systems.

In this talk, I will develop this technique into an acceleration scheme that minimises the Euclidean norm error over an affine subspace spanned by a number of previous iterates and one additional cycle of the Kaczmarz method. The key challenge is to find a formulation in which all parameters of the least-squares problem defining the unique minimizer are known, and to solve this problem efficiently.
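
As a concrete reference point, the sketch below implements one classical Kaczmarz cycle and, for the homogeneous case, the Gearhart-Koshy line search along the direction of a full cycle, namely the step x + t(Tx - x) with t = <x, x - Tx> / ||x - Tx||^2. This is an illustrative toy only; it does not reproduce the inhomogeneous generalisation or the multi-iterate scheme of the talk, and the function names are mine.

```python
import numpy as np

def kaczmarz_cycle(A, b, x):
    """One sweep: project the iterate onto each hyperplane <a_i, x> = b_i in turn."""
    for a_i, b_i in zip(A, b):
        x = x + (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

def accelerated_step(A, x):
    """Homogeneous system A x = 0: Gearhart-Koshy line search along a full cycle."""
    Tx = kaczmarz_cycle(A, np.zeros(A.shape[0]), x)
    d = x - Tx
    if d @ d == 0.0:                 # already a fixed point / solution
        return x
    t = (x @ d) / (d @ d)            # optimal step length for homogeneous systems
    return x + t * (Tx - x)

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))     # homogeneous system A x = 0
x = rng.standard_normal(5)
for _ in range(30):
    x = accelerated_step(A, x)
print("residual:", np.linalg.norm(A @ x))
```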

VA & Opt Webinar: David Yost

Title: Minimising the number of faces of a class of polytopes

Speaker: David Yost (Federation University Australia)

Date and Time: Wed Dec 1, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

Polytopes are the natural domains of many optimisation problems. We consider a “higher order” optimisation problem, whose domain is a class of polytopes, asking what the minimum number of faces (of a given dimension) is for this class, and which polytopes are the minimisers. Generally we consider the class of d-dimensional polytopes with V vertices, for fixed V and d. The corresponding maximisation problem was solved decades ago, but serious progress on the minimisation question has only been made in recent years. All necessary auxiliary information will be provided.

VA & Opt Webinar: Fred Roosta-Khorasani

Title: A Newton-MR Algorithm with Complexity Guarantee for Non-Convex Problems

Speaker: Fred Roosta-Khorasani (The University of Queensland)

Date and Time: Wed Dec 1, 11:00 AEST (Register here for remote connection via Zoom)

Abstract:

Classically, the conjugate gradient (CG) method has been the dominant solver in most inexact Newton-type methods for unconstrained optimization. In this talk, we consider replacing CG with the minimum residual method (MINRES), which is often used for symmetric but possibly indefinite linear systems. We show that MINRES has an inherent ability to detect negative-curvature directions. Equipped with this advantage, we discuss algorithms, under the general name of Newton-MR, which can be used for optimization of general non-convex objectives, and that come with favourable complexity guarantees. We also give numerical examples demonstrating the performance of these methods for large-scale non-convex machine learning problems.
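
To illustrate the basic substitution of MINRES for CG, here is a simplified inexact-Newton sketch using scipy.sparse.linalg.minres (an illustration only; it is not the Newton-MR algorithm of the talk and omits the negative-curvature exploitation and complexity machinery). The helper names newton_mr_like and hess_vec, and the toy objective, are hypothetical.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def newton_mr_like(f, grad, hess_vec, x0, iters=20, maxiter_mr=100):
    """Inexact Newton iteration: solve H p = -g approximately with MINRES,
    which only requires H to be symmetric (possibly indefinite), then
    safeguard the step with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        H = LinearOperator((x.size, x.size), matvec=lambda v: hess_vec(x, v))
        p, _ = minres(H, -g, maxiter=maxiter_mr)   # inexact Newton direction
        t = 1.0
        while f(x + t * p) > f(x) - 1e-4 * t * abs(g @ p) and t > 1e-8:
            t *= 0.5                               # backtracking line search
        x = x + t * p
    return x

# toy non-convex objective: f(x) = x1^4 - x1^2 + x2^2
f = lambda x: x[0]**4 - x[0]**2 + x[1]**2
grad = lambda x: np.array([4*x[0]**3 - 2*x[0], 2*x[1]])
hess_vec = lambda x, v: np.array([(12*x[0]**2 - 2)*v[0], 2*v[1]])
print(newton_mr_like(f, grad, hess_vec, np.array([1.5, 1.0])))
```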

VA & Opt Webinar: Majid Abbasov

Title: Converting exhausters and coexhausters

Speaker: Majid Abbasov (Saint-Petersburg State University)

Date and Time: Wed Nov 17, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

Exhausters and coexhausters are notions of constructive nonsmooth analysis which are used to study extremal properties of functions. An upper exhauster (coexhauster) is used to approximate the considered function in the neighborhood of a point as a minmax of linear (affine) functions. A lower exhauster (coexhauster) is used to represent the approximation as a maxmin of linear (affine) functions. Conditions for a minimum are expressed most simply by means of upper exhausters and coexhausters, while conditions for a maximum are described in terms of lower exhausters and coexhausters. Thus the problem arises of obtaining an upper exhauster or coexhauster when the lower one is given, and vice versa. In the talk I will consider this problem and present a new method for such a conversion. All needed auxiliary information will also be provided.

VA & Opt Webinar: Jane Ye

Title: Difference of convex algorithms for bilevel programs with applications in hyperparameter selection

Speaker: Jane Ye (University of Victoria, Canada)

Date and Time: Wed Nov 10, 11:00 AEST (Register here for remote connection via Zoom)

Abstract:

A bilevel program is a sequence of two optimization problems where the constraint region of the upper level problem is determined implicitly by the solution set of the lower level problem. In this talk, I will present difference of convex algorithms for solving bilevel programs in which the upper level objective functions are differences of convex functions and the lower level programs are fully convex. This nontrivial class of bilevel programs provides a powerful modelling framework for dealing with applications arising from hyperparameter selection in machine learning. Thanks to the full convexity of the lower level program, the value function of the lower level program turns out to be convex, and hence the bilevel program can be reformulated as a difference of convex program. We propose two algorithms for solving the reformulated difference of convex program and show their convergence to stationary points under very mild assumptions. Finally, we conduct numerical experiments on a bilevel model of support vector machine classification.
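
As background, the sketch below shows a generic difference of convex algorithm (DCA): the concave part is linearised at the current point and the resulting convex subproblem is solved. The toy objective and the helper name dca are my own and are unrelated to the bilevel reformulation or the two algorithms of the talk.

```python
import numpy as np
from scipy.optimize import minimize

def dca(g, grad_h, x0, iters=50):
    """Generic DCA for min f(x) = g(x) - h(x), g and h convex:
    at each step linearise h at the current point and solve the convex
    subproblem min_y g(y) - <s, y>, where s is a subgradient of h."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        s = grad_h(x)
        x = minimize(lambda y: g(y) - s @ y, x).x
    return x

# toy DC objective: g(x) = ||x||^2, h(x) = 2 ||x - (1,1)||, both convex
g = lambda y: float(y @ y)
grad_h = lambda y: 2 * (y - 1) / max(np.linalg.norm(y - 1), 1e-12)
print(dca(g, grad_h, x0=np.array([5.0, -3.0])))   # converges to a stationary point
```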

VA & Opt Webinar: Nadezda Sukhorukova

Title: Rational approximation and its role in different branches of mathematics and applications

Speaker: Nadezda Sukhorukova (Swinburne University of Technology)

Date and Time: Wed Nov 3, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

Rational approximation is a powerful function approximation tool: a function is approximated by a ratio of two polynomials, whose coefficients are subject to optimisation. Numerical methods for rational approximation have been developed independently in different branches of mathematics. In this talk, I will present the interconnections between different numerical methods developed for rational approximation. Most of them can be extended to the case of so-called generalised rational approximation, where the approximation is a ratio of two linear forms and the basis functions are not limited to monomials. Finally, I am going to talk about real-life applications of rational and generalised rational approximation.
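
As a minimal entry point, the sketch below fits a rational approximation p/q on a grid via the standard linearisation p(x) - f(x) q(x) ~ 0, with the constant coefficient of q normalised to 1, solved as a linear least-squares problem. This is an illustration of the general idea only, not one of the methods compared in the talk; the function name fit_rational and the degrees chosen are mine.

```python
import numpy as np

def fit_rational(f_vals, x, deg_p, deg_q):
    """Linearised least-squares fit of f ~ p/q on the grid x,
    with q normalised so that its constant coefficient equals 1."""
    Vp = np.vander(x, deg_p + 1, increasing=True)           # 1, x, ..., x^deg_p
    Vq = np.vander(x, deg_q + 1, increasing=True)[:, 1:]     # x, ..., x^deg_q
    M = np.hstack([Vp, -f_vals[:, None] * Vq])               # unknowns: p, then q_1..q_deg_q
    coef, *_ = np.linalg.lstsq(M, f_vals, rcond=None)
    p = coef[:deg_p + 1]
    q = np.concatenate([[1.0], coef[deg_p + 1:]])
    return p, q

x = np.linspace(-1, 1, 200)
p, q = fit_rational(np.exp(x), x, deg_p=2, deg_q=2)
approx = np.polyval(p[::-1], x) / np.polyval(q[::-1], x)
print("max error on the grid:", np.max(np.abs(approx - np.exp(x))))
```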

VA & Opt Webinar: Nghia Tran

Title: Sharp and strong minima for robust recovery

Speaker: Nghia Tran (Oakland University)

Date and Time: Wed Oct 27, 11:00 AEST (Register here for remote connection via Zoom)

Abstract:

In this talk, we show the important roles of sharp minima and strong minima for robust recovery. We also obtain several characterizations of sharp minima for convex regularized optimization problems. Our characterizations are quantitative and verifiable especially for the case of decomposable norm regularized problems including sparsity, group-sparsity, and low-rank convex problems. For group-sparsity optimization problems, we show that a unique solution is a strong solution and obtain quantitative characterizations for solution uniqueness.

VA & Opt Webinar: Sidney Morris

Title: Tweaking Ramanujan’s Approximation of n!

Speaker: Sidney Morris (Federation University, and La Trobe University)

Date and Time: Wed Oct 20, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

In 1730 James Stirling, building on the work of Abraham de Moivre, published what is known as Stirling’s approximation of n!. He gave a good formula which is asymptotic to n!. Since then hundreds of papers have given alternative proofs of his result and improved upon it, notably those by Burnside, Gosper, and Mortici. However, Srinivasa Ramanujan gave a remarkably better asymptotic formula. Hirschhorn and Villarino gave a nice proof of Ramanujan’s result and an error estimate for the approximation.

This century there have been several improvements of Stirling’s formula including by Nemes, Windschitl, and Chen. In this presentation it is shown 

(i) how all these asymptotic results can be easily verified; 

(ii) how Hirschhorn and Villarino’s argument allows a tweaking of Ramanujan’s result to give a better approximation; 

(iii) that a new asymptotic formula can be obtained by further tweaking of Ramanujan’s result;

(iv) that Chen’s asymptotic formula is better than the others mentioned here, and the new asymptotic formula is comparable with Chen’s.
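
For a quick numerical feel for the gap between the first two formulas, the sketch below compares Stirling’s approximation sqrt(2*pi*n) (n/e)^n with Ramanujan’s approximation, usually stated as n! ~ sqrt(pi) (n/e)^n (8n^3 + 4n^2 + n + 1/30)^(1/6). The later refinements mentioned in (iii) and (iv) are not reproduced here.

```python
import math

def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

def ramanujan(n):
    return (math.sqrt(math.pi) * (n / math.e) ** n
            * (8 * n**3 + 4 * n**2 + n + 1 / 30) ** (1 / 6))

for n in (5, 10, 20):
    exact = math.factorial(n)
    print(n,
          "Stirling rel. error:", abs(stirling(n) - exact) / exact,
          "Ramanujan rel. error:", abs(ramanujan(n) - exact) / exact)
```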

VA & Opt Webinar: Rubén Campoy

Title: A product space reformulation with reduced dimension

Speaker: Rubén Campoy (University of Valencia)

Date and Time: Wed Oct 13, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

The product space reformulation is a powerful trick for tackling monotone inclusions defined by finitely many operators with splitting algorithms. This technique constructs an equivalent two-operator problem, embedded in a product Hilbert space, that preserves computational tractability. Each operator in the original problem requires one dimension in the product space. In this talk, we propose a new reformulation with a reduction in the dimension of the resulting product Hilbert space. We shall discuss the case of not necessarily convex feasibility problems. As an application, we obtain a new parallel variant of the Douglas-Rachford algorithm with a reduction in the number of variables. The computational advantage is illustrated through some numerical experiments.
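
For context, here is a sketch of the classical full-dimensional product space reformulation combined with Douglas-Rachford for a feasibility problem with several sets: the r-set problem becomes a two-set problem in the product space between C_1 x ... x C_r and the diagonal. The reduced-dimension reformulation proposed in the talk is not reproduced here, and the helper names and the toy halfspace problem are my own.

```python
import numpy as np

def dr_product_space(projections, x0, iters=500):
    """Douglas-Rachford on the product space: the diagonal plays the role of
    one set, the product C_1 x ... x C_r of the other."""
    r = len(projections)
    z = np.tile(x0, (r, 1))                      # point in the product space
    for _ in range(iters):
        p_diag = np.tile(z.mean(axis=0), (r, 1)) # projection onto the diagonal
        refl_diag = 2 * p_diag - z
        p_prod = np.array([P(y) for P, y in zip(projections, refl_diag)])
        z = z + p_prod - p_diag                  # Douglas-Rachford update
    return z.mean(axis=0)                        # averaged (shadow) iterate

def proj_halfspace(a, b):
    """Projection onto the halfspace <a, x> <= b."""
    return lambda x: x - max(0.0, a @ x - b) / (a @ a) * a

# toy feasibility problem: three halfspaces in R^2
Ps = [proj_halfspace(np.array([1.0, 0.0]), 1.0),
      proj_halfspace(np.array([0.0, 1.0]), 1.0),
      proj_halfspace(np.array([-1.0, -1.0]), 0.0)]
print(dr_product_space(Ps, np.array([5.0, 5.0])))
```

Note that this classical reformulation uses one copy of the space per set, which is exactly the dimensionality the talk’s new reformulation aims to reduce.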
