VA & Opt Webinar: David Yost

Title: Minimising the number of faces of a class of polytopes

Speaker: David Yost (Federation University Australia)

Date and Time: Wed Dec 1, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

Polytopes are the natural domains of many optimisation problems. We consider a “higher order” optimisation problem, whose domain is a class of polytopes, asking what is the minimum number of faces (of a given dimension) for this class, and which polytopes are the minimisers. Generally we consider the class of d-dimensional polytopes with V vertices, for fixed V and d. The corresponding maximisation problem was solved decades ago, but serious progress on the minimisation question has only been made in recent years.
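
To make the flavour of the problem concrete (a standard back-of-the-envelope computation, not taken from the talk), the case d = 3 can be settled with Euler's formula alone:

```latex
\[
V - E + F = 2 \qquad \text{(Euler's formula, } d = 3\text{)}
\]
Each vertex lies on at least three edges, so $2E \ge 3V$; eliminating $E$ gives
the minimisation bound
\[
F \;\ge\; \tfrac{V}{2} + 2,
\]
attained exactly when every vertex has degree three (simple polytopes).
Dually, each face has at least three edges, so $2E \ge 3F$, which yields the
classical maximisation bound $F \le 2V - 4$, attained by simplicial polytopes.
In higher dimensions no such elementary argument is available, which is what
makes the minimisation question difficult.
```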

VA & Opt Webinar: Fred Roosta-Khorasani

Title: A Newton-MR Algorithm with Complexity Guarantee for Non-Convex Problems

Speaker: Fred Roosta-Khorasani (The University of Queensland)

Date and Time: Wed Dec 1, 11:00 AEST (Register here for remote connection via Zoom)

Abstract:

Classically, the conjugate gradient (CG) method has been the dominant solver in most inexact Newton-type methods for unconstrained optimization. In this talk, we consider replacing CG with the minimum residual method (MINRES), which is often used for symmetric but possibly indefinite linear systems. We show that MINRES has an inherent ability to detect negative-curvature directions. Equipped with this advantage, we discuss algorithms, under the general name of Newton-MR, which can be used for optimization of general non-convex objectives, and that come with favourable complexity guarantees. We also give numerical examples demonstrating the performance of these methods for large-scale non-convex machine learning problems.
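
As a rough illustration of the core idea (a sketch only, not the Newton-MR algorithm of the talk), the following Python fragment swaps MINRES in for CG as the inner solver of a damped Newton iteration. The crude steepest-descent fallback stands in for the negative-curvature handling described in the abstract; step-size and stopping rules are simplified placeholders.

```python
import numpy as np
from scipy.sparse.linalg import minres

def newton_minres(f, grad, hess, x0, tol=1e-6, max_iter=100):
    """Damped Newton iteration using MINRES for the inner linear solve."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)               # symmetric, possibly indefinite for non-convex f
        p, _ = minres(H, -g)      # MINRES copes with indefinite H, unlike CG
        if g.dot(p) >= 0:         # not a descent direction: crude fallback,
            p = -g                # standing in for negative-curvature handling
        t = 1.0                   # Armijo backtracking line search
        while f(x + t * p) > f(x) + 1e-4 * t * g.dot(p) and t > 1e-12:
            t *= 0.5
        x = x + t * p
    return x

if __name__ == "__main__":
    # Two-dimensional Rosenbrock function as a small non-convex test problem.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    hess = lambda x: np.array([
        [2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])
    print(newton_minres(f, grad, hess, np.array([-1.2, 1.0])))
```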

Open position as Researcher in Optimization at KTH, Sweden.

The project focuses on Optimization for Smart and Sustainable Power Systems.

We are searching for a Researcher in Optimization for this project, a collaboration between the departments of Mathematics and Electrical Engineering. A PhD degree in a suitable field is required.

It’s a great opportunity to develop and apply advanced optimization techniques to important and challenging real-world applications, and to develop smart energy solutions.

More information and how to apply:

https://www.kth.se/en/om/work-at-kth/lediga-jobb/what:job/jobID:452068

Four positions in mathematics now on the UniSA website

Applications close at 11:30pm on Monday 6 December, with interviews the week of Monday 13 December.

3 x continuing Teaching Research Lecturers (Academic Level B) in one or more of applied statistics, applied optimisation, and classical applied mathematical modelling.

More here

1 x 12-month Teaching Focussed Lecturer (Academic Level A or B). Note that Level A applicants do not need a PhD.

More here

VA & Opt Webinar: Majid Abbasov

Title: Converting exhausters and coexhausters

Speaker: Majid Abbasov (Saint-Petersburg State University)

Date and Time: Wed Nov 17, 17:00 AEST (Register here for remote connection via Zoom)

Abstract:

Exhausters and coexhausters are notions of constructive nonsmooth analysis which are used to study extremal properties of functions. An upper exhauster (coexhauster) is used to approximate a function in the neighbourhood of a point as a minmax of linear (affine) functions. A lower exhauster (coexhauster) represents the approximation as a maxmin of linear (affine) functions. Conditions for a minimum are most simply expressed by means of upper exhausters and coexhausters, while conditions for a maximum are described in terms of lower exhausters and coexhausters. Thus the problem arises of obtaining an upper exhauster or coexhauster when the lower one is given, and vice versa. In the talk I will consider this problem and present a new method for such a conversion. All needed auxiliary information will also be provided.
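
For orientation, the standard definitions from the literature (due to Demyanov and co-authors; the notation here is ours, not the speaker's) are as follows. For a positively homogeneous function h:

```latex
\[
h(\Delta) \;=\; \min_{C \in E^{*}} \,\max_{v \in C}\, \langle v, \Delta \rangle
\qquad \text{(upper exhauster } E^{*}\text{),}
\]
\[
h(\Delta) \;=\; \max_{C \in E_{*}} \,\min_{v \in C}\, \langle v, \Delta \rangle
\qquad \text{(lower exhauster } E_{*}\text{),}
\]
where $E^{*}$ and $E_{*}$ are families of convex compact sets. Coexhausters
play the same role for approximations by affine rather than linear functions.
```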

VA & Opt Webinar: Jane Ye

Title: Difference of convex algorithms for bilevel programs with applications in hyperparameter selection

Speaker: Jane Ye (University of Victoria, Canada)

Date and Time: Wed Nov 10, 11:00 AEST (Register here for remote connection via Zoom)

Abstract:

A bilevel program is a sequence of two optimization problems where the constraint region of the upper level problem is determined implicitly by the solution set of the lower level problem. In this talk, I will present difference of convex algorithms for solving bilevel programs in which the upper level objective functions are difference of convex functions and the lower level programs are fully convex. This nontrivial class of bilevel programs provides a powerful modelling framework for dealing with applications arising from hyperparameter selection in machine learning. Thanks to the full convexity of the lower level program, the value function of the lower level program turns out to be convex, and hence the bilevel program can be reformulated as a difference of convex bilevel program. We propose two algorithms for solving the reformulated difference of convex program and show their convergence to stationary points under very mild assumptions. Finally we conduct numerical experiments on a bilevel model of support vector machine classification.
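
A sketch of the value-function reformulation behind the abstract (standard in the bilevel literature; the notation is ours, not necessarily the speaker's):

```latex
\[
\min_{x,\,y}\; F(x,y)
\quad\text{s.t.}\quad
y \in \operatorname*{arg\,min}_{y'} \{\, f(x,y') : g(x,y') \le 0 \,\},
\]
with value function $V(x) := \min_{y'} \{\, f(x,y') : g(x,y') \le 0 \,\}$,
is equivalent to the single-level problem
\[
\min_{x,\,y}\; F(x,y)
\quad\text{s.t.}\quad
f(x,y) - V(x) \le 0, \qquad g(x,y) \le 0.
\]
When the lower level program is fully convex (jointly in $(x,y)$), $V$ is convex
as a partial minimum of a convex function, so $f(x,y) - V(x)$ is a difference of
convex functions; with a DC upper level objective, the whole problem becomes a
DC program.
```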