POSTDOCTORAL RESEARCH FELLOW (4 POSITIONS), University of Melbourne

Job no: 0051226

Work type: Fixed Term

Location: Parkville

Division/Faculty: Faculty of Science

Department/School: School of Mathematics and Statistics

Role & Superannuation rate: Academic – Full time – 9.5% super

Salary: Level A: $73,669 – $99,964 p.a. (PhD entry level $93,120) or Level B: $105,232 – $124,958 p.a. Level of appointment is subject to the applicant's qualifications and experience.

The School of Mathematics and Statistics at the University of Melbourne has four two-year research positions available for exceptional early career researchers in the mathematical sciences, whose research has the potential to have a significant impact, either in fundamental research or toward practical applications. The School maintains activity in all areas of the mathematical sciences, and the positions can be related to any area.

For more details please refer to

http://jobs.unimelb.edu.au/caw/en/job/903384/postdoctoral-research-fellow-4-positions

VA & Opt Webinar: Vinesha Peiris

Title: The extension of linear inequality method for generalised rational Chebyshev approximation

Speaker: Vinesha Peiris (Swinburne)

Date and Time: November 11th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In this talk we will demonstrate the correspondence between the linear inequality method developed for rational Chebyshev approximation and the bisection method used in quasiconvex optimisation. It naturally connects rational and generalised rational Chebyshev approximation problems with modern developments in the area of quasiconvex functions. Moreover, the linear inequality method can be extended to a broader class of Chebyshev approximation problems, where the corresponding objective functions remain quasiconvex.
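
The bisection idea referred to in the abstract can be sketched briefly: a quasiconvex minimisation problem reduces to a sequence of feasibility checks on sublevel sets, bisecting on the objective level t. The sketch below is a generic illustration under an assumed feasibility oracle, not the speaker's method; in rational Chebyshev approximation the oracle would amount to checking a system of linear inequalities.

```python
def quasiconvex_bisection(feasible, t_lo, t_hi, tol=1e-8):
    """Bisection on the objective level t. `feasible(t)` reports whether
    the sublevel set {x : f(x) <= t} is nonempty (a convex feasibility
    problem when f is quasiconvex). Assumes feasible(t_hi) is True and
    feasible(t_lo) is False; returns the optimal level to within tol."""
    while t_hi - t_lo > tol:
        t_mid = 0.5 * (t_lo + t_hi)
        if feasible(t_mid):
            t_hi = t_mid   # a solution exists at this level; tighten from above
        else:
            t_lo = t_mid   # infeasible; the optimal value lies above t_mid
    return t_hi

# Toy example: f(x) = |x - 3| is quasiconvex and {x : f(x) <= t} is
# nonempty exactly when t >= 0, so the optimal value is 0.
best = quasiconvex_bisection(lambda t: t >= 0.0, t_lo=-1.0, t_hi=10.0)
```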

ARC PhD Scholarship ($31,885pa): Switching Dynamics Approach for Distributed Global Optimisation

https://www.rmit.edu.au/students/student-essentials/information-for/research-candidates/enriching-your-candidature/grants-and-scholarships/postgraduate-by-research/switching-dynamics-approach-distributed-global-optimisation

The fast growth of big data in industrial systems makes it harder to find optimal solutions for timely decision-making. This project aims to create a breakthrough switching dynamics approach and new technology to speed up the search for optimal solutions. It will develop a distributed, switching-dynamics-based optimisation scheme for global optimisation problems in big-data environments, resulting in a practical technology for industry applications (e.g. smart grids).

The specific objectives of this Project are:

1. Establish a switching dynamics approach for global optimisation, forming the foundation to accelerate convergence to search for optimal solutions.

2. Create an intelligent distributed global optimisation scheme with switching dynamics based multi-agent system concepts, which is scalable to big-data optimisation tasks.

This is a project funded by an Australian Research Council (ARC) Discovery Grant for three years (2021-2023), which aims to develop a breakthrough switching dynamics approach and new technology for global optimisation tasks in big-data applications.

The successful applicant will work on this project for the PhD in the School of Science at RMIT University supervised by Prof. Andrew Eberhard and carried out in collaboration with Prof. Xinghuo Yu (Electrical Engineering) at RMIT.

Qualifications

You are required to have a Bachelor degree in a relevant discipline, such as Mathematical Sciences or Electrical Engineering, with at least second-class upper honours or equivalent. Experience in one or more of Nonlinear Dynamical Systems, Discontinuous Control Systems, Optimisation Theory and/or Optimisation Algorithms is desirable. The applicant must have a strong background in mathematics.

Application

A CV detailing your qualifications, research experience and achievements, a statement of your suitability to this project, and contact details of two referees are to be emailed to Professor Andrew Eberhard at andy.eberhard@rmit.edu.au.  For further information, please contact Prof. Andrew Eberhard directly.

The Euler International Mathematical Institute in St. Petersburg is seeking postdocs in all areas of Mathematics, Theoretical Computer Science, Mathematical and Theoretical Physics.

Call for Postdocs 2020

Applicants should send their applications to euler.postdoc@gmail.com

The applications should include:

  • CV,
  • List of publications (including preprints, if necessary),
  • Description of research interests, ideally mentioning a possible host or other research contacts in St. Petersburg,
  • Names, affiliations and contacts of 2-3 people willing to send recommendation letters if asked by the committee,
  • Any special requirements with respect to dates, etc.

Basic conditions:

  • Competitive salary of 126,314 RUB per month (taxed at 13% for residents and foreigners), this is double the average salary in St. Petersburg,
  • Housing allowance enough to cover all or most of the rent (in addition to the salary),
  • The institute partially covers travelling expenses to St. Petersburg of up to 300 Euro for the postdocs from Europe and up to 600 Euro for the postdocs outside Europe,
  • The institute has some funds for covering participation in conferences that cannot be covered from other sources,
  • Appointments of 1 or 2 years, extendable for another year,
  • Small teaching load,
  • Flexibility with respect to the starting date, length, specific calendar requirements (such as a leave in the middle),

St. Petersburg is the most beautiful city in the world and has multiple mathematical locations including Steklov Institute of Mathematics http://www.pdmi.ras.ru/pdmi/en/laboratories and the newly created Department of Mathematics and Computer Science in St. Petersburg State University http://math-cs.spbu.ru/en/people/ (the links are provided also as a “menu” of possible hosts).

Preference is given to applications completed before November 30, 2020. The preferred starting date is September 1, 2021.

If you have questions, please do not hesitate to ask them by email.

VA & Opt Webinar: Chayne Planiden (UoW)

Title: New Gradient and Hessian Approximation Methods for Derivative-free Optimisation

Speaker: Chayne Planiden (UoW)

Date and Time: November 4th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In general, derivative-free optimisation (DFO) uses approximations of first- and second-order information in minimisation algorithms. DFO appears in direct-search, model-based, trust-region and other mainstream optimisation techniques, and has gained popularity in recent years. This talk discusses previous results on some particular uses of DFO, the proximal bundle method and the VU-algorithm, and then presents improvements made this year to the gradient and Hessian approximation techniques. These improvements can be inserted into any routine that requires such estimations.
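
As a baseline illustration of the kind of first-order estimate DFO methods rely on (not the improved estimators of the talk), a simple forward-difference gradient approximation looks like this:

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference approximation of the gradient of f at x,
    using only function evaluations (no derivatives). The error is
    O(h) per coordinate for smooth f."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h
        g[i] = (f(x + step) - fx) / h
    return g

# Example: f(x) = x0^2 + 3*x1 has gradient (2*x0, 3); at (1, 2) that is (2, 3).
g = fd_gradient(lambda x: x[0]**2 + 3.0 * x[1], [1.0, 2.0])
```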

VA & Opt Webinar: Radek Cibulka (University of West Bohemia)

Title: Continuous selections for inverse mappings in Banach spaces

Speaker: Radek Cibulka (University of West Bohemia)

Date and Time: October 28th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Influenced by a recent work of A. V. Arutyunov, A. F. Izmailov, and S. E. Zhukovskiy, we establish a general Ioffe-type criterion guaranteeing the existence of a continuous and calm selection for the inverse of a single-valued uniformly continuous mapping between Banach spaces with a closed domain. We show that the general statement yields elegant proofs following the same pattern as in the case of the usual openness with a linear rate, by considering mappings instead of points. As with Ioffe's criterion for linear openness around the reference point, this allows us to avoid iteration, that is, the construction of a sequence of continuous functions whose limit is the desired continuous selection for the inverse mapping; this is illustrated by a proof of the Bartle-Graves theorem. We then formulate sufficient conditions based on approximations given by positively homogeneous mappings and bunches of linear operators. The talk is based on joint work with Marián Fabian.

VA & Opt Webinar: Wilfredo Sosa (UCB)

Title: On diametrically maximal sets, maximal premonotone maps and premonotone bifunctions

Speaker: Wilfredo Sosa (UCB)

Date and Time: October 21st, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: First, we study diametrically maximal sets in Euclidean space (those not properly contained in a set with the same diameter), establishing their main properties. We then use these sets to exhibit an explicit family of maximal premonotone operators. We also establish some relevant properties of maximal premonotone operators, such as their local boundedness, and finally we introduce the notion of premonotone bifunctions, presenting a canonical relation between premonotone operators and bifunctions that extends the well-known one holding in the monotone case.

VA & Opt Webinar: Björn Rüffer (UoN)

Title: A Lyapunov perspective to projection algorithms

Speaker: Björn Rüffer (UoN)

Date and Time: October 14th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: The operator theoretic point of view has been very successful in the study of iterative splitting methods under a unified framework. These algorithms include the Method of Alternating Projections as well as the Douglas-Rachford Algorithm, which is dual to the Alternating Direction Method of Multipliers, and they allow nice geometric interpretations. While convergence results for these algorithms have been known for decades when problems are convex, for non-convex problems progress on convergence results has significantly increased once arguments based on Lyapunov functions were used. In this talk we give an overview of the underlying techniques in Lyapunov’s direct method and look at convergence of iterative projection methods through this lens.
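
As a concrete instance of the iterative projection methods mentioned in the abstract, here is a minimal sketch of the Method of Alternating Projections on two closed balls with nonempty intersection; the iterates converge to a point in the intersection. This is a toy illustration, not a fragment of the talk.

```python
import numpy as np

def project_ball(x, center, radius):
    """Euclidean projection of x onto the closed ball B(center, radius)."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def alternating_projections(proj_A, proj_B, x0, iters=200):
    """Method of Alternating Projections: repeatedly project onto A, then B."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = proj_B(proj_A(x))
    return x

# Two unit balls centred at (0,0) and (1,0); their intersection is nonempty
# (it contains (0.5, 0)), so the iterates settle in the intersection.
proj_A = lambda x: project_ball(x, np.array([0.0, 0.0]), 1.0)
proj_B = lambda x: project_ball(x, np.array([1.0, 0.0]), 1.0)
x_star = alternating_projections(proj_A, proj_B, [3.0, 2.0])
```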