F10MM - Optimisation
Course leader(s):
Aims
The main aim of this course is to present a range of methods for solving optimisation problems. We will study numerical methods for unconstrained and constrained problems, covering convex and nonconvex optimisation, smooth optimisation, and linear programming. The course includes both theory (convergence theorems) and implementation (Python programming).
Syllabus
1. Unconstrained optimisation
   1.1 Existence and uniqueness of minimisers
   1.2 Optimality conditions
   1.3 Line search methods
2. Constrained optimisation
   2.1 The Lagrange Multiplier Theorem
   2.2 The Karush-Kuhn-Tucker Theorem
3. Linear programming
   3.1 Standard form
   3.2 Optimality conditions
   3.3 Duality
   3.4 Fundamental Theorem of Linear Programming
   3.5 The simplex method
Learning outcomes
By the end of the course, students should be able to do the following:
- verify whether given sets and functions are convex.
- verify whether a given point is a local or global minimiser of an unconstrained optimisation problem using first- and second-order necessary and sufficient optimality conditions.
- solve unconstrained optimisation problems using line search methods such as the steepest descent method, quasi-Newton methods, and Newton’s method.
- write down and solve the Lagrange Multiplier equations and the Karush-Kuhn-Tucker conditions for constrained optimisation problems.
- write down a linear programming problem in standard form and find its dual.
- write down and solve the optimality conditions for primal and dual linear programming problems.
- prove basic theorems in optimisation and demonstrate an understanding of the convergence theory of optimisation algorithms.
- solve optimisation problems using the programming language Python; two illustrative sketches follow this list.
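As an illustration of the line search methods named above, the following is a minimal sketch (not course material) of the steepest descent method with a backtracking (Armijo) line search; the objective function and all parameter values are assumptions chosen purely for the example.

```python
# Minimal sketch: steepest descent with backtracking (Armijo) line search.
# The quadratic objective and the parameter values below are illustrative only.
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000,
                     alpha0=1.0, rho=0.5, c=1e-4):
    """Minimise f from x0 using steepest descent with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # first-order stopping criterion
            break
        p = -g                        # steepest descent direction
        alpha = alpha0
        # Shrink the step until the Armijo sufficient-decrease condition holds
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
    return x

# Example: minimise f(x, y) = (x - 1)^2 + 10 y^2, whose minimiser is (1, 0)
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * x[1]])
print(steepest_descent(f, grad, x0=[5.0, 3.0]))   # approximately [1., 0.]
```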
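For the linear programming part, the following sketch shows one way to solve a problem in standard form (minimise c^T x subject to Ax = b, x >= 0) with SciPy's linprog routine; the problem data is made up for illustration and is not taken from the course.

```python
# Minimal sketch: a linear programme in standard form solved with SciPy.
# The data (c, A_eq, b_eq) is an illustrative example, not course material.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0, 0.0])          # objective coefficients
A_eq = np.array([[1.0, 1.0, 1.0]])     # equality constraints A x = b
b_eq = np.array([4.0])
bounds = [(0, None)] * 3                # x >= 0 componentwise

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.x, res.fun)                   # optimal point and objective value
```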
Further details
SCQF Level: 10
Credits: 15