Discrete and Continuous Optimization

Half of the course (the Discrete part) is taught by Prof. Martin Loebl.

Organization

During the Summer Semester 2022-2023, the lectures are scheduled on Fridays at 09:00 in S4 in Mala Strana. Tutorials are held immediately afterwards at 10:40 in S7.

Syllabus

We will follow these lecture notes by Prof. Milan Hladik.

Material Covered in the Lectures


March 31: Optimization problems: discrete vs. continuous; Examples: network flow, maximum and minimum eigenvalues via the Rayleigh-Ritz theorem; Continuous optimization: (global/local) minimum, strict (global/local) minimum; Weierstrass' theorem; Basic transformations: max <-> min, modifying the objective via increasing functions, inequalities via non-negativity-preserving functions, and equalities via root-preserving functions, moving the objective function into the constraints, elimination of equations and variables.

Recommended reading: Chapter 1.
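
To make the Rayleigh-Ritz connection concrete, here is a minimal numerical sketch in Python (it assumes NumPy; the matrix A and the use of power iteration are illustrative choices, not taken from the notes). Power iteration drives a vector toward the dominant eigenvector, so the Rayleigh quotient of the iterate approaches the maximum eigenvalue:

    import numpy as np

    def rayleigh_quotient(A, x):
        # R(x) = x^T A x / x^T x; by the Rayleigh-Ritz theorem its maximum
        # and minimum over nonzero x are the extreme eigenvalues of symmetric A.
        return x @ A @ x / (x @ x)

    def power_iteration(A, iters=500):
        # Repeated multiplication by A turns x toward the eigenvector of the
        # eigenvalue with largest magnitude (here lambda_max, since this A is
        # positive definite).
        x = np.random.default_rng(0).standard_normal(A.shape[0])
        for _ in range(iters):
            x = A @ x
            x /= np.linalg.norm(x)
        return rayleigh_quotient(A, x)

    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # illustrative symmetric matrix
    print(power_iteration(A))                # approx. 4.618
    print(np.linalg.eigvalsh(A).max())       # exact lambda_max for comparison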


April 14: First order necessary optimality condition; second order necessary optimality condition; second order sufficient optimality condition; Example: least squares; Convexity: convex sets, separation theorem for convex sets, convex functions.

Recommended reading: Chapter 2, Chapter 3 (until page 22).
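
As a small sanity check of the least squares example, the sketch below (Python with NumPy assumed; the random data is illustrative) solves the first order condition directly and compares the result with a library solver:

    import numpy as np

    # Least squares: minimize f(x) = ||Ax - b||^2. The first order condition
    # grad f(x) = 2 A^T (A x - b) = 0 gives the normal equations
    # A^T A x = A^T b; the Hessian 2 A^T A is positive semidefinite, so the
    # stationary point is a global minimizer.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 3))
    b = rng.standard_normal(20)

    x = np.linalg.solve(A.T @ A, A.T @ b)          # normal equations
    x_ref = np.linalg.lstsq(A, b, rcond=None)[0]   # library solution
    print(np.allclose(x, x_ref))                   # True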


April 21: No lecture.


April 28: Convexity; Convex functions: Jensen's inequality; characterization of convex functions via convexity of epigraphs; first and second order characterizations of convex functions; other rules for convexity detection: addition, products, composition, etc.; Convex optimization; basic properties: every local minimum is a global minimum, the optimal solution set is convex, a strictly convex function has at most one minimum; the first order necessary optimality condition is sufficient for optimality in convex programs; examples; quadratic programming.

Recommended reading: Chapter 3, Sections 4.1, 4.2.
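
The following sketch (Python with NumPy assumed; the data Q, c is an illustrative choice) shows the second order convexity test and the fact that, for a convex quadratic, the first order condition already certifies a global minimum:

    import numpy as np

    # f(x) = 1/2 x^T Q x + c^T x has Hessian Q, so f is convex iff Q is
    # positive semidefinite; then any stationary point, i.e. any solution of
    # Q x + c = 0, is a global minimizer.
    Q = np.array([[2.0, 0.5], [0.5, 1.0]])
    c = np.array([-1.0, 1.0])

    print(np.linalg.eigvalsh(Q).min() >= 0)   # True: f is convex
    x_star = np.linalg.solve(Q, -c)           # solve the first order condition
    print(x_star)                             # global minimizer of f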


May 5: NP-hardness of quadratic programming; Convex cone programming: duality; second order cone programming; LP and SDP as special cases of convex cone programming; remarks on computational complexity: the ellipsoid algorithm for convex programming, NP-hardness of copositive programming.

Recommended reading: Sections 4.3, 4.4.
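
As an illustration of second order cone programming, the sketch below recasts a least squares problem as an SOCP (it assumes the cvxpy modeling package and NumPy; the data is illustrative). Minimizing t subject to ||Ax - b|| <= t is the epigraph reformulation, i.e. the "moving the objective into the constraints" transformation from the first lecture:

    import cvxpy as cp
    import numpy as np

    # Epigraph form of min ||Ax - b||_2 as a second order cone program:
    # minimize t subject to ||A x - b||_2 <= t.
    rng = np.random.default_rng(2)
    A = rng.standard_normal((5, 2))
    b = rng.standard_normal(5)

    x = cp.Variable(2)
    t = cp.Variable()
    prob = cp.Problem(cp.Minimize(t), [cp.norm(A @ x - b, 2) <= t])
    prob.solve()
    print(prob.status, x.value)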


May 12: KKT conditions; Methods: line search, Newton's method, gradient methods, the active-set method, penalty and barrier methods.

Recommended reading: Chapter 5, Chapter 6.
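
A minimal sketch of Newton's method (Python with NumPy assumed; the test function is an illustrative strongly convex choice, not taken from the notes):

    import numpy as np

    # Newton's method: x_{k+1} = x_k - H(x_k)^{-1} grad f(x_k), applied to
    # the strongly convex function f(x) = exp(x1 + x2) + x1^2 + x2^2.
    def grad(x):
        e = np.exp(x[0] + x[1])
        return np.array([e + 2 * x[0], e + 2 * x[1]])

    def hess(x):
        e = np.exp(x[0] + x[1])
        return np.array([[e + 2.0, e], [e, e + 2.0]])

    x = np.array([1.0, -0.5])
    for _ in range(10):
        x -= np.linalg.solve(hess(x), grad(x))
    print(x, np.linalg.norm(grad(x)))   # gradient is ~0 at the minimizer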


May 19: Robust optimization: interval uncertainty for LPs with nonnegativity constraints, interval uncertainty for general LPs; Remarks on ellipsoidal uncertainty and concave programming.

Recommended reading: Chapter 7.
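
A sketch of the robust counterpart under interval uncertainty with nonnegative variables (Python with SciPy and NumPy assumed; the interval data is illustrative). Because x >= 0, the worst case over the intervals is attained at their endpoints, so the robust problem is again an ordinary LP:

    import numpy as np
    from scipy.optimize import linprog

    # Robust LP: max c^T x s.t. A x <= b, x >= 0, where c in [c_lo, c_hi],
    # A in [A_lo, A_hi], b in [b_lo, b_hi]. Since x >= 0, the robust
    # counterpart is: max c_lo^T x s.t. A_hi x <= b_lo, x >= 0.
    c_lo = np.array([2.0, 3.0])
    A_hi = np.array([[1.2, 1.1], [0.9, 2.1]])
    b_lo = np.array([4.0, 6.0])

    # linprog minimizes, so negate the objective to maximize c_lo^T x.
    res = linprog(-c_lo, A_ub=A_hi, b_ub=b_lo, bounds=[(0, None)] * 2)
    print(res.x, -res.fun)   # robust optimal solution and value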
End of Lectures.