

Dynamic continuous-time systems. Examples, modelling, and classification of optimal control problems. Pontryagin's maximum principle: adjoint equation, Hamiltonian system, and necessary conditions for optimality. Bellman's dynamic programming: principle of optimality, Hamilton-Jacobi-Bellman equation, and verification theorem. Linear quadratic control: Riccati equation and linear matrix inequalities. Introduction to numerical methods for solving optimal control problems.
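To illustrate the linear quadratic control topic above, here is a minimal sketch of solving the continuous-time algebraic Riccati equation numerically with SciPy. The double-integrator plant and the weight matrices `A`, `B`, `Q`, `R` are illustrative assumptions, not material from the course itself.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical example plant: a double integrator, x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state cost weight
R = np.array([[1.0]])  # control cost weight

# Solve the algebraic Riccati equation  A'P + PA - P B R^{-1} B' P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain for u = -Kx
K = np.linalg.solve(R, B.T @ P)

# Closed-loop matrix A - BK should be Hurwitz (eigenvalues in the left half-plane)
eigs = np.linalg.eigvals(A - B @ K)
print("P =\n", P)
print("K =", K)
print("closed-loop eigenvalues:", eigs)
```

For this particular plant the Riccati equation can also be solved by hand, giving P = [[sqrt(3), 1], [1, sqrt(3)]] and K = [1, sqrt(3)], which is a convenient check on the numerical solution.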

Department of Systems Engineering and Engineering Management, CUHK