Dynamic continuous-time systems. Examples, modelling, and classification of optimal control problems. Pontryagin’s maximum principle: adjoint equation, Hamiltonian system, and sufficient conditions for optimality. Bellman’s dynamic programming: principle of optimality, Hamilton-Jacobi-Bellman equation, and verification theorem. Linear quadratic control: Riccati equation and linear matrix inequalities. Introduction to numerical methods for solving optimal control problems.
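As a small illustration of the linear quadratic control topic, the following sketch solves the continuous-time algebraic Riccati equation for a standard double-integrator plant (a textbook example chosen here for illustration, not taken from the course materials) using SciPy's `solve_continuous_are`:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator plant: x1' = x2, x2' = u (illustrative example)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting in the quadratic cost
R = np.array([[1.0]])  # control weighting in the quadratic cost

# Solve the algebraic Riccati equation A'P + PA - PB R^{-1} B'P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)

# The closed-loop matrix A - BK is Hurwitz (all eigenvalues in the left half-plane)
eigs = np.linalg.eigvals(A - B @ K)
print("P =", P)
print("K =", K)
print("closed-loop eigenvalues:", eigs)
```

For this plant the solution has a closed form, P = [[√3, 1], [1, √3]] and K = [1, √3], which makes it a convenient check for the numerical solver.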