Syllabus
Problem formulation – Performance measures – Selection of performance measures – Dynamic programming – Optimal control law – Application to a routing problem – Recurrence relations – Computational procedures – Alternative approach through the Hamilton-Jacobi-Bellman equation – Review of calculus of variations: functionals involving several independent functions – Constrained minimization of functionals – Optimal control: variational approach – Necessary conditions for optimal control – Pontryagin's minimum principle – Additional necessary conditions – Minimum-time problems – Optimal control with switching (bang-bang control) – Numerical techniques for the solution of optimal control problems – Two-point boundary value problems.
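For orientation on the dynamic programming topics listed above, the Hamilton-Jacobi-Bellman equation can be sketched as follows; the symbols V, f, g, h, and U used here are generic choices for the value function, system dynamics, running cost, terminal cost, and admissible control set, not notation fixed by the course texts. The minimizing u at each (x, t) yields the optimal control law.

\[
-\frac{\partial V}{\partial t}(x,t) \;=\; \min_{u \in U}\Big[\, g(x,u,t) + \nabla_x V(x,t)^{\top} f(x,u,t) \,\Big],
\qquad V(x,T) = h(x).
\]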
Text Books
Same as References
References
1. Kirk, D. E., Optimal Control Theory: An Introduction, Dover (1998).
2. Bryson Jr., A. E. and Ho, Y.-C., Applied Optimal Control: Optimization, Estimation, and Control, Taylor & Francis (1975).
3. Subchan, S. and Zbikowski, R., Computational Optimal Control: Tools and Practice, Wiley (2009).
4. Naidu, D. S., Optimal Control Systems, CRC Press (2002).