This work presents recent mathematical methods in the area of optimal control with a particular emphasis on the computational
aspects and applications. Optimal control theory concerns the design of control strategies for complex dynamical systems so as to optimize some measure of their performance. The field emerged in the 1960s, driven by the "space race" between the US and the former USSR, and it now has a far wider scope, embracing areas that range from process control to traffic flow optimization, the exploitation of renewable resources, and the management of financial markets. These emerging applications demand ever more efficient numerical methods, since the huge number of variables involved makes their solution very difficult.
The chapters of this volume give an up-to-date presentation of several recent methods in this area including fast dynamic
programming algorithms, model predictive control, and max-plus techniques. The book is addressed to researchers, graduate
students, and applied scientists working on control problems, differential games, and their applications.