This book presents cutting-edge contributions in the areas of control theory and partial differential equations. Over the
decades, control theory has had deep and fruitful interactions with the theory of partial differential equations (PDEs). Well-known
examples are the study of generalized solutions of the Hamilton-Jacobi-Bellman equations arising in deterministic and stochastic
optimal control, and the development of modern analytical tools to study the controllability of infinite-dimensional systems
governed by PDEs. In the present volume, leading experts provide an up-to-date overview of the connections between these two
vast fields of mathematics. Topics addressed include the regularity of the value function associated with finite-dimensional control
systems, controllability and observability for PDEs, and the asymptotic analysis of multiagent systems. The book will be of interest
to both researchers and graduate students working in these areas.