optimal control theory

[′äp·tə·məl kən′trōl ‚thē·ə·rē] (control systems) An extension of the calculus of variations for dynamic systems with one independent variable, usually time, in which control (input) variables are determined to maximize (or minimize) some measure of the performance (output) of a system while satisfying specified constraints.
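As a concrete illustration of the definition, the sketch below solves one classic optimal control problem: the finite-horizon, discrete-time linear-quadratic regulator (LQR). The dynamics, cost weights, and helper name `lqr_gains` are all assumptions chosen for this example, not part of the entry; the control (input) sequence is determined to minimize a quadratic measure of performance (output) while satisfying the system's dynamic constraints.

```python
import numpy as np

def lqr_gains(A, B, Q, R, Qf, N):
    """Backward Riccati recursion for the finite-horizon LQR problem:
    minimize sum_k (x'Qx + u'Ru) + x_N'Qf x_N subject to x[k+1] = Ax[k] + Bu[k].
    Returns the time-varying feedback gains K[0..N-1] (optimal u = -K x)."""
    P = Qf
    gains = []
    for _ in range(N):
        # Gain at this stage: K = (R + B'PB)^{-1} B'PA
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update for the cost-to-go matrix
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    gains.reverse()  # recursion runs backward in time
    return gains

# Example (assumed): a double integrator driven toward the origin.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.eye(2)            # penalty on state (performance measure)
R = np.array([[0.1]])    # penalty on control effort
gains = lqr_gains(A, B, Q, R, Qf=10 * np.eye(2), N=50)

x = np.array([[1.0], [0.0]])
for K in gains:
    x = A @ x - B @ (K @ x)   # apply the optimal control u = -K x
print(float(np.linalg.norm(x)))  # state norm shrinks toward zero
```

The dynamics constraint here plays the role that the Euler-Lagrange conditions play in the calculus of variations; for continuous-time problems the analogous machinery is Pontryagin's maximum principle or the Hamilton-Jacobi-Bellman equation.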