Optimal Control System

an automatic control system that ensures the best, or optimal, functioning of the object of control from a particular point of view. The characteristics of the object, as well as the external disturbances acting on it, may change in an unforeseen manner but usually remain within certain limits.

The optimal functioning of a control system is described by the criterion of optimal control, also called the criterion of optimality or the target function, which is a quantity that defines the efficiency of achieving the goal of control and depends on the change in time or space of the coordinates and parameters of the system. Various technical and economic indexes of the functioning of the object may be the criterion of optimality; among them are efficiency, speed of operation, average or maximum deviation of system parameters from assigned values, prime cost of the product, and certain indexes of product quality or a generalized quality index. The criterion of optimality may apply to a transient process, a stable process, or both.
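A common regular criterion of this kind (given here as an illustration, not taken from the article) is the quadratic cost functional, which penalizes both the deviation of the system coordinates $x(t)$ from their assigned values and the control effort $u(t)$:

```latex
J = \int_{0}^{T} \bigl( x^{\top}(t)\, Q\, x(t) + u^{\top}(t)\, R\, u(t) \bigr)\, dt ,
```

where $Q$ and $R$ are weighting matrices chosen by the designer; optimal control then means selecting $u(t)$ so that $J$ assumes its minimum value.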

A distinction is made between regular and statistical criteria of optimality. Regular criteria depend on regular parameters and on the coordinates of the controlled and controlling systems. Statistical criteria are used when the input signals are random functions and/or when random disturbances generated by certain elements of the system must be taken into account. In terms of a mathematical description, the criterion of optimality may be either a function of a finite number of parameters and coordinates of the controlled process, which assumes an extreme value when the system is functioning optimally, or a functional of the function that describes the control rule; in this case, the form of the function for which the functional assumes an extreme value is determined. Pontryagin’s maximum principle or the theory of dynamic programming is used to calculate an optimal system.
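The dynamic-programming calculation mentioned above can be sketched for the simplest case: a scalar linear system with a quadratic criterion, solved by a backward Riccati recursion. The system coefficients, weights, and horizon below are illustrative assumptions, not values from the article.

```python
# Dynamic programming for a scalar finite-horizon linear-quadratic
# regulator: x[k+1] = a*x[k] + b*u[k], cost sum of q*x^2 + r*u^2.

def lqr_gains(a, b, q, r, horizon):
    """Backward Riccati recursion; returns the feedback gains K_k."""
    p = q          # terminal cost weight P_N = q
    gains = []
    for _ in range(horizon):
        k = a * b * p / (r + b * b * p)    # optimal gain at this stage
        p = q + a * a * p - a * b * p * k  # Riccati update (one stage back)
        gains.append(k)
    gains.reverse()  # the recursion ran backward in time
    return gains

def simulate(a, b, gains, x0):
    """Apply the optimal rule u_k = -K_k * x_k; return the trajectory."""
    x, traj = x0, [x0]
    for k in gains:
        x = a * x + b * (-k * x)
        traj.append(x)
    return traj

# An unstable plant (a = 1.2) is driven toward zero by the optimal gains.
traj = simulate(1.2, 1.0, lqr_gains(1.2, 1.0, q=1.0, r=0.1, horizon=20), x0=5.0)
```

The recursion is the discrete counterpart of Bellman's principle: each gain is computed from the cost-to-go of the remaining stages.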

Optimal functioning of complex objects is achieved by using adaptive control systems, which, while functioning, are capable of automatically changing their control algorithms, characteristics, or structure so as to keep the criterion of optimality at its extreme value as the parameters and operating conditions of the system change randomly. In the general case, therefore, an optimal system consists of two parts: the constant (invariable) part, which includes the object of control and certain elements of the control system, and the variable part, which includes the other elements.
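The idea of automatically adjusting the variable part of the system can be sketched with one of the simplest adaptation rules, gradient-based gain adjustment (the so-called MIT rule). The static plant model, its unknown gain, and all numerical values below are illustrative assumptions.

```python
# A minimal sketch of adaptive gain adjustment: the controller tunes
# its adjustable gain theta so that the plant output tracks a
# reference model, without knowing the plant gain kp in advance.
import random

random.seed(0)
kp = 2.0            # unknown gain of the object of control
km = 1.0            # desired reference-model gain
theta, gamma = 0.0, 0.01   # adjustable gain and adaptation rate

for _ in range(2000):
    r = random.uniform(-1.0, 1.0)  # reference signal
    y_ref = km * r                 # reference-model output
    y = kp * (theta * r)           # plant driven through the variable part
    e = y - y_ref                  # tracking error
    theta -= gamma * e * y_ref     # MIT rule: gradient step on e**2

# theta converges toward km / kp, so that theta * kp approximates km
```

Here the plant and the reference model form the invariable part, while the adjustable gain theta is the variable part that the adaptation loop reshapes during operation.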

M. M. MAIZEL