Sufficiently Informative Functions and the Minimax Feedback Control of Uncertain Dynamic Systems

Dimitri P. Bertsekas, Ian B. Rhodes

Research output: Contribution to journal › Article › peer-review

54 Scopus citations

Abstract

The problem of optimal feedback control of uncertain discrete-time dynamic systems is considered where the uncertain quantities do not have a stochastic description but instead are known to belong to given sets. The problem is converted to a sequential minimax problem and dynamic programming is suggested as a general method for its solution. The notion of a sufficiently informative function, which parallels the notion of a sufficient statistic of stochastic optimal control, is introduced, and conditions under which the optimal controller decomposes into an estimator and an actuator are identified. A limited class of problems for which this decomposition simplifies the computation and implementation of the optimal controller is delineated.
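The sequential minimax formulation described in the abstract replaces the expectation of stochastic dynamic programming with a worst case over the uncertainty set. As a rough illustration only (a hypothetical scalar example, not the paper's formulation), the backward value recursion J_k(x) = min over u of max over w of [g(x, u) + J_{k+1}(f(x, u, w))] can be sketched for finite state, control, and disturbance sets:

```python
# Minimax dynamic programming sketch (hypothetical example, not from the paper).
# Dynamics x_{k+1} = x_k + u_k + w_k, where the disturbance w_k has no stochastic
# description and is known only to lie in a given set. Stochastic DP's expectation
# over w is replaced by a worst case:
#   J_k(x) = min_u max_w [ g(x, u) + J_{k+1}(f(x, u, w)) ]

def minimax_dp(states, controls, disturbances, horizon,
               stage_cost, terminal_cost, step):
    """Return a list J of dicts: J[k][x] = worst-case optimal cost-to-go at stage k."""
    J = [{x: terminal_cost(x) for x in states}]  # stage-N (terminal) cost
    for _ in range(horizon):
        J_next = J[0]
        # Backward recursion: minimize over controls the worst case over disturbances.
        J_k = {
            x: min(
                max(stage_cost(x, u) + J_next[step(x, u, w)] for w in disturbances)
                for u in controls
            )
            for x in states
        }
        J.insert(0, J_k)
    return J

# Example: state clamped to [-3, 3], bounded control and disturbance sets.
states = range(-3, 4)
clamp = lambda x: max(-3, min(3, x))
J = minimax_dp(states, controls=(-1, 0, 1), disturbances=(-1, 1), horizon=3,
               stage_cost=lambda x, u: abs(x) + abs(u),
               terminal_cost=abs,
               step=lambda x, u, w: clamp(x + u + w))
```

Since the stage costs are nonnegative, the worst-case cost-to-go grows with the horizon; the paper's sufficiently informative functions address the harder imperfect-information case, where the recursion runs over an information state rather than directly over x.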

Original language: English (US)
Pages (from-to): 117-124
Number of pages: 8
Journal: IEEE Transactions on Automatic Control
Volume: 18
Issue number: 2
DOIs
State: Published - Apr 1973
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering
