An Example of Optimal Adaptive Control†
- 1 May 1964
- journal article
- control section
- Published by Taylor & Francis in Journal of Electronics and Control
- Vol. 16 (5), 557-567
- https://doi.org/10.1080/00207216408937668
Abstract
The problem of optimal control is considered for the system ẋ = Ax + Bu, x(0) = c, where A is subject to random change, u is restricted to a closed, bounded region U, and a quadratic performance index is used. A general formulation of the problem is given and results are obtained for a particular, simple behaviour of A. The optimal system shows adaptive behaviour of an elementary kind, and uses the whole of the available control effort at each instant. It is not known whether the last property would persist for less simple behaviour of A, or for systems disturbed by noise.
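The sketch below is an illustrative scalar simulation of the system described in the abstract, not the paper's derivation: it integrates ẋ = Ax + Bu under a full-effort (bang-bang style) control drawn from a bounded set U and accumulates a quadratic cost. The constants a, b, u_max, c and the cost weights q, r are assumptions chosen for demonstration only.

```python
# Minimal scalar sketch of xdot = A x + B u with |u| <= u_max and a quadratic cost.
# The control law u = -u_max * sign(x) merely illustrates "using the whole of the
# available control effort at each instant"; it is not the paper's optimal policy.
import numpy as np

a, b = 0.5, 1.0      # assumed scalar values standing in for A and B
u_max = 1.0          # assumed bound defining the admissible set U = [-u_max, u_max]
c = 2.0              # initial state x(0) = c
q, r = 1.0, 0.1      # assumed quadratic performance-index weights
dt, T = 0.01, 5.0    # integration step and horizon

x = c
cost = 0.0
for _ in range(int(T / dt)):
    u = -u_max * np.sign(x)           # full available control effort at each instant
    cost += (q * x**2 + r * u**2) * dt
    x += (a * x + b * u) * dt         # forward-Euler step of xdot = a x + b u

print(f"final state: {x:.4f}, accumulated quadratic cost: {cost:.4f}")
```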