Control with stochastic stopping time†
- 1 April 1970
- journal article
- research article
- Published by Taylor & Francis in International Journal of Control
- Vol. 11 (4), 541-549
- https://doi.org/10.1080/00207177008905936
Abstract
The usual (non-stochastic stopping) control problem is extended to the case of a random terminal time. The more general model presented here should be particularly useful when a system may change while being controlled. Systems which are otherwise deterministic, and systems with additive noise in the dynamic equation, are considered. Results concerning the relevant aspects of reliability and Markov process theories are presented. We show that the optimality conditions for stochastically stopped control are simply extensions of the usual conditions, and that the limits of the criteria and optimal control, as the variance of the stopping probability distribution approaches zero, are the corresponding quantities for the non-stochastic problem.
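As a minimal sketch of the kind of criterion the abstract describes (assuming a standard formulation; the symbols $x$, $u$, $L$, $\phi$, and $p_T$ below are illustrative choices, not the paper's notation), the terminal time $T$ is treated as a random variable and the usual cost is averaged over it:

```latex
% Hedged sketch: a generic stochastically stopped cost criterion.
% x = state, u = control, L = running cost, \phi = terminal cost,
% p_T = density of the random stopping time T (all illustrative assumptions).
\[
  J(u)
  = \mathbb{E}_{T}\!\left[\int_{t_0}^{T} L\bigl(x(t),u(t),t\bigr)\,dt
      + \phi\bigl(x(T),T\bigr)\right]
  = \int_{t_0}^{\infty} p_T(\tau)
      \left[\int_{t_0}^{\tau} L\bigl(x(t),u(t),t\bigr)\,dt
        + \phi\bigl(x(\tau),\tau\bigr)\right] d\tau .
\]
% If the variance of p_T shrinks to zero about a fixed time t_f, p_T tends to a
% point mass at t_f and J reduces to the usual fixed-terminal-time criterion,
% consistent with the limiting behaviour stated in the abstract.
```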