Examples of optimal control for partially observable systems: comparison, classical, and martingale methods
- 1 June 1981
- journal article
- research article
- Published by Taylor & Francis in Stochastics
- Vol. 5 (1-2), 43-64
- https://doi.org/10.1080/17442508108833173
Abstract
The following kind of stochastic control problem is considered: to minimize a final-value cost for a system driven by independent Wiener processes, with initial state X_0 a Gaussian random variable independent of the noise and observed only through a noisy observation process, and with all causal functions u_t bounded by unity admissible. By using recent comparison theorems for solutions of stochastic differential equations, it is possible to prove that the physically obvious law is indeed optimal. The same result is established via the classical and the martingale methods of approach to stochastic control. Various generalizations of the above model are also discussed.
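The flavor of the result can be illustrated numerically. The sketch below is an assumption-laden toy version of the problem class described in the abstract: a scalar diffusion dX_t = u_t dt + dW_t with |u_t| ≤ 1 and X_0 ~ N(0, 1), where a bang-bang law u_t = -sgn(X_t) (in the spirit of the "predicted miss" and final-value bang-bang references below) is compared by Monte Carlo against applying no control, using the expected terminal miss E|X_T| as the cost. For simplicity the sketch uses full observation of the state; the paper's setting is partially observable, so this is only an illustration, not the paper's model.

```python
# Monte Carlo sketch (assumed toy model, not the paper's exact setup):
# dX_t = u_t dt + dW_t, |u_t| <= 1, X_0 ~ N(0, 1), cost E|X_T|.
# Compares the bang-bang law u_t = -sgn(X_t) against u_t = 0.
import numpy as np

rng = np.random.default_rng(0)
T, n_steps, n_paths = 1.0, 100, 20000
dt = T / n_steps

def terminal_miss(policy):
    """Euler-Maruyama estimate of E|X_T| under a feedback policy u = policy(x)."""
    x = rng.standard_normal(n_paths)  # X_0 ~ N(0, 1), an assumption
    for _ in range(n_steps):
        u = policy(x)  # causal feedback control, |u| <= 1
        x = x + u * dt + np.sqrt(dt) * rng.standard_normal(n_paths)
    return float(np.abs(x).mean())

miss_bang = terminal_miss(lambda x: -np.sign(x))       # bang-bang law
miss_none = terminal_miss(lambda x: np.zeros_like(x))  # no control
print(f"E|X_T| bang-bang: {miss_bang:.3f}, uncontrolled: {miss_none:.3f}")
```

The simulation shows the qualitative point made rigorous in the paper by comparison-theorem, classical, and martingale arguments: steering against the sign of the (estimated) state strictly reduces the expected terminal miss relative to doing nothing.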
This publication has 9 references indexed in Scilit:
- On “predicted miss” stochastic control problems. Stochastics, 1979.
- The Separation Principle in Stochastic Control via Girsanov Solutions. SIAM Journal on Control and Optimization, 1976.
- Composition and invariance methods for solving some stochastic control problems. Advances in Applied Probability, 1975.
- Girsanov functionals and optimal bang-bang laws for final value stochastic control. Stochastic Processes and their Applications, 1974.
- Information states for linear stochastic systems. Journal of Mathematical Analysis and Applications, 1972.
- Capacités et processus stochastiques. Published by Springer Nature, 1972.
- Some Extensions of the Innovations Theorem. Bell System Technical Journal, 1971.
- On the Separation Theorem of Stochastic Control. SIAM Journal on Control, 1968.
- On Transforming a Certain Class of Stochastic Processes by Absolutely Continuous Substitution of Measures. Theory of Probability and Its Applications, 1960.