Stochastic Optimal Control with Noisy Observations †
- 1 November 1966
- journal article
- research article
- Published by Taylor & Francis in International Journal of Control
- Vol. 4 (5), 455-464
- https://doi.org/10.1080/00207176608921439
Abstract
The present paper is the sequel to a previous paper by the author. Its main purpose is to justify the claim made earlier that the approach to stochastic optimal control previously employed can be extended to the case of noisy observations of the state. The partial differential equation for the conditional probability density of the present state, given the past history of the noisy observations, is discussed. Using this equation, it is shown that the stochastic optimal control problem can be viewed as a problem in the theory of control of distributed parameter systems. Dynamic programming is then applied to this distributed parameter problem to obtain a stochastic Hamilton-Jacobi equation in function space.
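The structure the abstract describes can be sketched in modern notation. This is a hedged reconstruction using the now-standard forms (the Kushner-Stratonovich filtering equation and a value functional on densities), not the paper's own symbols or exact equations:

```latex
% Assumed model (standard, not verbatim from the paper):
% state     dx = f(x,u)\,dt + dW,   observation  dz = h(x)\,dt + dV.
% The conditional density p(x,t) of the state given past observations
% evolves by a stochastic PDE (Kushner-Stratonovich form):
\[
  dp \;=\; \mathcal{L}^{*}_{u}\,p\,dt
     \;+\; p\,\bigl(h(x)-\hat{h}\bigr)^{\top} R^{-1}\bigl(dz-\hat{h}\,dt\bigr),
  \qquad \hat{h} \;=\; \int h(x)\,p(x,t)\,dx,
\]
% where L*_u is the forward (Fokker-Planck) operator under control u.
% Treating p itself as the state of a distributed-parameter system and
% applying dynamic programming to a value functional V[p,t] gives a
% Hamilton-Jacobi equation in function space, schematically
\[
  \frac{\partial V}{\partial t}
  \;+\; \min_{u}\int\!\Bigl(\ell(x,u)\,p
  \;+\; \frac{\delta V}{\delta p(x)}\,\mathcal{L}^{*}_{u}\,p\Bigr)dx
  \;+\;\text{(second-variation noise terms)} \;=\; 0,
\]
% with \delta V / \delta p(x) the functional derivative of V with
% respect to the density. The noise terms arise because p itself is
% driven by the observation process, making the equation stochastic.
```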
This publication has 4 references indexed in Scilit:
- A Priori Open Loop Optimal Control of Continuous Time Stochastic Systems†. International Journal of Control, 1966
- Nonlinear filtering theory. IEEE Transactions on Automatic Control, 1965
- On the dynamical equations of conditional probability density functions, with applications to optimal stochastic control theory. Journal of Mathematical Analysis and Applications, 1964
- Conditional Markov Processes. Theory of Probability and Its Applications, 1960