Optimal Control of Stochastic Integrals and Hamilton–Jacobi–Bellman Equations. I
- 1 January 1982
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Control and Optimization
- Vol. 20 (1), 58–81
- https://doi.org/10.1137/0320006
Abstract
We consider the solution of a stochastic integral control problem, and we study its regularity. In particular, we characterize the optimal cost as the maximum solution of $\forall v \in V,\ A(v)u \le f(v)$ in $\mathcal{D}'(\mathcal{O})$.
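The inequality in the abstract is the subsolution form of the Hamilton–Jacobi–Bellman equation for controlled diffusions. A hedged sketch of that standard setup, using the abstract's notation $V$, $A(v)$, $f(v)$ (the cost functional written below is the generic form, an assumption rather than a quotation from this paper):

```latex
% Standard HJB setup for a controlled diffusion (sketch, not quoted from the paper).
% V    : set of control values
% A(v) : second-order elliptic generator of the diffusion under constant control v
% f(v) : running cost associated with control value v
%
% The optimal cost
\[
  u(x) \;=\; \inf_{v(\cdot)} \; \mathbb{E}_x \int_0^{\tau} f\bigl(v(t), x(t)\bigr)\,dt
\]
% formally satisfies the Bellman equation
\[
  \sup_{v \in V} \bigl[\, A(v)u - f(v) \,\bigr] \;=\; 0
  \quad \text{in } \mathcal{O},
\]
% and the abstract's characterization states that u is the maximal function with
\[
  A(v)u \;\le\; f(v)
  \quad \text{in } \mathcal{D}'(\mathcal{O}),
  \quad \text{for every } v \in V .
\]
```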
This publication has 19 references indexed in Scilit:
- Controlled Diffusion Processes. Published by Springer Nature, 1980
- A variational inequality approach to the Bellman-Dirichlet equation for two elliptic operators. Archive for Rational Mechanics and Analysis, 1979
- Optimal stochastic switching and the Dirichlet problem for the Bellman equation. Transactions of the American Mathematical Society, 1979
- Méthodes de contrôle optimal en analyse complexe. I. Résolution d'équations de Monge–Ampère [Optimal control methods in complex analysis. I. Solution of Monge–Ampère equations]. Journal of Functional Analysis, 1977
- An Example of a One-Dimensional Controlled Process. Theory of Probability and Its Applications, 1976
- Nouvelles Méthodes en Contrôle Impulsionnel [New methods in impulse control]. Applied Mathematics & Optimization, 1975
- Deterministic and Stochastic Optimal Control. Published by Springer Nature, 1975
- Control of a Solution of a Stochastic Integral Equation. Theory of Probability and Its Applications, 1972
- An Inequality in the Theory of Stochastic Integrals. Theory of Probability and Its Applications, 1971
- Markov Processes. Published by Springer Nature, 1965