Optimal Control of Stochastic Integrals and Hamilton–Jacobi–Bellman Equations. I

Abstract
We consider the solution of a stochastic integral control problem, and we study its regularity. In particular, we characterize the optimal cost as the maximum solution of ∀ v ∈ 𝒱, A(v)u ≤ f(v) in 𝒟′(𝒪).
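The characterization in the abstract can be sketched in display form as follows (a reconstruction from the abstract's notation; here 𝒱 denotes the set of admissible controls, A(v) the second-order operator associated with control v, f(v) the running cost, and 𝒪 the underlying open set — names assumed, not fixed by this stub):

```latex
% Hedged sketch of the characterization stated in the abstract:
% the optimal cost u is the maximal subsolution of the family of
% linear inequalities indexed by the control v.
\[
  u \;=\; \max\bigl\{\, w \;:\; A(v)\,w \,\le\, f(v)
     \ \text{in } \mathcal{D}'(\mathcal{O}),
     \ \ \forall\, v \in \mathcal{V} \,\bigr\}.
\]
% Equivalently, u is a solution of the Hamilton--Jacobi--Bellman
% equation obtained by taking the supremum over controls:
\[
  \sup_{v \in \mathcal{V}} \,\bigl[\, A(v)\,u - f(v) \,\bigr] \;=\; 0
  \quad \text{in } \mathcal{O}.
\]
```

The equivalence between the "maximum subsolution" formulation and the HJB equation is the standard dynamic-programming viewpoint: each fixed control v yields one linear inequality, and optimality saturates the envelope of these inequalities.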
