Necessary Conditions for Optimal Control of Stochastic Systems with Random Jumps
- 1 September 1994
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Control and Optimization
- Vol. 32 (5), 1447–1475
- https://doi.org/10.1137/s0363012992233858
Abstract
A maximum principle is proved for optimal controls of stochastic systems with random jumps. The control is allowed to enter both the diffusion and the jump terms. The form of the maximum principle turns out to be quite different from the one corresponding to the pure diffusion system (the word "pure" here meaning the absence of the jump term). In calculating the first-order term of the cost variation, only a property of Lebesgue integrals of scalar-valued functions on the real number space $\mathbb{R}$ is used. This shows that there is no essential difference between deterministic and stochastic systems as far as the derivation of maximum principles is concerned.
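The property of Lebesgue integrals alluded to above is, in standard spike-variation arguments for maximum principles, typically a density-type estimate of roughly the following form (a sketch in our own notation, under the assumption that a lemma of this kind is meant; it is not quoted from the paper): for any $f \in L^1(0,T;\mathbb{R})$ and any $\varepsilon \in (0,1)$ there exists a measurable set $E_\varepsilon \subset [0,T]$ with $|E_\varepsilon| = \varepsilon T$ such that
\[
\sup_{0 \le t \le T}\;\Bigl|\int_0^t \Bigl(1 - \tfrac{1}{\varepsilon}\,\mathbf{1}_{E_\varepsilon}(s)\Bigr) f(s)\,\mathrm{d}s\Bigr| \;\le\; \varepsilon .
\]
Applying such an estimate with $f$ taken as the difference of the system coefficients under the reference control and the spike perturbation yields a first-order expansion of the cost functional without any convexity assumption on the control domain.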