A stability theorem for stochastic differential equations and application to stochastic control problems
- 1 October 1984
- journal article
- research article
- Published by Taylor & Francis in Stochastics
- Vol. 13 (4), 257-279
- https://doi.org/10.1080/17442508408833323
Abstract
For a sequence of stochastic differential equations of the type: a stability theorem is presented under an appropriate mode of convergence of [d], and an application to stochastic control problems is also briefly discussed.
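The equation referred to after "of the type" is not reproduced in this indexed record. As a hedged illustration only, with symbols that are assumptions rather than the paper's own notation, stability theorems of this kind are typically formulated for a sequence of Itô equations

\[
dX^{n}_{t} = b_{n}(t, X^{n}_{t})\,dt + \sigma_{n}(t, X^{n}_{t})\,dW^{n}_{t}, \qquad X^{n}_{0} = x^{n}_{0},
\]

and assert that, under a suitable mode of convergence of the coefficients \((b_{n}, \sigma_{n})\) (together with the driving processes and initial conditions), the solutions \(X^{n}\) converge weakly, for example in the Skorokhod space \(D([0,\infty); \mathbb{R}^{d})\), to the solution \(X\) of the limit equation

\[
dX_{t} = b(t, X_{t})\,dt + \sigma(t, X_{t})\,dW_{t}, \qquad X_{0} = x_{0}.
\]

In stochastic control, stability statements of this form are commonly used to justify replacing a controlled diffusion by a sequence of simpler approximating systems and passing to the limit in the associated control problems.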