Adaptive control of linear systems with Markov perturbations
- 1 March 1998
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Automatic Control
- Vol. 43 (3), 351-372
- https://doi.org/10.1109/9.661591
Abstract
The stochastic model considered is a linear jump diffusion process X whose coefficients and jump processes depend on a Markov chain Z with finite state space. First, we study the optimal filtering and control problem for these systems with non-Gaussian initial conditions, given noisy observations of the state X and perfect measurements of Z. We derive a new sufficient condition ensuring the existence and uniqueness of the solution of the nonlinear stochastic differential equations satisfied by the output of the filter. We then study a quadratic control problem and show that the separation principle holds. Next, we investigate an adaptive control problem for a state process X defined by a linear diffusion whose coefficients depend on a Markov chain, the processes X and Z being observed in independent white noises. Suboptimal estimates for the processes X and Z and an approximate control law are investigated for a large class of probability distributions of the initial state. Asymptotic properties of these filters and of the control law are obtained, and upper bounds for the corresponding errors are given.
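For orientation, the display below is a minimal sketch, in assumed notation, of the kind of Markov-modulated linear model the abstract describes; the symbols A, B, C, D, H, W, V, N, and Y are illustrative and are not taken from the paper, whose exact formulation may differ:

```latex
% Illustrative Markov jump linear model (notation assumed, not the paper's).
% Z is a finite-state Markov chain modulating the coefficients,
% W and V are independent Wiener processes, and N is a jump process
% whose law depends on Z; Y is the noisy observation of X.
\begin{aligned}
  dX_t &= A(Z_t)\,X_t\,dt + B(Z_t)\,u_t\,dt + C(Z_t)\,dW_t + D(Z_t)\,dN_t, \\
  dY_t &= H\,X_t\,dt + dV_t.
\end{aligned}
```

In the first problem treated in the paper, Z is measured perfectly and X is seen only through the noisy observation; in the adaptive problem, both X and Z are observed in independent white noises, which is why the paper turns to suboptimal estimates and an approximate control law.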