Abstract
A stochastic adaptive control system whose parameters vary according to a finite-state jump Markov process was considered earlier by P. E. Caines and H. F. Chen (IEEE Trans. Automatic Control, vol. AC-30, 1985). Using the nonlinear filtering equations for the conditional probabilities of the parameter states, they showed that the control problem can be converted into a completely observed control problem, and they gave a verification theorem for checking that a control is optimal. However, they solved no examples, and there appear to be no previously solved examples of this type of 'adaptive' control system. The purpose of this study is to provide an explicit solution for a linear quadratic (LQ) problem of this type: an explicit solution is given for a partially observed LQ problem driven by a combination of a Wiener process and an unobserved finite-state jump Markov process.
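For orientation, a minimal sketch of the kind of system described above, in assumed standard notation; the symbols $A$, $B$, $C$, $Q$, $R$, and $\Lambda$ are illustrative and not taken from the paper itself:
\[
dx_t = A(\theta_t)\,x_t\,dt + B(\theta_t)\,u_t\,dt + C\,dw_t,
\qquad
J(u) = \mathbb{E}\int_0^T \bigl( x_t^\top Q\,x_t + u_t^\top R\,u_t \bigr)\,dt,
\]
where $\theta_t$ is the unobserved finite-state jump Markov process (with generator $\Lambda$) and $w_t$ is a Wiener process. Writing $p_t^i = \mathbb{P}(\theta_t = i \mid \mathcal{F}_t)$ for the conditional probabilities of the parameter states, which evolve according to nonlinear filtering equations, the augmented pair $(x_t, p_t)$ serves as a completely observed state for the control problem, which is the conversion referred to in the abstract.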
