Ergodic Theorems for Discrete Time Stochastic Systems Using a Stochastic Lyapunov Function
- 1 November 1989
- Journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Control and Optimization
- Vol. 27 (6), 1409–1439
- https://doi.org/10.1137/0327073
Abstract
No abstract available.