Asymptotic controllability implies feedback stabilization
- 1 October 1997
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Automatic Control
- Vol. 42 (10), 1394-1407
- https://doi.org/10.1109/9.633828
Abstract
It is shown that every asymptotically controllable system can be globally stabilized by means of some (discontinuous) feedback law. The stabilizing strategy is based on pointwise optimization of a smoothed version of a control-Lyapunov function, iteratively sending trajectories into smaller and smaller neighborhoods of a desired equilibrium. A major technical problem, and one of the contributions of the present paper, concerns the precise meaning of "solution" when using a discontinuous controller.
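To illustrate the two ingredients named in the abstract, pointwise optimization of a control-Lyapunov function and a sample-and-hold ("closed-loop sampling") notion of solution for a discontinuous feedback, here is a minimal sketch. It is not the paper's construction: the double-integrator dynamics, the quadratic Lyapunov-type function, the control grid, and all function names below are illustrative assumptions chosen only to show the mechanism of recomputing the control at sampling instants and holding it constant in between.

```python
import numpy as np

def f(x, u):
    """Illustrative dynamics (assumed example): double integrator x1' = x2, x2' = u."""
    return np.array([x[1], u])

def V(x):
    """Assumed smooth Lyapunov-type function for the example system."""
    return 0.5 * (x[0]**2 + x[0] * x[1] + x[1]**2)

def grad_V(x):
    """Gradient of the assumed V (the paper handles nonsmooth V via proximal subgradients)."""
    return np.array([x[0] + 0.5 * x[1], 0.5 * x[0] + x[1]])

def feedback(x, controls):
    """Pointwise optimization: choose the control making the directional
    derivative <grad V(x), f(x, u)> most negative. May be discontinuous in x."""
    derivs = [grad_V(x) @ f(x, u) for u in controls]
    return controls[int(np.argmin(derivs))]

def sample_and_hold(x0, dt=0.05, T=10.0, controls=np.linspace(-1.0, 1.0, 21)):
    """Sample-and-hold solution concept: the feedback is evaluated at sampling
    instants and held constant over each interval (forward Euler step here)."""
    x = np.array(x0, float)
    traj = [x.copy()]
    for _ in range(int(T / dt)):
        u = feedback(x, controls)   # recompute control at the sampling instant
        x = x + dt * f(x, u)        # hold u constant over the sampling interval
        traj.append(x.copy())
    return np.array(traj)

if __name__ == "__main__":
    traj = sample_and_hold([2.0, -1.0])
    print("final state:", traj[-1])  # typically ends in a small neighborhood of 0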
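```

Shrinking the sampling step dt drives the trajectories into smaller neighborhoods of the equilibrium, which is the iterative refinement the abstract refers to; the paper develops this rigorously for nonsmooth control-Lyapunov functions and general asymptotically controllable systems.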