The Bellman equation for time-optimal control of noncontrollable, nonlinear systems
- 1 June 1993
- journal article
- research article
- Published by Springer Nature in Acta Applicandae Mathematicae
- Vol. 31 (3), 201-223
- https://doi.org/10.1007/bf00997118
Abstract
No abstract available
This publication has 26 references indexed in Scilit:
- Approximation of differential games of pursuit-evasion by discrete-time games. Published by Springer Nature, 2006
- Discontinuous viscosity solutions of first-order Hamilton-Jacobi equations: a guided visit. Nonlinear Analysis, 1993
- Approximation and regular perturbation of optimal control problems via Hamilton-Jacobi theory. Applied Mathematics & Optimization, 1991
- Hamilton-Jacobi equations with singular boundary conditions on a free boundary and applications to differential games. Transactions of the American Mathematical Society, 1991
- An Approximation Scheme for the Minimum Time Function. SIAM Journal on Control and Optimization, 1990
- Semicontinuous Viscosity Solutions For Hamilton–Jacobi Equations With Convex Hamiltonians. Communications in Partial Differential Equations, 1990
- A Boundary Value Problem for the Minimum-Time Function. SIAM Journal on Control and Optimization, 1989
- Discontinuous solutions of deterministic optimal stopping time problems. ESAIM: Mathematical Modelling and Numerical Analysis, 1987
- Viscosity solutions of Hamilton-Jacobi equations. Transactions of the American Mathematical Society, 1983
- Optimization—Theory and Applications. Published by Springer Nature, 1983