Abstract
We study deterministic optimal control problems in which each strategy comprises a stopping time, continuous controls, and impulse controls. We obtain the optimal cost, characterized as the maximum element of a suitable set of subsolutions of the associated Hamilton-Jacobi equation, by an approximation method employing a particular discretization scheme for the derivatives. Convergence of the approximate solutions is established using a discrete maximum principle, which is also proved. For the numerical solution of the approximate problems we use a method of relaxation type. The algorithm is very simple and can run on computers with small central memory. In Part I [SIAM J. Control Optim., 23 (1985), pp. 242–266] we studied the stationary case; in Part II we study the nonstationary case and apply our results to a short-run model of energy production management.
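To illustrate the flavor of a relaxation-type iteration for a discretized problem mixing stopping and impulse controls, the following is a minimal sketch, not the paper's scheme: all data (grid size, costs, discount factor, post-impulse state map) are hypothetical, chosen only to exhibit the fixed-point structure of a discrete Hamilton-Jacobi quasi-variational inequality solved by Gauss-Seidel sweeps.

```python
import numpy as np

# Hypothetical discrete data: N grid nodes, a cost of stopping at each
# node, a one-step running cost for the continuous control, a fixed fee
# plus a state jump for the impulse control, and a discount factor < 1.
N = 50
rng = np.random.default_rng(0)
stop_cost = rng.uniform(1.0, 2.0, N)     # cost of stopping at node i
run_cost = rng.uniform(0.0, 0.2, N)      # running cost of one continuous step
beta = 0.95                              # discount factor (gives a contraction)
impulse_fee = 0.5                        # fixed cost charged per impulse
jump = (np.arange(N) + N // 2) % N       # hypothetical post-impulse state map

# Relaxation (Gauss-Seidel) sweeps: start from the stopping cost, an
# obvious supersolution, and update each node in place with the minimum
# over the three admissible decisions.  Iterates decrease monotonically.
u = stop_cost.copy()
for sweep in range(1000):
    diff = 0.0
    for i in range(N):
        cont = run_cost[i] + beta * u[(i + 1) % N]   # continue one step
        imp = impulse_fee + u[jump[i]]               # apply an impulse
        new = min(stop_cost[i], cont, imp)           # or stop now
        diff = max(diff, abs(new - u[i]))
        u[i] = new                                   # in-place (relaxation) update
    if diff < 1e-8:                                  # sweep changed nothing: done
        break
```

Only the current value vector `u` is stored and updated in place, which is why a relaxation scheme of this kind fits in a small central memory.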