Abstract
A quasithermodynamic interpretation of the stochastic control of irreversible thermodynamic diffusion processes is presented in which the drift is the control parameter. The joint entropy, considered as a function of the initial data, is the negative of the optimal expected total cost, and the generalized Hamilton–Jacobi equation it satisfies is the dynamic programming equation for the optimal stochastic drift control. The two cost functions proposed by Yasue are shown to differ by a stochastic gauge transformation and to constitute equivalent variational problems. Yasue's results are shown to apply in two limiting cases: the asymptotic time limit, in which the nonequilibrium statistical correlations have decayed, and the weak-noise limit, in which random thermal fluctuations are of negligible importance. The asymptotic expected total cost is governed by a minimum relative entropy principle, and the goodness of the thermodynamic drift control in the stochastic problem is determined.
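As a rough illustration of the dynamic programming relation summarized above (the notation below is illustrative and not the paper's: b denotes the controlled drift, L a running cost, nu a diffusion coefficient, and S the joint entropy), the negative joint entropy plays the role of the value function of an optimal drift-control problem and satisfies a Hamilton–Jacobi–Bellman equation:
\[
J(x,t) \;=\; \min_{b}\; \mathbb{E}\!\left[\int_t^{T} L\bigl(X_s, b(X_s,s)\bigr)\,ds \;\Big|\; X_t = x\right] \;=\; -S(x,t),
\qquad
dX_s = b(X_s,s)\,ds + \sqrt{\nu}\,dW_s ,
\]
\[
\partial_t J \;+\; \min_{b}\Bigl\{\, b\cdot\nabla J \;+\; \tfrac{\nu}{2}\,\Delta J \;+\; L(x,b) \,\Bigr\} \;=\; 0 .
\]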
