Stochastic control of geometric processes
- 1 March 1987
- journal article
- Published by Cambridge University Press (CUP) in Journal of Applied Probability
- Vol. 24 (1), 97–104
- https://doi.org/10.2307/3214062
Abstract
Stochastic optimization of semimartingales which permit a dynamic description, such as a stochastic differential equation, normally leads to dynamic programming procedures. The resulting Bellman equation is often of a very general nature and analytically hard to solve. The models in the present paper are formulated in terms of the relative change, and the optimality criterion is to maximize the expected rate of growth. We show how this can be done in a simple way, avoiding the use of the Bellman equation. An application is indicated.
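For orientation, the growth-rate criterion can be illustrated on a standard geometric Brownian wealth process; the following is a minimal sketch under that textbook assumption, not the paper's general semimartingale model:

```latex
% Sketch (assumed model, not the paper's): wealth X_t with a constant fraction u
% invested in a risky asset (drift \mu, volatility \sigma) and the remainder at rate r:
%   dX_t / X_t = \bigl(r + u(\mu - r)\bigr)\,dt + u\sigma\,dW_t .
% Ito's formula gives the expected growth rate of \log X_t in closed form, so it can be
% maximized pointwise in u, with no Bellman equation required:
\[
  g(u) = r + u(\mu - r) - \tfrac{1}{2}\,u^{2}\sigma^{2},
  \qquad
  u^{*} = \arg\max_{u} g(u) = \frac{\mu - r}{\sigma^{2}} .
\]
```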