stochastic optimal control
From The New Palgrave Dictionary of Economics, Second Edition, 2008
Edited by Steven N. Durlauf and Lawrence E. Blume
Abstract
The purpose of this article is, first, to state the stochastic optimal control problem and, second, to explain how it differs from deterministic optimal control and why that difference is crucial in economic problems. The article presents the methodology of stochastic optimal control intuitively and provides an illustration from optimal stochastic economic growth as an application of this mathematical technique in economics.
Keywords
applied control; Bellman, R.; continuous time models; deterministic optimal control; discrete time models; dynamic economic models; dynamic programming; Hamilton–Jacobi–Bellman equation; principle of optimality; pure randomness; stochastic optimal control; Taylor's theorem; uncertainty; white noise; Wiener process
How to cite this article
Malliaris, A.G. "stochastic optimal control." The New Palgrave Dictionary of Economics. Second Edition. Eds. Steven N. Durlauf and Lawrence E. Blume. Palgrave Macmillan, 2008. The New Palgrave Dictionary of Economics Online. Palgrave Macmillan. 18 October 2017 <http://www.dictionaryofeconomics.com/article?id=pde2008_S000269> doi:10.1057/9780230226203.1624