Stochastic control is a very active area of research. This monograph, written by two leading authorities in the field, has been updated to reflect the latest developments. It presents effective numerical methods for stochastic control problems in continuous time at two levels: that of practice and that of mathematical development. It is broadly accessible to graduate students and researchers.
- Review of Continuous Time Models
- Controlled Markov Chains
- Dynamic Programming Equations
- Markov Chain Approximation Method
- The Approximating Markov Chains
- Computational Methods
- The Ergodic Cost Problem
- Heavy Traffic and Singular Control
- Weak Convergence and the Characterization of Processes
- Convergence Proofs
- Convergence Proofs Continued
- Finite Time and Filtering Problems
- Controlled Variance and Jumps
- Problems from the Calculus of Variations: Finite Time Horizon
- Problems from the Calculus of Variations: Infinite Time Horizon
- The Viscosity Solution Approach
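To give a flavor of the Markov chain approximation method listed above, the following is a minimal, hedged sketch: a one-dimensional controlled diffusion dx = u dt + sigma dW with discounted running cost x^2 + u^2 is replaced by a locally consistent Markov chain on a grid of spacing h, and the resulting dynamic programming equation is solved by value iteration. The grid bounds, control set, and parameter values are illustrative assumptions, not taken from the book.

```python
# Sketch of a Markov chain approximation for a 1-D controlled diffusion
#   dx = u dt + sigma dW,  cost  E ∫ e^{-beta t} (x^2 + u^2) dt,
# on a bounded grid with reflection at the endpoints. Transition
# probabilities and the interpolation interval follow the standard
# upwind finite-difference construction; all parameters are illustrative.
import numpy as np

def solve(sigma=1.0, beta=0.5, h=0.1, xmax=2.0,
          controls=(-1.0, 0.0, 1.0), tol=1e-8, max_iter=20000):
    xs = np.arange(-xmax, xmax + h / 2, h)   # spatial grid
    n = len(xs)
    V = np.zeros(n)                          # initial value function guess
    for _ in range(max_iter):
        V_new = np.empty(n)
        for i, x in enumerate(xs):
            best = np.inf
            for u in controls:
                Q = sigma**2 + h * abs(u)            # normalizing factor
                p_up = (sigma**2 / 2 + h * max(u, 0.0)) / Q   # P(x -> x+h)
                p_dn = (sigma**2 / 2 + h * max(-u, 0.0)) / Q  # P(x -> x-h)
                dt = h**2 / Q                        # interpolation interval
                up = V[min(i + 1, n - 1)]            # reflect at boundary
                dn = V[max(i - 1, 0)]
                val = (x**2 + u**2) * dt + np.exp(-beta * dt) * (p_up * up + p_dn * dn)
                best = min(best, val)
            V_new[i] = best
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    return xs, V
```

With symmetric dynamics, cost, and control set, the computed value function is symmetric in x and minimized near the origin; refining h and enlarging the grid would, per the book's convergence theory, drive the chain's value function toward that of the continuous-time problem.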