Preface
Acknowledgments
1. Stochastic Control
    1. Introduction
    2. Theory of Feedback Control
    3. How to Characterize Disturbances
    4. Stochastic Control Theory
    5. Outline of the Contents of the Book
    6. Bibliography and Comments
2. Stochastic Processes
    1. Introduction
    2. The Concept of a Stochastic Process
    3. Some Special Stochastic Processes
    4. The Covariance Function
    5. The Concept of Spectral Density
    6. Analysis of Stochastic Processes
    7. Bibliography and Comments
3. Stochastic State Models
    1. Introduction
    2. Discrete Time Systems
    3. Solution of Stochastic Difference Equations
    4. Continuous Time Systems
    5. Stochastic Integrals
    6. Linear Stochastic Differential Equations
    7. Nonlinear Stochastic Differential Equations
    8. Stochastic Calculus--The Ito Differentiation Rule
    9. Modeling of Physical Processes by Stochastic Differential Equations
    10. Sampling a Stochastic Differential Equation
    11. Bibliography and Comments
4. Analysis of Dynamical Systems Whose Inputs are Stochastic Processes
    1. Introduction
    2. Discrete Time Systems
    3. Spectral Factorization of Discrete Time Processes
    4. Analysis of Continuous Time Systems Whose Input Signals are Stochastic Processes
    5. Spectral Factorization of Continuous Time Processes
    6. Bibliography and Comments
5. Parametric Optimization
    1. Introduction
    2. Evaluation of Loss Functions for Discrete Time Systems
    3. Evaluation of Loss Functions for Continuous Time Systems
    4. Reconstruction of State Variables for Discrete Time Systems
    5. Reconstruction of State Variables for Continuous Time Systems
    6. Bibliography and Comments
6. Minimal Variance Control Strategies
    1. Introduction
    2. A Simple Example
    3. Optimal Prediction of Discrete Time Stationary Processes
    4. Minimal Variance Control Strategies
    5. Sensitivity of the Optimal System
    6. An Industrial Application
    7. Bibliography and Comments
7. Prediction and Filtering Theory
    1. Introduction
    2. Formulation of Prediction and Estimation Problems
    3. Preliminaries
    4. State Estimation for Discrete Time Systems
    5. Duality
    6. State Estimation for Continuous Time Processes
    7. Bibliography and Comments
8. Linear Stochastic Control Theory
    1. Introduction
    2. Formulation
    3. Preliminaries
    4. Complete State Information
    5. Incomplete State Information 1
    6. Incomplete State Information 2
    7. Continuous Time Problems
    8. Bibliography and Comments
Index
This text for upper-level undergraduates and graduate students explores stochastic control theory in terms of analysis, parametric optimization, and optimal stochastic control. Limited to linear systems with quadratic criteria, it covers both discrete time and continuous time systems.
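To make the phrase "linear systems with quadratic criteria" concrete, a representative discrete time formulation (a generic textbook form, not quoted from this book) pairs a linear state equation driven by random disturbances with an expected quadratic loss to be minimized:
\[
x_{k+1} = A x_k + B u_k + v_k, \qquad
J = E\!\left[\, x_N^{T} Q_0\, x_N + \sum_{k=0}^{N-1} \left( x_k^{T} Q_1\, x_k + u_k^{T} Q_2\, u_k \right) \right],
\]
where v_k is a disturbance modeled as a stochastic process and the weighting matrices Q_0, Q_1, Q_2 penalize terminal state, state deviations, and control effort. The optimal stochastic control problem is then to choose the control u_k, as a function of the available measurements, so that J is minimized.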
The first three chapters provide motivation and background material on stochastic processes, followed by an analysis of dynamical systems whose inputs are stochastic processes. A simple version of the problem of optimal control of stochastic systems is discussed, along with an example of an industrial application of the theory. Subsequent chapters cover filtering and prediction theory as well as the general stochastic control problem for linear systems with quadratic criteria.
Each chapter begins with the discrete time version of a problem and then progresses to the more challenging continuous time version. Prerequisites are courses in analysis and probability theory, together with a course in dynamical systems that covers frequency response and the state-space approach for continuous time and discrete time systems.